Optimize TPU Flash Attention (20x XLA compilation speed-up on 32k long context) #265