
Commit

fixing documentation errors
erfanzar committed May 28, 2024
1 parent bd45223 commit 2c752dd
Showing 7 changed files with 12 additions and 11 deletions.
@@ -1,7 +1,8 @@
-# AttentionModule
-
-## what is `AttentionModule`
+AttentionModule
+========
+
+what is `AttentionModule`
+--------
 AttentionModule is an EasyDeL module that can perform the attention operation with different strategies to help the user achieve
 the best possible performance and numerical stability. Here are the strategies supported right now:

@@ -14,8 +15,8 @@
 7. Wise Ring attention via "wise_ring"
 8. Sharded attention with shard map, known as "sharded_vanilla"

-## Example of Using Flash Attention on TPU
-
+Example of Using Flash Attention on TPU
+--------
 ```python
 import jax
 import flax.linen.attention as flt
 # ... (rest of the example truncated in this view)
 ```
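
All of the strategies listed in this file compute the same scaled dot-product attention; they differ in how the computation is tiled and sharded across devices. For reference, here is a minimal sketch of the vanilla computation in plain JAX, checked against `flax.linen.attention.dot_product_attention`; the `[batch, seq_len, num_heads, head_dim]` shapes and the comparison itself are assumptions for illustration, not EasyDeL's AttentionModule API.

```python
# Generic scaled dot-product attention in JAX: a sketch of what the strategies
# above ("vanilla", "flash", "ring", ...) all compute. Shapes are assumed to be
# [batch, seq_len, num_heads, head_dim], matching flax.linen's convention.
import jax
import jax.numpy as jnp
import flax.linen.attention as flt


def vanilla_attention(q, k, v):
    scale = q.shape[-1] ** -0.5
    # attention scores: [batch, heads, q_len, k_len]
    scores = jnp.einsum("bqhd,bkhd->bhqk", q, k) * scale
    weights = jax.nn.softmax(scores, axis=-1)
    # weighted sum of values: back to [batch, q_len, heads, head_dim]
    return jnp.einsum("bhqk,bkhd->bqhd", weights, v)


key = jax.random.PRNGKey(0)
q, k, v = (jax.random.normal(s, (1, 128, 8, 64)) for s in jax.random.split(key, 3))

out = vanilla_attention(q, k, v)
ref = flt.dot_product_attention(q, k, v)  # Flax reference implementation
print(jnp.allclose(out, ref, atol=1e-4))  # True: both compute the same attention
```

A fused or sharded strategy such as "flash" or "ring" should match this result up to numerical precision while using far less memory on long sequences.
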
File renamed without changes.
File renamed without changes.
File renamed without changes.
8 changes: 4 additions & 4 deletions docs/FineTuningExample.rst → docs/finetuning_example.rst
@@ -1,11 +1,11 @@
-## FineTuning Causal Language Model 🥵
-
+FineTuning Causal Language Model 🥵
+=====
 Using EasyDeL, fine-tuning LLMs (CausalLanguageModels) is as easy as possible with Jax and Flax,
 with the benefit of `TPUs` for the best speed. Here is a simple snippet you can use to finetune your
 own model.

-_Days Has Been Passed and now using easydel in Jax is way more similar to HF/PyTorch Style
-now it's time to finetune our model_.
+Days Has Been Passed and now using easydel in Jax is way more similar to HF/PyTorch Style
+now it's time to finetune our model.

 ```python
 import jax.numpy
 # ... (rest of the example truncated in this view)
 ```
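
Whatever trainer wrapper sits on top, a single causal-language-model fine-tuning step in JAX reduces to a shifted cross-entropy loss, a gradient, and an optimizer update. The sketch below shows that step with optax; `apply_fn`, the batch layout, and the hyperparameters are illustrative placeholders, not EasyDeL's trainer API.

```python
# A minimal, generic causal-LM fine-tuning step in JAX + optax.
# `apply_fn` stands in for any Flax-style model apply function (hypothetical here).
from functools import partial

import jax
import jax.numpy as jnp
import optax

optimizer = optax.adamw(learning_rate=2e-5, weight_decay=0.01)


def loss_fn(params, apply_fn, batch):
    # logits: [batch, seq_len, vocab]
    logits = apply_fn(params, batch["input_ids"], batch["attention_mask"])
    # next-token prediction: compare logits at position t with the token at t + 1
    shift_logits = logits[:, :-1, :]
    shift_labels = batch["input_ids"][:, 1:]
    mask = batch["attention_mask"][:, 1:]
    losses = optax.softmax_cross_entropy_with_integer_labels(shift_logits, shift_labels)
    return (losses * mask).sum() / mask.sum()


@partial(jax.jit, static_argnums=(2,))
def train_step(params, opt_state, apply_fn, batch):
    # opt_state is created once with optimizer.init(params) before the loop
    loss, grads = jax.value_and_grad(loss_fn)(params, apply_fn, batch)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state, loss
```

Running this step in a loop over tokenized batches is the whole training job; a full trainer adds data loading, sharding, and checkpointing around this core step.
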
@@ -1,5 +1,5 @@
-## EasyDeLXRapTure for layer tuning and LoRA
-
+EasyDeLXRapTure for layer tuning and LoRA
+---------
 In case of using LoRA and applying it to EasyDeL models, there are some other things
 that you might need to configure on your own, but a lot is handled by EasyDeL, so let's just jump into an example
 of the LoRA fine-tuning section and use _EasyDeLXRapTure_ for mistral models with a flash attention example.
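
Independent of how EasyDeLXRapTure wires it into a model, the LoRA idea is to freeze a pretrained weight W and learn a low-rank update B @ A scaled by alpha / rank. Below is a generic Flax sketch of such a layer; the class and parameter names are illustrative and not EasyDeL's API.

```python
# A generic LoRA linear layer in Flax: frozen base kernel plus a trainable
# low-rank update (lora_a @ lora_b). Names are illustrative, not EasyDeL's API.
import jax
import jax.numpy as jnp
import flax.linen as nn


class LoRALinear(nn.Module):
    features: int
    rank: int = 8
    alpha: float = 16.0

    @nn.compact
    def __call__(self, x):
        in_features = x.shape[-1]
        # base kernel: in practice loaded from the pretrained checkpoint
        kernel = self.param("kernel", nn.initializers.lecun_normal(), (in_features, self.features))
        kernel = jax.lax.stop_gradient(kernel)  # keep the base weight frozen
        # trainable low-rank factors; lora_b starts at zero so the initial output is unchanged
        lora_a = self.param("lora_a", nn.initializers.normal(0.02), (in_features, self.rank))
        lora_b = self.param("lora_b", nn.initializers.zeros, (self.rank, self.features))
        scaling = self.alpha / self.rank
        return x @ kernel + (x @ lora_a) @ lora_b * scaling


layer = LoRALinear(features=64, rank=8)
x = jnp.ones((2, 16, 32))
params = layer.init(jax.random.PRNGKey(0), x)
y = layer.apply(params, x)
print(y.shape)  # (2, 16, 64)
```

Only `lora_a` and `lora_b` receive gradients here, so the number of trainable parameters scales with the rank rather than with the full kernel, which is what makes LoRA fine-tuning cheap.
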
File renamed without changes.
