From 1c716549e2846de6c1da66d8d6778776bf7213b3 Mon Sep 17 00:00:00 2001
From: mbarak
Date: Fri, 6 Sep 2024 11:01:13 +0200
Subject: [PATCH 1/3] Hydra

---
 content/awesome-SSM.md                       |  31 +++
 content/posts/from-mamba-to-mamba2.md        | 278 +++++++++----------
 content/posts/hydra-a-double-headed-mamba.md | 129 +++++++++
 static/images/hydra_model.png                | Bin 0 -> 44787 bytes
 static/images/semi_vs_quasi_separable.png    | Bin 0 -> 73196 bytes
 5 files changed, 296 insertions(+), 142 deletions(-)
 create mode 100644 content/awesome-SSM.md
 create mode 100644 content/posts/hydra-a-double-headed-mamba.md
 create mode 100755 static/images/hydra_model.png
 create mode 100755 static/images/semi_vs_quasi_separable.png

diff --git a/content/awesome-SSM.md b/content/awesome-SSM.md
new file mode 100644
index 0000000..ca3d879
--- /dev/null
+++ b/content/awesome-SSM.md
@@ -0,0 +1,31 @@
+---
+title: "Awesome SSM"
+date: 2024-09-05T10:39:29+01:00
+draft: false
+---
+
+Similar to the [Awesome T5]({{< relref "posts/awesome-t5.md" >}}) series, this series will cover a bunch of posts about State Space Models, their extensions and applications.
+
+# Basics
+- [Mamba, Mamba2]({{< relref "posts/from-mamba-to-mamba2.md" >}})
+
+# Bidirectional
+- [Hydra]({{< relref "posts/hydra-a-double-headed-mamba.md" >}})
+
+# Theory and Limitations
+- [The Expressive Capacity of State Space Models: A Formal Language Perspective](https://www.semanticscholar.org/paper/The-Expressive-Capacity-of-State-Space-Models%3A-A-Sarrof-Veitsman/e7f47e8393c697696a3fccd9ff906dfdb49fe736)
+  - look at SSMs through the lens of regular languages
+- [The Illusion of State in State-Space Models](https://www.semanticscholar.org/paper/The-Illusion-of-State-in-State-Space-Models-Merrill-Petty/917479a7a72ee7c1fb320c14d770e30ef322ef28)
+  - look at the limitations of SSMs, especially when it comes to tracking state in Chess, Code and other domains
+
+# With Graphs
+- [Graph Mamba: Towards Learning on Graphs with State Space Models](https://www.semanticscholar.org/paper/Graph-Mamba%3A-Towards-Learning-on-Graphs-with-State-Behrouz-Hashemi/2dda6da7375bf5e8bcf60f87b17ba10757f3bc57)
+  - we leverage SSMs as an alternative to Message Passing in Graph Neural Networks
+
+# Distillation
+- [Transformers to SSMs: Distilling Quadratic Knowledge to Subquadratic Models](https://browse.arxiv.org/abs/2408.10189v1)
+  - the idea is to take a pretrained transformer and distill it into an SSM
+
+# Reinforcement Learning
+- [Decision Mamba: Reinforcement Learning via Sequence Modeling with Selective State Spaces](https://www.semanticscholar.org/paper/Decision-Mamba%3A-Reinforcement-Learning-via-Sequence-Ota/9b8130a2a5d3398f4993f540ddd01d440d99d62e)
+  - apply SSMs to Sequential Decision Making
diff --git a/content/posts/from-mamba-to-mamba2.md b/content/posts/from-mamba-to-mamba2.md
index af7c829..26136b0 100644
--- a/content/posts/from-mamba-to-mamba2.md
+++ b/content/posts/from-mamba-to-mamba2.md
@@ -11,14 +11,8 @@ externalLink = ""
 series = ["Awesome State Space Models"]
 +++
 
-# Abstract
-This is not my first gig where I write about State Space Models. I already mentioned them [here]({{< relref "posts/hungry-hungry-hippos.md" >}}) and [here]({{< relref "posts/butterflies-monarchs-hyenas-and-lightning-fast-bert.md" >}}). Now what is the deal with this Mamba(2) thing?
-They are proving to be an alternative to the strong Transformer++ architecture (Transformer++ models like LLaMa are based on Rotary Embedding, SwiGLU, MLP, RMSNorm, without linear bias, sometimes with grouped query attention and/or sliding window attention). Hold on, if this Transformer++ models work well, why do we need altneratives? There are multiple reason:
-
-1. **Performance**: Self-attention with a causal mask has a quadratic bottleneck, and as the sequence length becomes longer, this becomes a problem. Resolving this issue is a field of active research. One possible solution is to use Linear Attention, which we will cover since it is one of the basics Mamba-2 builds upon. Another possibility is to use Sliding Window Attention, which constrains the context for the next token generation to the past N tokens, where N is the window size. This alleviates the memory requirements, though it makes the model less capable. Technically speaking, State Space Models scale linearly in terms of sequence length (quadratically with the state size, but in general, this is fixed).
-
-2. **State**: Attention is stateless; there is no hidden state that is sequentially updated. This is both a good and a bad thing. It is good because if the model needs to look something up, it will take into account everything it has seen before. This is super important as it enables in-context learning. It is bad because it has to keep track of everything it has seen before. With state space models, we have a hidden state that is updated every time we have a new input. Because of this, we can view the hidden state as a compressed representation of everything it has observed before. Again, this is both good and bad. It is good because this compressed representation is smaller than the whole sequence, making it more efficient. It is bad because the hidden state has to be large enough to store everything that is important and at the same time remain relatively small to be efficient, AND (it is capitalized for a reason!) the mechanism that updates the state has to do it in a meaningful way (this is something we are going to explore in more detail).
-
-3. **Alternatives**: I get it, this is subjective, but if we only do research into Transformers, we may never find anything better.
+## Abstract
+This is not my first gig where I write about State Space Models. I already mentioned them [here]({{< relref "posts/hungry-hungry-hippos.md" >}}) and [here]({{< relref "posts/butterflies-monarchs-hyenas-and-lightning-fast-bert.md" >}}). Now what is the deal with this Mamba(2) thing? They are proving to be an alternative to the strong Transformer++ architecture (Transformer++ models like LLaMa are based on Rotary Embedding, SwiGLU, MLP, RMSNorm, without linear bias, sometimes with grouped query attention and/or sliding window attention). Hold on, if these Transformer++ models work well, why do we need alternatives? There are multiple reasons:
+
+1. **Performance**: Self-attention with a causal mask has a quadratic bottleneck, and as the sequence length becomes longer, this becomes a problem. Resolving this issue is a field of active research. One possible solution is to use Linear Attention, which we will cover since it is one of the basics Mamba-2 builds upon. Another possibility is to use Sliding Window Attention, which constrains the context for the next token generation to the past N tokens, where N is the window size. This alleviates the memory requirements, though it makes the model less capable. Technically speaking, State Space Models scale linearly in terms of sequence length (quadratically with the state size, but in general, this is fixed).
+
+2. **State**: Attention is stateless; there is no hidden state that is sequentially updated. This is both a good and a bad thing. It is good because if the model needs to look something up, it will take into account everything it has seen before. This is super important as it enables in-context learning. It is bad because it has to keep track of everything it has seen before. With state space models, we have a hidden state that is updated every time we have a new input. Because of this, we can view the hidden state as a compressed representation of everything it has observed before. Again, this is both good and bad. It is good because this compressed representation is smaller than the whole sequence, making it more efficient. It is bad because the hidden state has to be large enough to store everything that is important and at the same time remain relatively small to be efficient, AND (it is capitalized for a reason!) the mechanism that updates the state has to do it in a meaningful way (this is something we are going to explore in more detail).
+
+3. **Alternatives**: I get it, this is subjective, but if we only do research into Transformers, we may never find anything better.
 
 Before we dive into the details of Mamba-1 and Mamba-2, let me give you a brief summary:
 
@@ -26,87 +20,87 @@ Before we dive into the details of Mamba-1 and Mamba-2, let me give you a brief
 **Mamba-2**: Mamba-2 is generally just a simplification of Mamba, with stronger constraints on the structure of the hidden space update matrix and moving some projections to the beginning of the layer. This enables the usage of common scaling strategies used in transformers, like tensor and sequence parallelism, and the ability to split input sequences across multiple GPUs. Also, the authors build a solid theoretical foundation behind SSMs and Semi-Separable Matrices and prove they have a primal-dual relationship with Linear Attention.
 
-# Mamba
-## Structured State Space Models (S4)
+## Mamba
+### Structured State Space Models (S4)
 
-Structured State Space model is defined as a one-dimensional function of sequences $x(t) \in R \rightarrow y(t) \in R$ mapped trough a hidden state $h(t) \in R^N$.
+Structured State Space model is defined as a one-dimensional function of sequences {% katex inline %} x(t) \in R \rightarrow y(t) \in R {% endkatex %} mapped trough a hidden state {% katex inline %} h(t) \in R^N {% endkatex %}. -The actual model consist of four parameters $\Delta, A, B, C$ and we can express it as: +The actual model consist of four parameters {% katex inline %} \Delta, A, B, C {% endkatex %} and we can express it as: -$$h(t) = Ah(t) + Bx(t) $$ -$$y(t) = Ch(t)$$ +{% katex %} h(t) = Ah(t) + Bx(t) {% endkatex %} +{% katex %} y(t) = Ch(t) {% endkatex %} -- $A \in R^{N \times N}$ is contained to be diagonal -- $B \in R^{N \times 1}$ -- $C \in R^{1 \times N}$ +- {% katex inline %} A \in R^{N \times N} {% endkatex %} is contained to be diagonal +- {% katex inline %} B \in R^{N \times 1} {% endkatex %} +- {% katex inline %} C \in R^{1 \times N} {% endkatex %} -Because of the constraints, we can represent all matrices with N numbers. To generalize to a multi-dimensional input, we apply the SSM independently to each channel, making the total memory requirements $O(BLDN)$. +Because of the constraints, we can represent all matrices with N numbers. To generalize to a multi-dimensional input, we apply the SSM independently to each channel, making the total memory requirements {% katex inline %} O(BLDN) {% endkatex %}. Since we work with continuous time but process discrete data, we need to discretize the model: -$$ h_t = \bar{A} h_{t-1} + \bar{B}x_t $$ -$$ y_t = Ch_t $$ +{% katex %} h_t = \bar{A} h_{t-1} + \bar{B}x_t {% endkatex %} +{% katex %} y_t = Ch_t {% endkatex %} -- $\bar{A} = f_A(\Delta, A)$ -- $\bar{B} = f_B(\Delta, A, B)$ -- with $f_A, f_B$ being discretization rules. For example we can use [Zero-Order hold](https://en.wikipedia.org/wiki/Zero-order_hold) +- {% katex inline %} \bar{A} = f_A(\Delta, A) {% endkatex %} +- {% katex inline %} \bar{B} = f_B(\Delta, A, B) {% endkatex %} +- with {% katex inline %} f_A, f_B {% endkatex %} being discretization rules. For example we can use [Zero-Order hold](https://en.wikipedia.org/wiki/Zero-order_hold) To actually compute this model, we use global convolution: -$$ y = x * \bar{K} $$ +{% katex %} y = x * \bar{K} {% endkatex %} - K is our kernel that is implicitly parametrized by an SSM -$$ \bar{K} = (C\bar{B}, C\bar{AB}, \cdots, C\bar{A}^k\bar{C}, \cdots)$$ +{% katex %} \bar{K} = (C\bar{B}, C\bar{AB}, \cdots, C\bar{A}^k\bar{C}, \cdots) {% endkatex %} -The benefit of this is that we can use Fast Fourier Transform to compute the convolution in $O(N \log N)$ time. +The benefit of this is that we can use Fast Fourier Transform to compute the convolution in {% katex inline %} O(N \log N){% endkatex %} time. -### Linear Time Invariance (LTI) +#### Linear Time Invariance (LTI) -Just from the definition above, we can see that the $(\Delta, A, B, C)$ do not depend on $x$ nor $t$. This is one of the main drawbacks and the reason why State Space Models were struggling with in-context learning. +Just from the definition above, we can see that the {% katex inline %} (\Delta, A, B, C){% endkatex %} do not depend on {% katex inline %} x{% endkatex %} nor {% katex inline %} t{% endkatex %}. This is one of the main drawbacks and the reason why State Space Models were struggling with in-context learning. 
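+
+To make the discretization concrete, here is a minimal sketch (toy values, zero-order hold, a single input channel); it only illustrates the recurrence above and is not tied to any particular S4 implementation:
+
+```python
+# Toy illustration (not from the original post): discretize a diagonal SSM
+# with zero-order hold and run the recurrence for a single input channel.
+import numpy as np
+
+N, T = 4, 10                      # state size, sequence length (toy values)
+delta = 0.1                       # step size
+A = -np.abs(np.random.randn(N))   # diagonal of A, kept negative for stability
+B = np.random.randn(N)
+C = np.random.randn(N)
+
+A_bar = np.exp(delta * A)         # zero-order hold: A_bar = exp(delta * A)
+B_bar = (A_bar - 1.0) / A * B     # B_bar = A^{-1} (exp(delta * A) - I) B
+
+x = np.random.randn(T)            # one input channel
+h = np.zeros(N)
+y = np.zeros(T)
+for t in range(T):
+    h = A_bar * h + B_bar * x[t]  # h_t = A_bar h_{t-1} + B_bar x_t
+    y[t] = C @ h                  # y_t = C h_t
+```
+
+Since none of {% katex inline %} \bar{A}, \bar{B}, C {% endkatex %} depend on the input, the same loop can be unrolled into the global convolution with the kernel {% katex inline %} \bar{K} {% endkatex %} above.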
-## Selective State Space Models (S6) +### Selective State Space Models (S6) -![S6](/images/selective_state_space_models.png) +![S6](https://n1o.github.io/images/selective_state_space_models.png) -One easy fix is to take the same model as above but make the parameters $\Delta, A, B$ functions of the input: +One easy fix is to take the same model as above but make the parameters {% katex inline %} \Delta, A, B{% endkatex %} functions of the input: -### Algorithm +#### Algorithm -- we have input $x: (B,L,D)$ (Batch, Length, Dimension) -- output $y: (B, L, D)$ +- we have input {% katex inline %} x: (B,L,D){% endkatex %} (Batch, Length, Dimension) +- output {% katex inline %} y: (B, L, D{% endkatex %}) -1. $A: (D,N) \leftarrow \text{Parameters}$ -2. $B: (B,L,D) \leftarrow s_B(x)$ -3. $C: (B,L,D) \leftarrow s_C(x)$ -4. $\Delta: (B,L,N) \leftarrow \tau_{\Delta}(\text{Parameter} + s_{\Delta}(x))$ -5. $\bar{A}, \bar{B}: (B, L, D, N) \leftarrow \text{discretize}(A,B)$ -6: $y \leftarrow \text{SSM}(\bar{A}, \bar{B}, C)(x)$ +1. {% katex inline %} A: (D,N) \leftarrow \text{Parameters} {% endkatex %} +2. {% katex inline %} B: (B,L,D) \leftarrow s_B(x) {% endkatex %} +3. {% katex inline %} C: (B,L,D) \leftarrow s_C(x) {% endkatex %} +4. {% katex inline %} \Delta: (B,L,N) \leftarrow \tau_{\Delta}(\text{Parameter} + s_{\Delta}(x)) {% endkatex %} +5. {% katex inline %} \bar{A}, \bar{B}: (B, L, D, N) \leftarrow \text{discretize}(A,B) {% endkatex %} +6: {% katex inline %} y \leftarrow \text{SSM}(\bar{A}, \bar{B}, C)(x) {% endkatex %} -- $A$ is still diagonal -- $s_B(x) = \text{Linear}_N(x)$ -- $s_C(x) = \text{Linear}_N(x)$ -- $s_{\Delta} = \text{Broadcast}_D(\text{Linear}_1(x))$ (we choose this due to a connection to Recurrent Neural Networks) -- $\tau_{\Delta} = \text{softplus}$ (we choose this due to a connection to Recurrent Neural Networks) -- $\text{Linear}_d$ is parametrized projection to dimension d +- {% katex inline %} A is still diagonal {% endkatex %} +- {% katex inline %} s_B(x) = \text{Linear}_N(x) {% endkatex %} +- {% katex inline %} s_C(x) = \text{Linear}_N(x) {% endkatex %} +- {% katex inline %} s_{\Delta} = \text{Broadcast}_D(\text{Linear}_1(x)){% endkatex %} (we choose this due to a connection to Recurrent Neural Networks) +- {% katex inline %} \tau_{\Delta} = \text{softplus} {% endkatex %} (we choose this due to a connection to Recurrent Neural Networks) +- {% katex inline %} \text{Linear}_d {% endkatex %} is parametrized projection to dimension d -### Selective Scan +#### Selective Scan -Since the dynamics of the model are dynamic, we cannot use global convolution anymore. Because of this, we define selective scan, which is a hardware-aware algorithm. The actual implementation is rather [involved](https://github.com/state-spaces/mamba/blob/62db608da60f6fc790b8ed9f4b3225e95ca15fde/csrc/selective_scan/selective_scan_fwd_kernel.cuh). The main idea is that we load the parameters $\Delta, A, B, C$ from HBM to SRAM, perform the discretization and recurrence in SRAM, and write the final output of size (B, L, D) back to main memory (HBM). To reduce memory requirements, the intermediate steps are not stored but recomputed during the backward pass. +Since the dynamics of the model are dynamic, we cannot use global convolution anymore. Because of this, we define selective scan, which is a hardware-aware algorithm. The actual implementation is rather [involved](https://github.com/state-spaces/mamba/blob/62db608da60f6fc790b8ed9f4b3225e95ca15fde/csrc/selective_scan/selective_scan_fwd_kernel.cuh). 
The main idea is that we load the parameters {% katex inline %} \Delta, A, B, C {% endkatex %} from HBM to SRAM, perform the discretization and recurrence in SRAM, and write the final output of size (B, L, D) back to main memory (HBM). To reduce memory requirements, the intermediate steps are not stored but recomputed during the backward pass. -### Benefits of (Natural) Selection +#### Benefits of (Natural) Selection Because of the selection mechanism, the model can choose what to store (or not) in its hidden state based on what it currently sees. It may also choose to reset its hidden state and start over. Selection enables the model to have strong in-context learning capabilities. -## Mamba Layer +### Mamba Layer The core of the Mamba architecture is the Mamba layer: -![Mamba](/images/mamba_layer.png) +![Mamba](https://n1o.github.io/images/mamba_layer.png) We are already familiar what is happening inside the SSM (Selective Scan) part of the Mamba. Prior to it we have two projections that expand the dimensionality of the input, than we perform short convolution as in M2 Bert with [torch.nn.Conv1d](https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html) on one branch on the other branch we apply just SiLu non-linearity (This is the same as the Gated approach found in other LLMs). After that we perform an additional projection, and we have all the inputs prepared for the SSM block. The output of the SSM block is than multiplied with the residual gate branch and finally we project the dimension back to match the input dimension. -# Mamba-2 +## Mamba-2 Mamba is a cool innovation, and it has led to multiple cool models, especially attention-SSM hybrid models like [Samba](https://github.com/microsoft/Samba) and [Zamba](https://github.com/Zyphra/transformers_zamba). However, the authors recognize some of its shortcomings. Its biggest weak point compared to Transformers is the lack of research in terms of scaling. For Transformers, we have multiple system optimizations on how to split up a model or how to split up processing long sequences into more GPUs. Here is two of them: @@ -115,190 +109,190 @@ Mamba is a cool innovation, and it has led to multiple cool models, especially a Mamba-2 is designed in a way that allows for Sequence Parallelism by passing the recurrent state between multiple GPUs. Tensor Parallelism is possible because of independent parallel projections of A, B, C, and X inputs of its SSM part. -## Semi-Separable Matrices +### Semi-Separable Matrices -This is a special structured matrix. We say that a lower triangular matrix $M$ is N-semi separable if every submatrix contained in the lower triangular part has rank at most N. +This is a special structured matrix. We say that a lower triangular matrix {% katex inline %} M {% endkatex %} is N-semi separable if every submatrix contained in the lower triangular part has rank at most N. Here, we are more interested in a special representation of N-semi separable called Sequentially Semi Separable (SSS). 
-### Sequentially Semi Separable (N-SSS) +#### Sequentially Semi Separable (N-SSS) -A lower triangular matrix $M \in R^{(T,T)}$ has an N-sequentially semiseparable representation if we can write it as: +A lower triangular matrix {% katex inline %} M \in R^{(T,T)} {% endkatex %} has an N-sequentially semiseparable representation if we can write it as: -$$ M_{ij} = C_j^TA_j \cdots A_{i+1}B_i$$ +{% katex %} M_{ij} = C_j^TA_j \cdots A_{i+1}B_i {% endkatex %} -- $B_0, \cdots, B_{T - 1}, C_0, \cdots, C_{T-1} \in R^N$ are vectors -- $A_0, \cdots, A_{T-1} \in R^{(N,N)} $ +- {% katex inline %} B_0, \cdots, B_{T - 1}, C_0, \cdots, C_{T-1} \in R^N {% endkatex %} are vectors +- {% katex inline %} A_0, \cdots, A_{T-1} \in R^{(N,N)} {% endkatex %} To express it in matrix form we define the SSS operator: -$$ M = SSS(A_{0:T}, B_{0:T}, C_{0:T})$$ +{% katex %} M = SSS(A_{0:T}, B_{0:T}, C_{0:T}) {% endkatex %} -It turns out that every N-semiseparable matrix M is also an N-sequentially semiseparable matrix. The main const of N-SSS representation that we can compress down the parameters to $O(NT)$ +It turns out that every N-semiseparable matrix M is also an N-sequentially semiseparable matrix. The main const of N-SSS representation that we can compress down the parameters to {% katex inline %} O(NT) {% endkatex %} -## State Space Duality +### State Space Duality Let's start by exploring a special case of 1-semiseparable (1-SS or just 1SS). This can be written in the Sequentially Semi-Separable form as: -$$SSS(a,b,c) = \text{diag}(c) \cdot M \cdot \text{diag}(b) $$ -- $M_{ij} = \prod_{t=j}^i a_t = a_{j:i}^{\times}$ +{% katex %} SSS(a,b,c) = \text{diag}(c) \cdot M \cdot \text{diag}(b) {% endkatex %} +- {% katex inline %} M_{ij} = \prod_{t=j}^i a_t = a_{j:i}^{\times} {% endkatex %} M is an 1-SS -$$M = 1SS(a_{0:T}) = \begin{bmatrix} 1 \\\ a_1 && 1 \\\ a_{2}a_1 && a_2 && 1 \\\ \vdots && \vdots && \ddots && \ddots \\\ a_{T-1}\cdots a_1 && a_{T-1}a_2 && \cdots && a_{T-1} && 1 \end{bmatrix}$$ +{% katex %} M = 1SS(a_{0:T}) = \begin{bmatrix} 1 \\\ a_1 && 1 \\\ a_{2}a_1 && a_2 && 1 \\\ \vdots && \vdots && \ddots && \ddots \\\ a_{T-1}\cdots a_1 && a_{T-1}a_2 && \cdots && a_{T-1} && 1 \end{bmatrix} {% endkatex %} -### State Space Models are Separable Matrices +#### State Space Models are Separable Matrices -We make a special assumption that we have a State Space Model without projections (no B, C) and the state dimension $N = 1$. Then we can express the multiplication $y = Mx$ as a recurrence: +We make a special assumption that we have a State Space Model without projections (no B, C) and the state dimension {% katex inline %} N = 1 {% endkatex %}. 
Then we can express the multiplication {% katex inline %} y = Mx {% endkatex %} as a recurrence: -$$y_t = a_{t:0}x_0 + \cdots + a_{t:t}x_t $$ -$$y_t = a_t(a_{t-1:0}x_0 \cdots a_{t-1:t-1}x_{t-1} + a_{t:t}x_t $$ -$$y_t = a_t y_{t-1} + x_t$$ +{% katex %} y_t = a_{t:0}x_0 + \cdots + a_{t:t}x_t {% endkatex %} +{% katex %} y_t = a_t(a_{t-1:0}x_0 \cdots a_{t-1:t-1}x_{t-1} + a_{t:t}x_t {% endkatex %} +{% katex %} y_t = a_t y_{t-1} + x_t {% endkatex %} We can generalize this further by expressing any State Space Model as matrix multiplication by an N-semiseparable matrix in a sequentially semiseparable form: -$$y = SSM(A,B,C)(x) = SSS(A,B,C) \cdot x $$ +{% katex %} y = SSM(A,B,C)(x) = SSS(A,B,C) \cdot x {% endkatex %} -## Linear(Recurrent) and Dual(Quadratic) form +### Linear(Recurrent) and Dual(Quadratic) form We already know we can express a State Space model as a matrix multiplication by an N-separable matrix in a sequentially semiseparable form: -$$ y = SSS(A,B,C) \cdot x $$ +{% katex %} y = SSS(A,B,C) \cdot x {% endkatex %} -However, if we naively first compute the $SSS$ part and then multiply by $x$, we end up with an $O(T^2)$ complexity. There is a more efficient recurrent way. However, let's break down the quadratic form first, since it has a tight connection to Attention. +However, if we naively first compute the {% katex inline %} SSS {% endkatex %} part and then multiply by {% katex inline %} x {% endkatex %}, we end up with an {% katex inline %} O(T^2) {% endkatex %} complexity. There is a more efficient recurrent way. However, let's break down the quadratic form first, since it has a tight connection to Attention. -### Dual (Quadratic) Form +#### Dual (Quadratic) Form Here, we take a small detour from SSMs and look into Linear Attention. We can express the attention mechanism as: -$$Y = \text{softmax}(QK^T) V $$ +{% katex %} Y = \text{softmax}(QK^T) V {% endkatex %} This is the most common form of attention, called Softmax Attention. By applying a causal mask, we get the following: -$$Y = (L \circ \text{softmax}(QK^T)) \cdot V $$ +{% katex %} Y = (L \circ \text{softmax}(QK^T)) \cdot V {% endkatex %} -- $L$ is an lower triangular matrix with ones on and below the main diagonal +- {% katex inline %} L {% endkatex %} is an lower triangular matrix with ones on and below the main diagonal In linear attention we drop the softmax to get: -$$Y = (L \circ (QK^T)) \cdot V $$ +{% katex %} Y = (L \circ (QK^T)) \cdot V {% endkatex %} This form is way nicer and we can rewrite it using einsum as: -$$Y = \text{einsum}(TN,SN,SP, TS \rightarrow TP)(Q,K,V,L)$$ +{% katex %} Y = \text{einsum}(TN,SN,SP, TS \rightarrow TP)(Q,K,V,L) {% endkatex %} Or we can express it as pairwise matrix multiplication: -1. $G = \text{einsum}(TN,SN \rightarrow TS)(Q,K)$ resulting shape (T,S) -2. $M = \text{einsum}(TS,TS \rightarrow TS)(G,L)$ resulting shape (T,S) -3. $Y = \text{einsum}(TS,SP \rightarrow TP)(M,V)$ resulting shape (T,P) +1. {% katex inline %} G = \text{einsum}(TN,SN \rightarrow TS)(Q,K) {% endkatex %} resulting shape (T,S) +2. {% katex inline %} M = \text{einsum}(TS,TS \rightarrow TS)(G,L) {% endkatex %} resulting shape (T,S) +3. {% katex inline %} Y = \text{einsum}(TS,SP \rightarrow TP)(M,V) {% endkatex %} resulting shape (T,P) - T, S are the target source dimensions, for autoregressive self-attention they are the same - P is the head dimensionality -### Linear (Recurrent) Form +#### Linear (Recurrent) Form Until now, we have just removed the softmax operation. 
However, we can go further by changing the order of matrix association, resulting in the following: -$$(QK^T)V = Q(K^TV) $$ +{% katex %} (QK^T)V = Q(K^TV) {% endkatex %} -With this, we can re-express the definition of $Y$ as: +With this, we can re-express the definition of {% katex inline %} Y {% endkatex %} as: -$$ Y = Q \cdot \text{cumsum}(K^TV)$$ +{% katex %} Y = Q \cdot \text{cumsum}(K^TV) {% endkatex %} - cumsum is just the cumulative sum It may seem that we got rid of the causal mask. This is technically not true, since the cumsum operation is a causal operation, and we just hid it. To make this clearer, we can express the same equation using einsum: -1. $Z = \text{einsum}(SP,SN \rightarrow SPN)(V,K)$ resulting shape (S,P,N) -2. $H = \text{einsum}(TS,SPN \rightarrow TPN)(V,K)$ resulting shape (T,P,N) this being optimized with subquadratic matrix multiplication -3. $Y = \text{einsum}(TN,TPN \rightarrow TP)(V,K)$ resulting shape (T,P) +1. {% katex inline %} Z = \text{einsum}(SP,SN \rightarrow SPN)(V,K) {% endkatex %} resulting shape (S,P,N) +2. {% katex inline %} H = \text{einsum}(TS,SPN \rightarrow TPN)(V,K) {% endkatex %} resulting shape (T,P,N) this being optimized with subquadratic matrix multiplication +3. {% katex inline %} Y = \text{einsum}(TN,TPN \rightarrow TP)(V,K) {% endkatex %} resulting shape (T,P) Lets break down the equation: 1. Expands the dimensionality by a factor N 2. Uses the mask matrix L explicitly, we flatten the dimensions of (P,N) resulting in multiplying an lower triangular matrix with an vector. This just just an cumulative sum operation: -$$ y = \begin{bmatrix} 1 \\\ \cdots && \ddots \\\ 1 && \cdots && 1 \end{bmatrix}x \Leftrightarrow \begin{matrix} y_0 = x_0 \\\ y_t = y_{t-1} + x_t\end{matrix} $$ +{% katex %} y = \begin{bmatrix} 1 \\\ \cdots && \ddots \\\ 1 && \cdots && 1 \end{bmatrix}x \Leftrightarrow \begin{matrix} y_0 = x_0 \\\ y_t = y_{t-1} + x_t\end{matrix} {% endkatex %} 3. Contracts the dimensionality back to P -## State Space Models and Recurrent Linear Attention +### State Space Models and Recurrent Linear Attention The hints that there should be a connection between the recurrent form of Linear Attention and the State Space Model should be obvious. Lets remind us about the definition of the State Space Model using SSS: -$$ Y = SSS(A,B,C) \cdot x $$ +{% katex %} Y = SSS(A,B,C) \cdot x {% endkatex %} The SSS matrix M is defined as: -- $M_{ji} = C_j^TA_{j:i}B_i$ +- {% katex inline %} M_{ji} = C_j^TA_{j:i}B_i{% endkatex %} -By constraining the A matrix to be diagonal $A = aI$ we can rearrange the terms a bit to get: +By constraining the A matrix to be diagonal {% katex inline %} A = aI {% endkatex %} we can rearrange the terms a bit to get: -$$ M_{ji} = A_{j:i} \cdot (C_j^TB_i)$$ +{% katex %} M_{ji} = A_{j:i} \cdot (C_j^TB_i) {% endkatex %} The equation for M in matrix form becomes: -$$L = 1SS(a)$$ -$$M = L \circ (CB^T)$$ +{% katex %} L = 1SS(a) {% endkatex %} +{% katex %} M = L \circ (CB^T) {% endkatex %} -- $B,C \in R^{(T,N)}$ +- {% katex inline %} B,C \in R^{(T,N) {% endkatex %}} -Now we can compute $Y = MX$ using einsum as: +Now we can compute {% katex inline %} Y = MX {% endkatex %} using einsum as: -1. $G = \text{einsum}(TN,SN \rightarrow TS)(C,B)$ resulting shape (T,S) -2. $M = \text{einsum}(TS,TS \rightarrow TS)(G,L)$ resulting shape (T,S) -3. $Y = \text{einsum}(TS,SP \rightarrow TP)(M,X)$ resulting shape (T,P) +1. {% katex inline %} G = \text{einsum}(TN,SN \rightarrow TS)(C,B) {% endkatex %} resulting shape (T,S) +2. 
{% katex inline %} M = \text{einsum}(TS,TS \rightarrow TS)(G,L) {% endkatex %} resulting shape (T,S) +3. {% katex inline %} Y = \text{einsum}(TS,SP \rightarrow TP)(M,X) {% endkatex %} resulting shape (T,P) If we assume that S = T, we end up with the same equations as in the Recurrent form of Linear Attention. And that is it, we have our duality. -## Mamba-2 Layer +### Mamba-2 Layer -At the beginning, I mentioned that there are few differences between Mamba and Mamba-2. One of them is a stronger constraint on the matrix A, for Mamba-2 it is $A = aI$ in Mamba it was $A = \text{diag}(a)$. The reason to constrain to $A = aI$ is that we can express the SSM as a matrix multiplication of an 1-SS matrix, which is more efficient to compute. +At the beginning, I mentioned that there are few differences between Mamba and Mamba-2. One of them is a stronger constraint on the matrix A, for Mamba-2 it is {% katex inline %} A = aI {% endkatex %} in Mamba it was {% katex inline %} A = \text{diag}(a) {% endkatex %}. The reason to constrain to {% katex inline %} A = aI {% endkatex %} is that we can express the SSM as a matrix multiplication of an 1-SS matrix, which is more efficient to compute. -![S6](/images/mamba_2_architecture.png) +![S6](https://n1o.github.io/images/mamba_2_architecture.png) -In the image above, we can see the differences between Mamba and Mamba-2. While the idea of Mamba was to have a function $X \rightarrow Y$, in Mamba-2, we instead think of a mapping of $A, B, C, X \rightarrow Y$. Because of this, we can parallelize the computation of the projections at the beginning of the block. This enables tensor parallelism and reduces the number of parameters. This is also analogous to Attention, where $X, B, C$ correspond to $Q, K, V$. +In the image above, we can see the differences between Mamba and Mamba-2. While the idea of Mamba was to have a function {% katex inline %} X \rightarrow Y {% endkatex %}, in Mamba-2, we instead think of a mapping of {% katex inline %} A, B, C, X \rightarrow Y {% endkatex %}. Because of this, we can parallelize the computation of the projections at the beginning of the block. This enables tensor parallelism and reduces the number of parameters. This is also analogous to Attention, where {% katex inline %} X, B, C {% endkatex %} correspond to {% katex inline %} Q, K, V {% endkatex %}. -Additionally, Mamba-2 introduces a larger head dimension $P$. While Mamba leverages $P =1 $, Mamba-2 leverages $P = \{64, 128\}$. Again, this is similar to conventions in Transformer Architecture. What does this head dimension in Mamba mean? If we have a head dimension of 1, we are computing an SSM for each channel independently. By increasing the head dimension, we achieve a sort of weight-tying where we share SSMs across multiple channels. +Additionally, Mamba-2 introduces a larger head dimension {% katex inline %} P {% endkatex %}. While Mamba leverages {% katex inline %} P =1 {% endkatex %}, Mamba-2 leverages {% katex inline %} P = \{64, 128\} {% endkatex %}. Again, this is similar to conventions in Transformer Architecture. What does this head dimension in Mamba mean? If we have a head dimension of 1, we are computing an SSM for each channel independently. By increasing the head dimension, we achieve a sort of weight-tying where we share SSMs across multiple channels. -Overall, it may seem that Mamba-2 is less expressive than Mamba. 
However, due to optimizations, we are able to train Mamba-2 models with much larger state dimensions (in Mamba-1 we had $N=16$, whereas in Mamba-2 we can go up to $N=256$ or more), while also being much faster during training. +Overall, it may seem that Mamba-2 is less expressive than Mamba. However, due to optimizations, we are able to train Mamba-2 models with much larger state dimensions (in Mamba-1 we had {% katex inline %} N=16 {% endkatex %}, whereas in Mamba-2 we can go up to {% katex inline %} N=256 {% endkatex %} or more), while also being much faster during training. The model also adds an additional normalization layer, which improves the stability of larger models. There is nothing more to say about Mamba-2; it is simply a more efficient version of Mamba, incorporating many lessons learned from Transformers and the strong theoretical foundation behind SSMs and Semi-Separable Matrices. -### Algorithm +#### Algorithm -As with Mamba-1, we cannot use Global Convolution. For Mamba-2, we need an efficient way to compute the matrix $M$. Luckily, the computation is much simpler than for Mamba-1, and we do not need to implement a low-level GPU kernel. The algorithm consists mostly of matrix multiplications. +As with Mamba-1, we cannot use Global Convolution. For Mamba-2, we need an efficient way to compute the matrix {% katex inline %} M {% endkatex %}. Luckily, the computation is much simpler than for Mamba-1, and we do not need to implement a low-level GPU kernel. The algorithm consists mostly of matrix multiplications. -![Mamba-2 Blocks](/images/mamba_2_diagonal_off_diagonal_blocks.png) +![Mamba-2 Blocks](https://n1o.github.io/images/mamba_2_diagonal_off_diagonal_blocks.png) -This is an example for $T=9$ where we decompose it into chunks of length $Q = 3$, we can generalize it as: +This is an example for {% katex inline %} T=9 {% endkatex %} where we decompose it into chunks of length {% katex inline %} Q = 3 {% endkatex %}, we can generalize it as: -1. $M^{(j,j)} = SSM(A_{jQ:(j+1)Q},B_{jQ(j+1)Q},C_{jQ:(j+1)Q}$ for the diagonal blocks -2. $M^{(i,j)} = \begin{bmatrix}C_{jQ}^TA_{jQ:jQ-1} \\ \vdots \\ C^T_{(j+1)Q-1}A_{(j+1)Q-1:jQ-1}\end{bmatrix}A_{jQ-1:(i+1)Q-1} \begin{bmatrix}B_{iQ}^TA_{(i+1)Q-1:iQ} \\ \vdots \\ B_{(i+1)Q-1}^T A_{(i+1)Q-1:(i+1)Q-1}\end{bmatrix}^T$ for the off-diagonal low rank blocks +1. {% katex inline %} M^{(j,j)} = SSM(A_{jQ:(j+1)Q},B_{jQ(j+1)Q},C_{jQ:(j+1)Q} {% endkatex %} for the diagonal blocks +2. {% katex inline %} M^{(i,j)} = \begin{bmatrix}C_{jQ}^TA_{jQ:jQ-1} \\ \vdots \\ C^T_{(j+1)Q-1}A_{(j+1)Q-1:jQ-1}\end{bmatrix}A_{jQ-1:(i+1)Q-1} \begin{bmatrix}B_{iQ}^TA_{(i+1)Q-1:iQ} \\ \vdots \\ B_{(i+1)Q-1}^T A_{(i+1)Q-1:(i+1)Q-1}\end{bmatrix}^T {% endkatex %} for the off-diagonal low rank blocks -#### Diagonal Blocks +##### Diagonal Blocks -The general idea is that $Q$ is rather small. Because of this, we can use the dual quadratic form of Structured Masked Attention (more on this later) and perform the computation for each block in parallel. +The general idea is that {% katex inline %} Q {% endkatex %} is rather small. Because of this, we can use the dual quadratic form of Structured Masked Attention (more on this later) and perform the computation for each block in parallel. -#### Low Rank Blocks +##### Low Rank Blocks Here, we have three parts (the following example is the breakdown of the leftmost bottom block from the image above): -1. $\begin{bmatrix} C_6^T A_{6:5} \\\ C_7^TA_{7:5} \\\ C_8^TA_{8:5} \end{bmatrix}^T$ this are the left factors (C-block factors) -2. 
$A_{5:2}$ this are the center factors (A-block factors) -3. $\begin{bmatrix} B_0^T A_{2:0} \\\ B_1^TA_{2:1} \\\ B_2^TA_{2:2} \end{bmatrix}^T$ this are the right factors (B-block factors) +1. {% katex inline %} \begin{bmatrix} C_6^T A_{6:5} \\\ C_7^TA_{7:5} \\\ C_8^TA_{8:5} \end{bmatrix}^T {% endkatex %} this are the left factors (C-block factors) +2. {% katex inline %} A_{5:2} {% endkatex %} this are the center factors (A-block factors) +3. {% katex inline %} \begin{bmatrix} B_0^T A_{2:0} \\\ B_1^TA_{2:1} \\\ B_2^TA_{2:2} \end{bmatrix}^T {% endkatex %} this are the right factors (B-block factors) -#### Pytorch +##### Pytorch Compared to Mamba-1's selective scan the implementation is way more straight forward: @@ -326,76 +320,76 @@ def ssd(X, A, B, C, block_len=64, initial_states=None): """ assert X.dtype == A.dtype == B.dtype == C.dtype assert X.shape[1] % block_len == 0 - # Rearrange into blocks/chunks + ## Rearrange into blocks/chunks X, A, B, C = [rearrange(x, "b (c l) ... -> b c l ...", l=block_len) for x in (X, A, B, C)] A = rearrange(A, "b c l h -> b h c l") A_cumsum = torch.cumsum(A, dim=-1) - # 1. Compute the output for each intra-chunk (diagonal blocks) + ## 1. Compute the output for each intra-chunk (diagonal blocks) L = torch.exp(segsum(A)) Y_diag = torch.einsum("bclhn,bcshn,bhcls,bcshp->bclhp", C, B, L, X) - # 2. Compute the state for each intra-chunk - # (right term of low-rank factorization of off-diagonal blocks; B terms) + ## 2. Compute the state for each intra-chunk + ## (right term of low-rank factorization of off-diagonal blocks; B terms) decay_states = torch.exp((A_cumsum[:, :, :, -1:] - A_cumsum)) states = torch.einsum("bclhn,bhcl,bclhp->bchpn", B, decay_states, X) - # 3. Compute the inter-chunk SSM recurrence; produces correct SSM states at chunk boundaries - # (middle term of factorization of off-diag blocks; A terms) + ## 3. Compute the inter-chunk SSM recurrence; produces correct SSM states at chunk boundaries + ## (middle term of factorization of off-diag blocks; A terms) if initial_states is None: initial_states = torch.zeros_like(states[:, :1]) states = torch.cat([initial_states, states], dim=1) decay_chunk = torch.exp(segsum(F.pad(A_cumsum[:, :, :, -1], (1, 0)))) new_states = torch.einsum("bhzc,bchpn->bzhpn", decay_chunk, states) states, final_state = new_states[:, :-1], new_states[:, -1] - # 4. Compute state -> output conversion per chunk - # (left term of low-rank factorization of off-diagonal blocks; C terms) + ## 4. Compute state -> output conversion per chunk + ## (left term of low-rank factorization of off-diagonal blocks; C terms) state_decay_out = torch.exp(A_cumsum) Y_off = torch.einsum('bclhn,bchpn,bhcl->bclhp', C, states, state_decay_out) - # Add output of intra-chunk and inter-chunk terms (diagonal and off-diagonal blocks) + ## Add output of intra-chunk and inter-chunk terms (diagonal and off-diagonal blocks) Y = rearrange(Y_diag+Y_off, "b c l h p -> b (c l) h p") return Y, final_state ``` -#### Performance +##### Performance Mamba-2, like Mamba-1, with hidden state size \( N \), has the same training speed \( O(TN^2) \) and inference speed \( O(N^2) \). However, the biggest improvement is the use of matrix multiplication in Mamba-2, which is much more efficient than the selective scan in Mamba-1. 
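+
+For reference, here is a quick usage sketch for the `ssd` helper above (the toy shapes are chosen to satisfy the `block_len` divisibility assert; the `segsum` helper and the imports from the original snippet are assumed to be in scope):
+
+```python
+# Toy usage sketch; ssd(), segsum() and the imports come from the snippet above.
+import torch
+
+batch, length, n_heads, d_head, d_state = 2, 256, 4, 64, 16
+X = torch.randn(batch, length, n_heads, d_head)
+A = -torch.rand(batch, length, n_heads)           # per-step log-decay for each head
+B = torch.randn(batch, length, n_heads, d_state)
+C = torch.randn(batch, length, n_heads, d_state)
+
+Y, final_state = ssd(X, A, B, C, block_len=64)
+print(Y.shape)            # torch.Size([2, 256, 4, 64])
+print(final_state.shape)  # torch.Size([2, 4, 64, 16])
+```
+
+The chunking along the sequence (the `block_len` argument) is exactly the block decomposition above: the quadratic dual form inside each diagonal block, and a linear recurrence between chunks for the off-diagonal low-rank blocks.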
-# State Space Duality Additional Notes +## State Space Duality Additional Notes Overall, the State Space Duality paper introduces many concepts; here are arguably the most important ones: -## Structured Masked Attention +### Structured Masked Attention This builds upon the notion of linear attention, where we expressed the causal mask matrix L as a cumulative sum. However, we can generalize the mask matrix L to any matrix that supports fast matrix multiplication. -![[/images/structured_attention.png]] +![[https://n1o.github.io/images/structured_attention.png]] In this case, we view the attention mechanism through the following equations (this is also the quadratic form mentioned earlier): -$$Y = MV$$ +{% katex %} Y = MV {% endkatex %} -$$M = QK^T \circ L$$ +{% katex %} M = QK^T \circ L {% endkatex %} Where L is our mask matrix, which we can choose as we like. In the context of State Space duality, we choose it as 1-semiseparable matrix. -## Multi patterns for SSMs +### Multi patterns for SSMs Again, this builds upon analogies to Attention, where multihead attention involves applying self-attention multiple times and concatenating the results. We can achieve something similar by applying the SSD algorithm and broadcasting it across multiple dimensions. -### Multi-Contract SSM +#### Multi-Contract SSM This is analogous to Multi-Query Attention, where we share K and V across all the heads of Q. For attention, this makes a lot of sense since we cache K and V pairs. In SSMs, this is equivalent to sharing X and B across multiple heads of the SSM, and having C (parameters that control the contraction) be independent per head. -### Multi-Expand SSM +#### Multi-Expand SSM Here, we share C and X across multiple heads, and B (controls expansion) is independent per head. -### Multi-Input SSM +#### Multi-Input SSM Here, we share B and C across multiple heads, and X is independent. For an SSM like Mamba, we consider X as the input. Because of this, it is a better fit to have a unique X per head. Technically, we can view the S6 layer introduced in Mamba as having Head Dimension P = 1, which means that each channel has independent SSM dynamics A, and it is a Multi-Input SSM where we share B and C matrices across all channels. -# TLDR; +## TLDR; This was probably a lot to take in, so to sum it up, we introduced Mamba. Mamba is a State Space model whose dynamics are dependent on the input, which improves its ability for In-Context learning. Because we want efficient computation, we need to derive a hardware-efficient algorithm, and to do that, we need to enforce structure on the matrices used by Mamba. Mamba-2 tackles the efficiency problem by enforcing even more constraints, making the model more contained but easier to scale and allowing for larger hidden dimensions. diff --git a/content/posts/hydra-a-double-headed-mamba.md b/content/posts/hydra-a-double-headed-mamba.md new file mode 100644 index 0000000..f03bbe2 --- /dev/null +++ b/content/posts/hydra-a-double-headed-mamba.md @@ -0,0 +1,129 @@ ++++ +draft = false +date = 2024-08-29T11:53:51+02:00 +title = "Hydra a Double Headed Mamba" +description = "" +slug = "" +authors = [] +tags = ["NLP", "SSM"] +categories = [] +externalLink = "" +series = ["Awesome State Space Models"] ++++ + +# Abstract + +State Space Models are awesome, models like Mamba and Mamba2 boast unparalleled performance especially when it comes to long sequences. The only downside is that they are causal, which means they model one token at a time, looking only at past tokens. 
+Bidirectional models like [Bert, CodeBERT and GraphCodeBERT](https://codebreakers.re/articles/detail/bert-codebert-and-graphcodebert/) have been shown to excel when it comes to code understanding. One way to put it is that by looking into the past and the future simultaneously we can get a better understanding of what is happening. Hydra is a bidirectional extension of Mamba2 that builds upon solid mathematical foundations, instead of just naively taking two Mamba(2)'s, flipping one of them, and somehow combining them.
+
+
+# Quasi-Separable Matrices
+
+We build upon research done in [State Space Duality]({{< relref "posts/from-mamba-to-mamba2.md" >}}) and the Mamba2 model. The main idea of Mamba2 is to express a State Space Model as:
+
+$$ y = M \cdot x $$
+- $M = SSS(A,B,C)$ is a Sequentially Semi-Separable Matrix
+
+As a reminder, an N Semi-Separable Matrix is a lower triangular matrix where every submatrix contained in the lower triangular part is at most rank N. We can express any N Semi-Separable matrix as an N Sequentially Semi-Separable matrix. This just means that every entry $M_{ij}$ is a product of vectors and matrices:
+
+$$M_{ij} = C_j^T A_j \cdots A_{i+1}B_i$$
+
+- $B_0,\cdots, B_{T-1}, C_0, \cdots, C_{T-1} \in R^{N}$ are vectors
+- $A_0, \cdots, A_{T-1} \in R^{N,N}$ are matrices
+
+Since Semi-Separable matrices are lower triangular matrices, we can view them as a sort of causal attention mask. Since we are also interested in the future tokens, we need a matrix that is non-zero above the main diagonal as well as below it.
+
+## Definition
+Quasi-Separable Matrices consist of a lower and an upper triangular part, with a special vector on the main diagonal:
+
+$$ m_{ij} = \begin{cases} \xrightarrow[c_i^T ]{} \xrightarrow[A_{i:j}^x]{} \xrightarrow[b_j]{} && \text{ if } i > j \\\ \delta_i && \text{ if } i = j \\\ \xleftarrow[c_j^T ]{} \xleftarrow[A_{j:i}^x]{} \xleftarrow[b_i]{} && \text{ if } i < j \end{cases} $$
+
+![Semi-Separable vs Quasi-Separable](/images/semi_vs_quasi_separable.png)
+
+Hydra builds this quasi-separable mixer out of the same SSD primitive as Mamba2: one pass over the sequence, one pass over the flipped sequence, both shifted off the main diagonal, plus a learned diagonal term D. A minimal sketch of how the pieces fit together (assuming the `ssd` routine from the Mamba2 post and simple `shift`/`flip` helpers):
+
+```python
+def quasiseparable_mixer(x, A, B, C, D):
+    # sketch only: ssd, shift, flip and rearrange are assumed to be defined elsewhere
+    # forward and backward passes share the same SSD parameters
+    y_fw = shift(ssd(x, A, B, C))
+    y_bw = flip(shift(ssd(flip(x), A, B, C)))
+    # shifting leaves the main diagonal to the elementwise D term
+    y = y_fw + y_bw + D * x
+    y = rearrange(y, "B L H P -> B L (H P)")
+    return y
+```
+
+- more on [SSD]({{< relref "posts/from-mamba-to-mamba2.md#pytorch" >}})
+- the shift is required to make place for the diagonal entry of Matrix D
+
+Here is the detailed description of a Hydra layer:
+
+![Hydra](/images/hydra_model.png)
+
+It is a bit (way) more involved, here is the actual [[Source Code]](https://github.com/goombalab/hydra/blob/main/hydra/modules/hydra.py).
+
+We have initial projections of the input, 1D convolutions, discretization, and flipping. We multiply it with the actual Semi-Separable matrix once for the forward and once for the backward direction. We do some elementwise product (selective gating), shifting, and merging the results together. That is followed by a normalization and a residual connection with the original input.
+
+### Pretraining
+
+Hydra uses the standard masked language modeling objective, where we mask out some tokens and try to predict them.
+
+### Remarks
+
+One thing that stands out is that the implementation, in comparison to BERT, is much more complex.
+
+### CLS Token
+
+The authors use a special pooling technique to average out the CLS token.
+
+## Efficiency
+
+It is worth mentioning that we share a single Semi-Separable matrix for the forward and backward direction. Because of this, Hydra introduces only a few more parameters (the diagonal matrix D) than Mamba2.
+
+Unfortunately, the authors do not provide any empirical results on the runtime or memory usage of Hydra. But since it uses the same building block as Mamba2, we can expect it to give us gains especially on long sequences.
+I would like to see some benchmarks on the runtime and memory, especially compared to FlashAttention.
+
+
+## Performance
+
+On the reported benchmarks, the performance of Hydra was higher than BERT across almost all tasks, with only a couple of exceptions. Just having an alternative to bidirectional attention that is on par with BERT is a huge win.
+
+# Conclusion
+
+It is nice to see SSMs being adapted to bidirectional models as well. The research is only in its early stages, and only time will tell if Hydra will be applied in practice. There is a lot of research on hybrid models that combine State Space Models with Attention; especially the combination of Hydra and M2 could result in a powerful model that can handle extremely long sequences and still remain efficient.
diff --git a/static/images/hydra_model.png b/static/images/hydra_model.png
new file mode 100755
index 0000000000000000000000000000000000000000..37f212b35dba14e1c982a39301f5d24a184d4874
Binary files /dev/null and b/static/images/hydra_model.png differ
z_I*BFK#l>8G^mQC(4b+G0ac`Y2O#bDbG*#nDB@P8rMX)IKs>L>r>Fl7Ns}#5fvbo1 z4c{>R1u?u|&J(I`J3Ohd)I-1lJ$ym#d$p|Y&HrWxD+VC*LS6-SkP5o)=U)Cex~rk7 zxdI$2_JO{ACFPh5@U>(>AgnoHTCVdn4RnMS<=@_OZX-#4&Nl?4sHenJZq^CDfK4^Y zpMXXI7*ec7uj1Y_C=TYTXDkC+%=_h{BW30vST(7+U HT*ZrC<_2GmDVbxa(EhKkynWFal~n`*DkkpjTmTINM0lz-E;O!RrIt0CypTF?yb@Tt#b; zAflr#o8C13Eg2KQVO*8;TK+B6(5hI!F8^GFr1RDl7TMzdbKb2f*RE z&#(&g<{%h^?&rvOlI{NX+ReKa6GWO8*kHO_SR@;N^-{_T(y_zuf%~KzxLz=M8nC-+ z-z5Z=8~(~Mg6#wk*+_68l}^ro`{41C3T^}tA+r!LB04 zi|tnVT2UvS`8ikIWJKifT7MdFss$=^Mo9fbDf|wVmI;arvKO;A?(_1h(G@fleQJ)Q zcH6g;8f+pK47}MhJ{UL44K}^%vP#R!3IN1NR%E0m2*_oh_CO~R$(GR2c+bQzosVY4 z1)L{#l3X-0b@G6}_DHBaE^-B0Ob0GT49d?-p9Any2-vBCp#yu#_G2?*ca%D5$dRwvm zOz{3>U5?z1RKcvaL8k)%nitcnD7}GT%2~NJJbTf6D?$$dy=rk!pd51)msvlYE1(Gg z{Mx2OesX_^U52|xJ3ZB~$M87)e^V(Pa{icSxi?^+-^DP+jCIOnO z)X-5tf^!P6rTEh21Q4aa+pQY3a0;$l^9^g^G{U-Bu=a|>QlpMRC~(LC&on@>7UPAu zk>0*@C&a=z(8!>jJOW`Xe*X^7dtYiH+dzj(Z7DY0?mtsHZ7G`g%o_X6OQ8~W7|L05 zmh1VMe|2mVI5}h%doQj|tcjkhgNZSnQa)(HT$RqCtP3471xucM13=IVyevz^f6w zJ?ti%C&#_r={tSiS*9f2KEJ(Ttv3bC#*ZC;sMK^AB=46WCKUYoMnP+-VI@Yn8u>O`Pv)7Dx;76w}WyW#Kh@{ALAX*ncCD=wA2Sbueg0;pynUR=skN^E5EwY^eC zmBeL-x=`OGuIV`vW{ulPXcQTCOTEwQz}X;ZQfvf{a?WU&ZoJHQnq7&R%y0Ys?|cY) zK>K4x9$Y9cXQ2Y6oaIkXr=Qi7djQE9j2AbKxji-Mnn$17BM+2#So zU$3|d?oje_F?vLI5a>me$#?s#Z4=@K+BYvw&Rd!9&E>(^Ej8kgb(JcGXn-MO0HXw zc;#og*U%XIxmSJPhukdzpNCYCDR*|v1q4muE+Q~wo<6~H_V7%Y`fMCR@xlA3rRi#S zO5{)+igD%2hDP~48vcfg>N9LxTCvm~UkX%g?k8@g#H5JU8~Q0Dr(42;Y*Nh%^)+}< z$VLEUNKj)ndz744K{4lxhQo2zuiwywXpy((z-`^hJGZ#`24GR%QGD+|aksfuKytfQ zmDW9Q>W~MDy;U_>0+0w;KJS3ZI#9?@oS#uCGrZuS=jzxjKP{(wzy$GIz5>4l_&$z+ z{|a|@xJ(zljQAY7GPJ26eIM%i2Dsnua=)ne1cZPDlm6qV+dX;Q^a9OcD1f5}@oSIa z<&Rv%*H-}0_2-6shJ)y*#iJoE=e_Ni`BPwi3De9B{esbUd&Cg;bFW&%~>>6%ITH!`==7^^F5pcP&o^)RJ?tNf z8L{EZ$A69E*SHC)A$ZBZXL_V`)h?;fo2E_m<2WuHedp|Gd7vd)!V83LdKQh6;cdU% zkN6|m1Xja3Be7qFX3wp}Y-M^r=q+nw01C&I7*t^-M2XO6u4m6NpBw}%ZMnTQspu>= zE_P2DPP55DE#Qx!7)OU70c;wEB<}$5yyK5hH9+QTQ5PI&G)(y^xk8+M>E)Z=wLuQd zUjTn@Ror97!hPPVJzO12;LJwe=VOx-y>^Tm;@wkH_4Jk#2c8VXqPSkGQb$n23qujM)3s8HYtlj0Hc`_QMSWMm6Rr|4 zeUsw8d!-^y-2D8DQ^3S^Fwxi`=(3v`_xgKf1MspW9L@*WCKGAR>q5sd3p0Az@Dtl_ z)tM#O(ij5KlV27TYIEkz!<$w-84fSTRz4G|e2MxEKpFB>XoQl3h%XbUkUWomJGu6z zh67foUM`+yfy?_biN0E{Ijaea$Z*XP!$Mkq$h)3fVU&^F95vMd|@Oh$MZ3M^y$?O zmzHLIkU;$6FFr0Uh%5=jGg5&(qSxYMZ?IvDyqN)>#MXMQ$Uyc47AD~{h9D!?aM9HH zcFC9YrV{%SaPM*;TYO;s@>I?7?uR34H}UVt3VFHORL8eq63u(0+ng;a?z=&2r*kg2 z5b2Bmt;MDEm%`J~14vp18t2NJyr-4VUG7=-H}eySCI#={cxQooD{{`&s9AcoWR)<5 z`v{ep^x!3Gn=|?|KSdOPamYvU;wjew%Z3NIRe$D66De-cq6nVqhoqFi*GkJaTc^Qu zqv-aclEZ(gtmtn_mXs);VtN~B(MU|O9^!(aL5-f7bsLz0b23j(#jt!vL8YN)SWQ89 z?aVhP;5t>>N|A+Q0wuISHn1_?QaGqv<9WJ_mXD`I2(>*69~lWaZ2tej5uTCwzWkXV z?JVZS)SWy@aCy zi);y{ECOkAW^_nvi4YB#vd_1jFOr9OtucYZ>RNz?%&W)1O^FUmVo@CH92%!kNj2d2 zdkRE6@7{t>^{e*;V7?&A6_%JR0+3RpTa1(-FOx8-i6;Vzqbo3q7&l2>fz&QkYzH2- zz}UNOMB;mOz9iP=@SK#F1rMT!^oD^D8Zb)GP*5?=Kkr-+M#Hxgs0Kkxwt%`0svC|5 z*=zwZ-?7himOsudKzjh^7RLg%cqj|7MT1u%WX_F5mV^Rffh>Pt$sn!~WV;7udML}E zXEIdZi|ZajS^mC_L46?fC;VI-3uO8G690c_Vn>W*juE{ywF>>h%N{G8)txvk#p|I~9lZwpRr zH(?K$kL-uh{#I)uSpwpmMRJK;t!M1_ZBa!X;grzziJb)=|2{c7s%3O<`U)N&C|;%a z%2`-csY~cUP|OCQ?$p(fKv@RO(N3273QsSo8lUYy8XFR8DA=Mjq&s$6oN@z#EpkX@(OYrHCbyYUx#&Bv5zLr+zljR4V82HV zqYyJAfcc(Iw*$EpyW$y+=}I5~T9B}4MJPnG`M!KZF}+z6tO204DC{(bukWHMl|FAP z3!ODtz(zYv^Wrcy?+5-%3D-N#Iz(P-ZTxOX3SmdJ(Pskz|(EgJWdvxHCP_)JMKKnuEw|>N#YA$ zvmtZ#fXj3_=Qmu|Ol#_~g~Q?c;{}HsEWl3_Xt{6^)2qBuL?f|_VXp$GP5_>zYpqfH zH3!;fn`foMmfFekh92Cnl$e6-Gn&CkJ9t;1W5|nRUCo5HQJ?zsWwA4p7~r&hW~H=J z0OM2@+40laTAp0#I>XdS$Z#_a3P*F@6tLLr|p&Apf>vs240m~feL-JMSwM5>ey*vD%NYsP==S$ 
z&zlG5@;xX`?9lQ|`ur;;A7hmC6RgYQIYO@I6CTlBETBBP%dL*2s5@K&4-3^0>shFw zVv>!3?abe@I|XccDa(e9N7Xs6M^hCsP z9f8L~m*+kQ)9rq2F6WQ3Prr)-T*NvELO{KB5)&3piCu!pw^<-g5by?I_XPeF=?g%6 zk44t2T=BEO!q|L63UsEI%24E$%gsw79%xzH+e?%YPT2WzO!nL)Rsc6kC^gsS?$N90 zcfbQ|q3a%n2@BpcS_DD$kYUiQq!8Cq&KWa|J92iBiv2 zp`e{)Xm}S~xIb&rS-=e{@|>E^&Mbw!3lCi<;cFj1al;nS_Y|F9v`D}uw|{CI2(@Kd z@PHNL9d3~jJoUj#8>ry_uD^rP%2}zMr=REZJYSIx5TW?Xh1RgH5E%QT6n6j2plX$h z${7#WT0tTXgmHop!l~d9KCmUmu05&UcyoIzRv z?Cczw*Q#Cq^zcl`{%~KR&6W1NETVUlQmWeoFV%Q0$GHA`G=sg*QIXxu`OOyFyZE)A zPK=hf2Nj{56 zJQrTuNv!ewc*N)Tbt_(OxR|2i3n3l;+AUAHOHnE2bgnyS6E~9@I4e}Nla{?Jxnue9 z3m>%E?c#rowk7XTtfJ{nco@OsQ_Z2!ptl;S&bt^70tpnxChh)?OuZ5dpAGqy85M0u zN_mR;qs~x7Xuh701urq*quxKX)t(l_+a|#vbQM6fg|FAhJr}9KT!HS8gM0{PC5s#U0+m1F|3a zi%D8#*2wvX^0+UDvuVOE>}wfwpOTE5B^pQwi>)a99v6}SS!NOZ$R%RTVOha0mLqK~ z|G;ZZ{H(z_;k10lV=XkXm}8FDU7nf7*x`9DtMm5BAfIEm{ocsKQ2vNwdBekUMm`T=V3wzfx-5DK= zQza@~W;$NbvV0|i?w684Po;L^yyUv$VwK@OOPQZ0`s!od5hCeq)QBpKPj6RU(y_`a z%ywhN`%q8gs7^77WlCvuVd=S;)62d|jupC=!-Oezd4X3%OHbqqW5rcNubSZx=+vj) z)1D31%k2orWnIy#r)s<<0z`x^%iw{eaoOubWGC6^S7tL4{f&3m(-b3NE6sAh-2Le= z`{79ukuJIBF8Sto*5O-yG3n;8mdOGIcfAM+u?$zLu1GD;+ z(_0_9A+kx?n&X7}{4vr{z(9fBKN60@CO2ruh2EyhkuLFe*iKy~4i^`g z^zv5TCx4NV?_oNWQoK2{F*tm~{>>{tI~+L0j7NbN)OegC%ooXeAw_;+fa z@jLf<`bFt~CMOWnso3L4e)PiyPx# zVD9Cok1Z0YwTcm9S2r~aR$Yd;?d0x$bZ~r=K%#qEhOrP~Zl{&A((yZ1__5Se{hF{# zhF^-Q{9pJ=IeqX7AD{cg*qGjEFv|T@6G!3;Me^mKIX}mEW#%C#I>d3wwz18YT)m6z zpkS?UZz*YQ%yz)Od9Is8NyW2%TVZsiN_WnhFrPK%72yN+$jS!PjYy3qitroc+ChvI zQioq+JrRb$h5KdO7qo8oQPaVs@9lXY?4NET~P@Bu<6!=*hHi zJh3G4-J8n`VM=?>>|qd3X)xr6c;{LFelkwtD)Xr+KR)|{F1rG9wTD8i$gj~0*=HVA zjBP`XRd+M?jPf)={H8o@vnhraZt`(+PaaQe(+o1=U#x>L8kJ35Hrg&-3)0mJUIV}@ ze(kSe(MOY_PdtP?{Oc;qzv{j?#{S`wmlJBXZYHe!W$=lu7MS&#>+nOyC8KW zV7S7WL9wjDYtlbYaj(v#j6N?!zZw>sYJ9vBlY=c0DbVi9HguBicVo8;EfN=d7VoM(!$3^mXgfY0l5I#}|9~C1HvL#lW*sbdRHHX zWy<2yyZYbxHa4c>Z*&X{@l5bMpa3!qvtT$xVX>Mw3i!;!y2M4)A zWr>5kqcSd4KhK79Dn~Zr_{aNR7U0pnG_E2Y+?Nyf9d_{8ikYD3yZogfXETg12 zW{o<&hl|FiAlh!zfw-4r+~8RAxW8dnY%*go52b|Py;Y*r^&~3op}1SVM;H*MztR)X|UIsGDY*353RNV_f-7K-o3VzoZs!n2BNAD#+Dvf zg*p@zPYbxz8@3=%y$G$)ND!wYR_KAF;k~2iFdwOU^0kMMTHC=}c9-|y-U`=?c^iE5 z$j`Zx-@+{`j7zRySU}55>p^E+Gbm9}#+J=>^vcRQ_z;s_BM7+VQKR=$Lg4CT(yeL=92$3rEu2O+W%WAs` z*QaOygaSQpiqYr^Du}Up)7U zW!KZVtdE3~NLmBd2L#$XzDluF4HRK4WE#(;)s?l|2Z_^9mOeVJKP|V@G&=6FAK&?s zyZI(MH|5Zm2t0B9-%W6MegZ7fEM9$Tup!AtaOju2LQl<0Bue532>B^RxR#{&lAETc z2aWbqv^o|<=}-8(&ET0gMI#A#oK3upQGkPvf|iT@>DR*j6iowW9vXwvH?aQRirnnW zY1Qs?f&O)Wf(;%H3tYU!^J#r4uN_RYT?whVHd2lOn%b?`#%9a@7MA1a&B^5{ApzVK z5~sonS*S`Nw3qqsF`?K?pHFvKw`xKIGhvx)45>GW5dHd|_pLIwa;#sCG3Mj%=ll7c zmi1@Uq({maY$w#ZKF*z)3aGgW0vNEo{)w)ngo{kSB+yyN_0%;q4E3SI`6Pv6o$=Hq zo=h{GIv(hz6tK#+xN3qX+e){;lu4cSjE?l{JcoIlCsl~Qjra4GqnNXI=Uv7(Zl7b7 zMVJYV4qu+gBAg>koLR$%$r`z`KBvM&f*&3&yhRNoa?wU zs88KzCwKVx<+}^@X87X=A8?d0G$i4uN=skww%GvUk=%1VdYN25LtxvL#Hx1k?Aof( zGnx~BNYfzTSnHBdBkw%Qt}MuHu0;_AGn<c2Q$7$8$i(u@V%qZPFvUO z3hgk(3&RPqA1X%uc@JmsCh4SXop$c^PMO)uaW9@|L&TmeP&-N*fn|p0(jF|>SosYi!)Nv3%Wm=)`-b-zN<#8la zZto4bkpO#McB|jhWv_-p%}Fo)(a}hjm+?{ZaxMDyZ5=&458d`+jqpfZJd_sy^i2b6 zIrwYxB>v~lHM)V-SUdHY^=Pwm&#f%fo^L{Z>SyE&Jln=kT-fFgP~UoyUwY)5HE0S# zY6o%@|2vE`AEuqirKMOeDrfup$>tix@8zAjYKDBqpO8IK)H@Vh7^=`TJsfM}pQT;$ ztgFkT@&=`Z*oQ22ElL;Lwi7MFb>0y6~(A);vS!OfHq~ zk5xv|lo6`b9w|>co4kVSq=mKgu6Q+^dUA~XjoT%BYJUyv)gPR?93;3d`+2`RsPrt% zwA4$|IPG-X&v>av>OYAyfo0P(;vdC<`z}I|b&m*xj*@5o?oJFpAU$8$qI@99j7H~m zUV5zmLAGJj^hRY`bp2tXt5La)Av}qdXL&FCLDXyNg{?06)sL!EDt+s}zdj?Vm*sO9 zRcrXuoqqknUtP#iL_|y}jx?@xbO?MdQ%Sly_2-hfPu7gBGYsQ-Pw{uowzu}3;$%nw z?vVczAX#sRQdM- zEk3?--h8MfYDyjL2G|XjmU1+pUv7@vA-B^FVH7k1udC$x8?{3nmi 
zCwlyweyOjwMk#Cata4;jh4y>iev?UBJk{*^^zEDI=ds2I_mwp~ds0388nlXZ`~eS) zOkQ4I*A<KfuL%b>Y+%NslRNZSc6E0AE9(%qC0*w!|c9=-%;xKso(@J|!I! z*LdhYhm2H34I#C?xyo2c`$;n8^nKGN)ep-TETgUE)43D+qV`;by7mE4k*mCV4~P3S z(y)ExsTAMX7NXqAQ~Ver>(^_yIywhOl@`XdsOi8oLz$VX+KXO2ip@$Cvg@~MxX3=g zki_;AzcYP;U)+;LsyB}&ll&sb)?zBlD8DIS;%m1;73)JMGEu(TZXvGs^c;LQxgH3u z4d*fRY0|goqW8}2XI~WWpqLDd`8ec8x$E$ZeLvlAoU!jRGGN#Bjf{SxbWnb3#EsAX zeLVN3s6R#wuZoyr!m86A<`-;adbv>H1}B+nyiU__PiY!_J2FAr=*v{1 z`I>Mr%HEEaBGZ@o3}>U;&^ql$m1_0@mi#a}mz@ehUYuj0dy%;j7AE(Ts|Dwsn0AZs z%Wzuu0u-dv96!AZmJkMGKq?u8tC+_L#Cfo1<r|V$G*z112xw77Y z#Jmaru;>|31nmyGkG&=Bw*R-H|>N4_K^omKv0pI7}N=Zwd4)vKkKa;p; zWqZ4LJqyn22P5d|O~BLYf+pKpI4R|a_iiOO_S87tvE;61SgR+XwkkNMp;rb}-w9Ur zHzc>8Cv=>%Gr-Py6TYf$iHpwL6)wFhv%gsH$+=$-Vw|$OCHM!KzWbn*k`&P1@7}g& z@iUykTr$;Nwwe&J^(`$=Rnu+bcy8$Xw`u&x7V4oG7USYxw3IR}A4zUN;}H0mj3+l+ zuwli*wNAyRYwP0-mrrnv=le%5CE4FyY1m5b)VKrcrLL*FKg6M#k+l4teA$7j5F^UY zD6g=7jJ!do5wjazRQ$W=lX}PJ-cdRl!aA8HlJP8Bm)J2|Y`Mw&A6}EUtKD4@G_-np zyx}|qE!4rq{<2_rA(>Z`w1E3o!vHNKIm%5xQ(Qeq{Aqg3!yu922Ez5@=jn{f7}TSK z3JTy6yP2bvTjgvR?aU-#*5611q@dil-gzJd_HnXU%Hw&*1{AB%%wiD<4R8{dZ;<9k zas+*>{~gv|rfl1Lxtyzz;+ew~*{|#3gD#IhPk-Ehz@^>tO(Lr2!>XWydKZCI z%^NF&w0xV#c^l_t4hyyDB0XiRN4w--aFE}Lw3~lC`=(~>kH*dDV5D6gJHD#5u25qt z8vjTkV*7Rxfkb&zs{W_I^!d>*rmS8KW-tOo90+O+^L>UYjP^(Y_6m9%SB6PS-e7t7 z75Td`n<82EUBn*nN8kJ9U$zbRYX3>_`Mxj}gID&u2QV~-VDdEK zU=l86@B$-17~+srCMm?S;2^e z?5U98`S`8(%#BV#Jxl>AR}}BiNbYH_ZM8AH;E#p7KYAthd5>+oICo-xHF#ac!F&N5 zSkP9D{p4BK8Bu2GtV;mQ*PTW-b1j|eZDU>O4E0ByfN|;6v?nlIf+XN7( zGS$^iW<@=5-k7!|k{gMl+zM{OJnRYW_@#*rU+g^n8mY+n&9_rW`T%e?y5VUVRxFHK z)Kv?sX-@f7S`=K0ap=sr>WTPI8Fls{Ih`oVs}rZ(Dhl#6g6|LU%3jO|WfKNZ%>YjK zt|?|+aV!y_gL2_#{Yi+P*x94U2ainT{5J41;%R%s-2dtB%>SW!|2}?LodJh$zOsK3m*M-{TRwww za@Su5d;4Tx$=(SgVtVi%#W8Ce4PG&lww&!%uhsOZ@ZIBP`+icBC~lVpRB7nST5JH4 zUwQ7s=Ud0G+}MI@&-ZMZ@)a1~ypzC5Cf=oms21n)lmZUi-0eN`GjA!0V$?buQZLW4 z+me#JG9Q;s`S~=(+KM7Au6xh#I^|@dc#geJZ2(SUp~NJEp{WxHY7?4FE|V7^flAsm zOm{;j{o17fnZ}1|a+PjpOp4{xwmweG$9F9i(f{DQo{VUbcYaR?An2j$k^Di*M0RQz zeOU5$PU#oK^m*s$_XjTau0PwqO`WY7(sy=Luw~?(?qgfh9(&h<_hlrd-jt}(BwcF{ zK)#a)v;7hcIgm+i%_#+P!$xztLC}YR6(Ik~QeB}{Ks^#B`jIGnfI1wr&2G2XQvb_g z#w8~18M{vzT=g63qT}L70j+-UF!55hN8^4y?vlZk*;5n=CHR_o-C(;z=#5L}_26Sy zds`0gRTuFq)(4hkR|u;J^=Ub08u&uI(yxyf$fGTuH=THoxk^0QVZw{m0l)?K*#hV8 zIGsUTT~1f6Gag|>Fv2&i9E*R;> zwF&MJ>^}9UnU9+9SdFu1<BS`13-` zH4T1V^-f(MIkdEZdBc^jf+=ZlHWANA<`E^@^;BP1FSvKE4CR))sL!t&*6aC>;bZP@ z>ZWyNA3+)ZUpMFhR>E?+Bbcbq*~(47Oeo87!S*j!>d6|v&iyxHJ9jmB$9^+=!SFen z6Rhm6eKd_Ms*|9hNx*AKo9xZG1fG?IWJZMOI8KKy`7Y4rYTvSG2A26G;8It46*_rG z`$?1ky)?lw$qmfdTR{+JT|8@Yo+>2O6Rjk<&@a{6Jy_bq5Ja`<9Wx9DmXDBJQ%eWq zc~(Cyq=?tZqHd2_<_93M=6UJD8v1Lc=ySQUmqW(tL)#{&aq0Pud!*c4a1h^^9<`}a z_miJD3@Ur?EYiCNObfLh3 z#g|EGZDX}GyTcX@MmRxoq8QzdCG&&*(5~l{L7{XT9^a-0P?D1#vf=Satyi*o_3xrN zrWo_!cWpk0J(^Y6*^kZ-Lm?yrLVM>Kf;xg{T1S&MnE=}9{k9B((U?Avpy+z?-rm$C z*(G38xAe&kf-I{LylkqIt2}rhzZi0rV3gQDI<9k~h+@;Z|Ni`RX{DXaH{_*J# zl|}MS@M#H{Td86UI~;_)e3dj^e2?o{tMmiLw*_#i31$p#kvm5Kv#Nrist4H1*|h{R zffT7cT7)>pa{qAKJNh?alqXt`6mnKVa{Lj#U=EfljTr(#urXb*-XPpb5QX--U@-Nr zHKPRFeSA}BGFWLL2vcoLx4k$ORwwIBypMT+`iV?7$NEw^419a5U4hksrM85Xq}aze z8W0f@Rved<#NX=z36%B_Q{UhK3{0-w-z{WYV5Uk05)H7tw{{KiZL=ZIrEy)ib-3cH{G< zh!DJhyPEgS(pTGDhk&suB}M^r!jH%KJQ&zs|7N+%}z>VOAe>{HX>$d%N=XrnrKCW^Ynf-2SJ3eXUtZ8J(+J5S>aVaGX z=o_W{zH2#Okh9xyt*_K-pa~_89hs){;-Vf)Vqa#)}g*Z>woYcm2|S_9h`DvNQ{D{GO#|ZI#vd zu5uAOzM(w-3gqSV>$olX5a0rzH$|s0+(ImW`6F2bQdiGK%j1oI{S7a@16LZzpJ$j2 zsPb65JugS{)6N2Cc8y#m&86USg+UoH3}C0FXsGfi#%P&La+#eBgvrEJ0c494?kDkR z2-#o2jSQ(M0--N7ii=q*Hg 
zN?(2g1e|2?p&hxp+77@4-#l7QCu`bZRgI#sD-hb6#~&4#Eq*E7uy@+x(X76WwG5DMNMYMSqP4c)&pg4~6x@FGoZn;n~dE&4wRw*D? zIln(9lpEP$OF14!2`MtY4&4u$hFd7rNA_~aYZZ3Q_k*Fsw)D&M{F4E{ChpyXg4RNx zW7f+y3(mb-haJ+(77~=U{pq&hWus_t*e!!=GhE>?UE zzqg^=5ge3>twN~B4r^8q3@+hz#^NheG*0-3qX1}2@a?Oy2gyE()fa^43N=(4k6PEF z{LD78$DE5Te2qc}x$gb+yT|Tc6IIoXE-P(V@0Xl`dX}f6NrmL1I4FV9owCAH_tsH> zSpex;!%W2?L|rnCVKHHOpm4F%;R*G^g4Ff%G8mfI(Px9E2bgvAkEHx?NY4o$ZkOWJ0|It;#it!2x3A0Y;;m&$_p;YT>)fx4&Z5R`A^= z=JeA5#X-$W6dAX4XKW+Nk`PFy9AVe-?bD{VzgZc#bxqE`Q0?r&p)af6*GusAH3}?y z*n8ybjil*%KJ76}Ws;!g>{kHlA?Y*teK%51xvkz9~2x3i`i@! zqOWHCs2_7hlPou^p0g7TqU^SR%vdlH@t*8tYs51N42%pWxDVWl4#nsWzk3$UwNa8s zn6iP}eH|gLHmWgk#1)gHj6?J5sfE8sa_k@C!LS3f>o8#V!%Ez6{&N4RSoTSsOQQ7d z02$i@rL<_TXz}KcpI*$)7Mh2OU-n!#j}jkhFy?ZVrRX0VjN$hYf&{HVXrLm2XT>WH zkL4QIT@AXE|4!oA*|engXK1(}C6^Gsq50p$7)9Rxl~lj4k{)Y$_f?&1W#qe+B~v^` zo|?lhx=LK&$a<$3l`e@^AnttP|ANBKQ#Y$1zklYcZUQGL680SBxUA%c-x2=SmRFC5 zn?Y7~Fn^V*?@pgZa5{U#=+j*lnd%Ff@(F{X(jN3W?JFJyqK&HV0SV%7k>-C_#qDT= zK3F)QYqhTW0AW2s=wsBi_*Xo2iQap8jXmqRsc&WZJ+rb6!6_9eiz|rW*4Hn|C z9P`e@!5?m5wyx}st*Y*x(;x+?9Miz>OI-<-?}=neU*!xI=nO8onH+l6our9uNweZ? z$cf7@|6~koy_^&KdpO^m$pYNEKlTJe>?WfGpWIaF?(JtZD#SDUHT}RGr!@vKb2$@c zMEoGhugleyJKC>5G)IBilMrC4K#|s#TuBT4y0p8(2pa7kcqIXcn= zJSC}`{!%5wSe%2QmlK_Zgh*X&sPPl6Y?NqU{`cn@Xb4PPfV>@CJX-*MwIrX&dYX$G zO`PKT<9X*5JNuJn3hOUImKPgMwI8(eoK-S<8PVApn`32=Qqkv@3S{sQ5X^%DY3v4B zi8L6YGU}SIx%fEN0BY9aVUf~M{cgz_(RNN}#s`v*S=o<2xlIB+{=aD|i>Y1{qI~>s zj)mY-=c{#+w}_;`34v%HYR)c44&gn@B<2Y2dY!~O#%q-udBV$F)V7|H%6%>X1bFz^jpy!*#Kfc048b) z3umyVz;%wjcB{!xQI3tqg_!ue!}mtT(^w^L6v`mH1JwNC@L;9vGEfLFk5`pBd`wrY z^TJO6c`taT+JFV&c2C<>5e$N!vLLJ$G~lAzb&w!v0YAT~5&*m)#L}hUMdI)W+9O2ORJ#FM@Q&Fm$#kmCq{I}T(X!S zVPLVe6`ili0&)7Nb4`!3bVi*cgUS&K4E*2gjEW4mG!e&tzuhscHRtu&nx~m{HVQIA z$a3I~a2$>k3Kp{v)UQfG1Tp~d03D)SrVSqKya@uTFAFfI(HYWJ>(C>?nL15G+40kh z6?0u`Fo-})fgFKuc^6Y4I_Yh4_KNp*e7bx=OgSZy48IN6;>kLvEjJMqi>7E{_nc5M zD7Z-inFax^p_v#Mr834(%p<{l=f$2$fKiT7CvfC=7(#fvlein^lEZkfr->*=~7W zr=#PVUX-!Qd=c29sz1QX0wHgKiW|uowCmM>m-4Rg$K>9ZbmzV0nu@)!B;@1zk@pQR z0$t`V3NX+@z@7nvR(qBt?eLiSzt__1?6=~miYaW=3=yRkb=QT{A6}yW<58y6B}xPl zEGf`x_psCW((y8FM7_uE$d=M42(1>3odcBCm|72i)eDs^_o5tTvU$5i;Slh>fCQbi z%%b{Do@5cELA>iCnvpo6U$;nQxkca-Ub8uG7>f|%{CActS+8y}Tygz@-`0ddD8i87 zy9spN{bTOu@ZaQ*@HR`X+&WG>hnl-ZQ?9BA8kcSt8#c4(F~aUW zoTyAb*!9PyKl&Q-;Oda_-#A8#JM<9p_VKSWe-KCL=f_fW8c0mWsQ7Dch;~Y3$3ccx zI_jd1_xXS?-^8Mjc$AR>3_@@io`tNOKG5Ye!Qe2ubTJB*%=SnG{DUPCmolfd7$+S3 z22LJ?8Wm{}6rk)y*oL)Wk(@|GA7LvFrtEOo3;-v4Xff(%;UXsxhx$yoqnq3w!WxTLkX>RX~kq-lL{aW5M=`@g2qh z=0le}=&aAAkzm;;pRt5>=vcW8}B z5NfXpEmH3ODwo`S(wFAP%A53w2Q(r*1&Ka40&~{kS5D6&aayc7T?(V2aZ=eE*;Az* z@Nf3sW(_3)IRa4%2*Q=52*<4PBzWwK+(ib;x*D?IAoKooAQFEE^E0!Ehc>$iwEPUA z%`1?FD}|cJm$|Hn?627J{<=AEQYx18J8yY~8yS`aARjO~9v25c&8|6m7!RYp*h7$T z|Fp9(;Ows(#AJm_#rbSJcD4xvmnCFPO{hI9)=;WW4?$lz$hru@0#1)+T5!-;lvAjn z8KVW=O_*>XRd6kPz}rwFSU}j?X;$dJB^sDD6sTAi>c{u-lUP?T%<$g|# zbouEY#l*XC1Ax=o5_iaNf=YjM+A9%rkXk8uv`9~8{v&Ft9X*`8y;YcU+k?{Z3Gc3m(U?7a)v8RJBeZC!-h%7^r8sR0m)!O9 zT8z8lQ3E}$c=T=ez~~$i?dJGUgc@*_gQjxKT`!0)qOs%4NYaWdUCts*S7l- z^BCMi`KMh=Lf^q(!^joTl~lAPR}8#dVPAi-w1|0UnLGnGjWnT(@3yK>W}uANeSW47 zWq8;fyx(}}_@W(jk^Qj&30cqtlBYKaQ{i8S$HA2|;S6=J_ex@?qoTKxp+e?q4iD_OX4!20E*i9aW=Lj3~|!shgvrqe!|i7Vt2I-HlYwyC?Y9AP42c zgOb_Sdv+|ilvG9jT)%Iyw`S_SEt}!;l6d8OT%!^!pxl;Fg=TGZtS6>=0=I|G#q|H- zIVehZu~AUWpXf+iFYaFh_25Q0>fc-aKlk>8j1&I)@4brJ2`zg2$;XG6?!6-*n{QmZ KsaA@z3j7}d6-H_R literal 0 HcmV?d00001 diff --git a/static/images/semi_vs_quasi_separable.png b/static/images/semi_vs_quasi_separable.png new file mode 100755 index 0000000000000000000000000000000000000000..e2aba67b2f482e78316da0a39fb0a27c90d431c8 GIT 
binary patch literal 73196 zcmdS=byQs269$L|ZGtoo4Rr7jG`KWDf#3?!nz{ zn%w)l^Y6@hZ{Az87SgBJIs5EgRl9c8_f>_dD9PYpl4Cx3@&rdtR`TtWC&&QAuMq}1 z;+qL!-4D zemtLkMI^?)Mh>JD3{?7)=Vm3dqN;M-_Mma#P_AcwzEEp!uFS=)z%cBs*A5-fx0y29 zXu4n?uJMzkjgzG(kfY-Q&=Z6qE}UthumAb##s9m%s?2~b`rP1z0H29}_`pTNt{-VN zR%|ra=oA_i)g6jY=eRS;R>bhQWZ6Qa@4-fEeQ_Y+k32mwR55AX7PK#40(Yh=dw(Rn zaC*4+o@?{g`LsvnF$muM{S|BC^E+6#v6Gy0it^#o!=3yZ$Ni#cKS1$OZk1M}`BLs{ zaWa8NZ{@ds6&)?XQOwvEf7)y;Ww|ZjFT}<%P`SX*SsoBi*+0nbKPsS=t#-qY+vC{u z_U53yKf5hyYCvqPSg|N#IN=3uvyU(~jA7t=pB4)Tr#+#0*#c~42WYs^!(+5WgG}{T zNoy2EB_*rV)g}U7mmKE@i!q6bL}i-Q$h8*3($3CxL>eAm>v}-QBzWy-iZVzyG!rKc z1)=TZ_VfLD8yhWZ=xbE4DG*@?KOPY3N9Fnd`e(ynXhXHvFd8d&w;6p7AsE(2@ZUQO zg9{xFiC%>~rx(SPA?h8AM7~g-~OT}B3jJ!=|4;Lje~HA-S+X% zK$!D)u08wfvn(|NGho(#n%;eKvu8`+u$Wy)3o#vRTcBs6!B4;q`)Lc-j-YARB}`d= z-3N}9rH_A*+UQ`rIP@mO!t`oIV%n}3R#ORTARxLzAG1A+&Yk^9l6bWrc-)Wzsiqvv zG1_DuC!xzELp8@v7q#dv=oaa{R0}u;WBe<_K5*l@$mDK|U(!~(-HG6`sy#5BwU#27 zrGe7|{#IeW=+maz+jBOFJ8>oK0Ap<3^bl}gmgwCHZv3O|xyC{IkjFj}Ck<(ttw|X* zvCGX+Y+B;o8e%6`XolyFC9h;4ztQSKzgQXh&>}Wn1CGj$i-KQxPH^HcSV-HN2$f)W z)*T(oo=jy*mFoB-rg6*81)!`Yh!S^qyv19zFzGiwE-d~q6&S^T)J^H6NAp>1>E?PG zR9bj@%#=N%&;57lm~Y}BheR&gCWA~fmTxt5s7}YT9d%Tuo(_Oz+0eZVMAKk<8PY2r zd%Ot{YQ!$fB}zlVjVs~RK2)#F>inR#n;D8}?KffUN?vG<-N-b}2!buK{MFFg3OvNE zVRqe%ljWtN+&d22C+O;l6;0wd%4cxCKs0wZJz{bbL#o(PGT4$l?DfXuF8UDkp%NnC z_Ec2)Sy-|no$8xsM4CmMtMizg3mm5o>WU#Z><@=u5TMrZLesdP|f9?1~3PhkDe0FsW zus-C%ev*0ePD>}PYMr_z(_}NHSga=3W1vy7h_>18*Dj$fXwbjfrNgjFU>H{*Uo~%iLmmLTPYfhjP#g9< zZZN3yfAWkN@Ne~t=X1O&bm?HSwnt&vR`|8a&5@qGyRLFb6Tk@o))Jf7Pb|vwS@8Vt zZKP=^s{>C&Gf;HwVN)Z=FFdl4DHi!;Qa75N=Yjr#WYDU6&%|GoPJHfa>C3qlV~3FD zK{<&UL-H5@?OlwW@XKddC8w&zJDmEFl0P|bV(H6vLepnE!22hRg}aJkIp&o!lje-G zw)8!=Cp;GcAs41SV1Y<^M;-vvPH(Q42o~n}9dPlmJM+3Vq?3kZt(*@XijD3+fP^?) zt_)0pfi6+H-$OT+s{+*}TDp6yZHJvAZ?+Y!lEwTteky`cc^mwR&lvO#Z(beS5xOu0 z4ixM*p0DSg6_!zjaj~Y7xVtf!AKgv8l=k-Va`m1cpCPsX(J5@!4Hg(z&6Wyt;21PV z-8a&%+$_A7y55}_fByFb$CYUMI3V_^68Ll*lK)bmkc!bA-TU*&YGB?LyFGG^px&qy zde_?1D3>YoAB%*mXlJ&YgAYtS*3o7~<03&&rZa99;rGN)#_ z(caZfZP?2FSvydXbT7lPe5p9b!%fT;abd~>2*||l*(cNy0R$aU{T$9;a4xvQLwJvW zj;Iew5XU~KVRgPjB@sQN!xDvSrA)&|sl(FeU`5h}cjsy;mL%9^Y8sYA3R|^KmXy{k z9@x4K>Qy$idMS~XP9;H$?-KgKCxGb^No$mYPJEoTBUHUf%WdE#^USppN5LuX(RS0)@dq`j`wajo*C2Lo+D2Q%WxEJu+ z8Y?+~Yp6*LbXvh+v-Fd7NpS1Etk%OfnnNz}g)WO=Ud*+ioUR{1(mW{_W63iMYZ>ef zQ-j;WkW>^bO79B7HaQ`LxLtEA&-EqrC7~e-QN}1spCkXZoTeeLBcH8SWbwPsxD)^< zjz@&l^6FK|>EKpzKv5r90O*&9KhAqTwll(C8-eZjrRK5;w!Rfm?YZkx`%$^RMkUeV zf^8csn}*%9{yh~o{>xU$e;j4NhayN--2E53`i^50C3Ir*8So{#@}Ph0?%Y_@tY+Im zeb4K==%jvb-TKWM4h{cj+#4O?(kHvKKmIvBlUfG-$p(1fud=k3ya$|FKma7cL%ajI zZ#};WPqk%VOtRc2eR4u>9{cY$M|cLEU9l;F3j-WyA3a7noCV+&cH$?$;9Gz( z<%;lLW&rbCDyV|nO5DUsm3h_U-&rCKdDI`=vv0yYDZMSSg`rDL5`X${`Xthgs|=>Z zRBkqMBwFY_4VBaH^PHoku*uMaYKcL(2if~ZLw?rr_>3xU^gVV>ep*nG1TK-c*QSVo z2@@L;e01|hT6UjMX{t6-8sILys4MGK!;L83xuSHjLzZtlwiL8wY9kv#?$&Z-Q*jHI zaC9*TQtk`SONh@&@_c&tzP4E6GuvVy$^RM6;9yyPNzTi zBB{>HYd$YGTrwPi`7pMhS1KoK_~<%K=CLmA%RTwegW)0lFPiHE|NriDKtDLM>Bd(w z_9&pt2KoeHL{78JP;t)x{{F{cy8f96SV98${K`=7G>pSof*23NVt zHCoNqqf_a4;8_JdPp1Yi4h2-{XW03ArijK}d%KvwEi2_42oqcC$u(zWT|VQ4DDrYa zC=AkJ-QE$)Ka#_yO=goLamCk$T{>|*zsa$&x!2f)?>QWNASO%?s)YvA^{0s$_AJ3H z$}v3J_|1#4`k~T;Zf!)EuiT(G;O1R;TD3S)!C`KDfz$0 z3gi%s$OWqXr)ik^r5GlivBvJGctMFdrjuy7QA8zBgKKox+x6z^YUgoF64jbWN zNa-B*gH}MXSG|O$l@Mw1ozctS`xdmbsM4O-mX#lt$qx^BG3z2DVfCi)&GXaQ=L@~Th{tE|Oz{53y#*gWK^kKUDNonHPkmB0-g^x*(gdkJBkKdV+SI_8wKbo0WH+= zLK57Zj5}f1pee4}b|78p40T`l;E4Mp%l z=alg$r>(#2;aPde?i>+Zro?Kz_SJc*81i@MaZZdjZVY4S;S(LcLFo&0-a?B@uPnYl z_hTZqoWhO2i2vP}AMzlLL6<}L^?tVHe0V!Uiz(8p@&)%vd&#bH(Kq*kfCJg}bq=0E 
z74ziU1l0s8iD$KkIYE|X*=n=|3G8I&ahL4LUtLm`WO*^JM(x*q(bi|aTLQz|w5Zs% z%loKQk-O=RcNBsAOrqxPvv5MFqvKEdR(Q$1+$OG+D#^$KcVrJcA#~1Z@pPt?Eh**u z<}8kxxoR&f(VyrvR=@o;60=t&T{~k{T6K#Ky(0}}h@F%V3XHYB{7_OVj*G>8dMVps?|`l zj~|B@6|IW$U*^^*+C^23x%&j$Q&IS4b&oo`DV|?Ao&K3AiwSY`9|a#a%)}(?sSOon z3FjHw1*6UAT6C5{WM@S%*KHNAj4TB%ir=~t7LZRX=D;A+L=nuxR>hQUL}B+CJnIZI zcse?QZEOko)+Dx#ni({AMDg>3ZyKG5ilwv(=d`#B@r*0uC#jcRrBb`TG4}q!vYvxU;r8Pz(IL3wozY`^i&5}6ZO%@t|4$aeY!SO`_E6- z2(4Z?7^DqwRfzc2KpR-sTWE0Bdx-t7!)f{nw%sltW{b@t188+javAAnT3x&*n$S=A zD!hW?xJ*BqisJpu?1(AFMd*7XM0Pjo1@9)dRSIjxCd>F7`g4;ih1#sj`CM-jW!K+J zV_E>MZc8Bf(4CIKrnC_G#^~fS1LG?gNzy;D#5yz7akqCnok}3sQrCaDYLl<@Y79+c z#{6{To0E-(CpLI41I5sr{cbm6_obv+54e;YN8M{oa0$%YAXZ(8`{yTiarfI?{te5Y zD5yt%l6KD)Fyng_pDC;S6O1H%8St0L@gNz5&l6@;3l9Ehd}4u+V%?^j|CS5IeRHth z7uVIg(g9SrVHPz!vaptyO_}S4J>XQ&o@3iM{-2*$`L~x(v_b-FDF#weOdNlvLzW6o zG)zWmRMW6*n{cwF?+&g6(a6nDrJfJEc<@lH^t|h0n+d|n=Ne0Gd2z9^gv7uzmI349 z?8SE31<%zIm4D{%GAv1B_*VdN5WI0{4^_&ANlyteUjO-7U6byhogvd5pP<{;cAw7i zYblB%lUQbf=w~E`8a=srJkM%{SAtx0ov>`KYdou^b6ss*?izbFq$ida-r?{cH8+>>!OpM?NDXFt&dS;_8PO)zfKf8jyL9v_A zg!lo+WlYf8o>lMQP?8H4Z-q)XsqQ(^K7A@W6h3`&hPROUfKDcq* zMwx~(>nue|hBt-V#r@l+R+4t3)Z|-X-o)W_ZPbWEw-k5{M`GGC5eB``%;H-vPQ?NwsKvZ z*z8mk1Y*$=l(aItRjD{#bF{;&TQz7Z+bQ!_&tiib1u)sa`m{c4%vMA3Z9-7vr!Ni$ z(Z4TyhB?zwLS`s?oQQ#XbT)_8>MtAh&(iy&JCFul5&5Ojl7TFDq*EW$;ghg0u7)9d z2N~1(Kaxw)WtIO5@c5S_J!Zez+mXqt1@X1ImguuzmDV-5476`P?+k!-J7}$RzYEfD z?{nKcA_{S1K4Vi&@Cuomrepu4htb3x1jbE^2K$f(d17A?Ep2orqtqErTiFnIbC;nm z;j<;fLB#7bo`FsrGYGH58r8fr!r6)DRVraLHx65t`Ry?k5dI44h)l=a-+OHtKGx;haomhqxDFI}CXQD3Jaylo&60b(#2 ziovC8llZmH%{}iyJ#Q5E-m6n4*-vZqm_Qxrh;tIvhc;`9I*)#{;*cQ1&1V_E$^-DLcLu8y zj(0~`?UU$Z=VUvS9d(nu*u_#bb;!+?AL{$m5SyBY`8O4UNf;zd%tmgKD(HZvfxE{tp`{Htq_nzYgHY+KJR4m{l^H7OmGdIAv#u|8e zH*lNBE|Sqr>e?K$1K>9~Dukbtk?v>b;SM6nP_jKBXLA(Z;CR%KYu`Dmj1?_~zS!vC zO#OiB|8a^j(W%s;_o`_w4F&3M4HVC|3yK!r&g)7bFDU8AF-n*&XCTpk9EQVHp! 
zMOgUassS|C?h)JQF>^Nt+Rimbux&^Bb6<41U?Z-*VtfaYHkZ3Eh;J?Q<&%xax!uU&LD`VnQbo3SW_@~GoAv{x{=@y69KBgT!d9e}^1@9A4b zI)zbpf06k8{+)A17U#}10UNKD<#6!jlq&W*$_q4)7k%K+{tFY!n+nTOABk4sMqvTg*pXJYJnQB2+PDM(36)*v12}ZvS{3zU=-kP4&z1;8`9(i;n-{i~nZz2*>FeejsOw= zqB1W`fdRM(@aFlysEq&uit$2#J1_r->O4(@Ex`uF9K-%aeUuPT4i(}d`G0{KNAV5n zOKK>d?*D)rLd0kQ#6txC0y~+A(L@jrrTh#0ECmjT9g1yGH~v2{1;gNPYN{ogPl-6Y zU!*!~IbzewUyg&zb|dn;kLW2PAqh|qH{b6~Wx#2vwNl}K;%{VUXnhm*=nfxzB> z*3BXId+z-%_;q31FzG0E6*l3%; z-a||`ZYR|`y`S?p0-hRG!XYaxP)7?Zo3$S%Qc3q2RS&nsnL_PUoD(Oqixb&j*lXKo z>g^;{XmD>bzeIiaS2%L2QD)e`W%^z5oRFWJ3vCw4@Q+#)I7)0ltn1Q7Ma_GN5c>li z-Y?1o`_i+=L;x#$E5mwr_{*F?fKQTy>gOHwUA!duIf>icZTGTc4pyxANvy>**cZ^d zseoI%c*nQ%^=_#8YdJPSnzq#w@|^%%ojWbE{zd%_+E+fyj;)+WrBImT1-W_MQ<*$m z0>{_t*(gUz5Y71l>SOiDZc5%i4{3h5#~e^YyBI0o(sI-n9CdFgB~O*ReR*#_#^`9y z?K1_o3*)pr;3Yx1;@av^I&-Xp&`1Lc&I0}#oxhGxccep$@!>V#+-?W3LPja6pA$S- z|8Wa+>-yBIPF-#Ii_q(p>1#&IYC%qn8cqx&hfY;!WxAzNAKB#)7~IoXv1okmbW@*D zn_uWFf7N0)IX>%G(Y@GHg>wFJAqXwr-mGv~(Griy#IX9k%!A&=nHdDMmH~jx*RwgIi6h#7>_BJ(k!zSZ(6YYvtx`vTKNLSMWnaY}Mdj?y zlf3ls`#P3xh25#hy30ZV(>95@EbBI$xnyijjah6!x6_D-In03wgx|BD+t ztDx%6H&{8U?ca!4`3Rir46j@pR^`HQ8(t}4w8!qx9b+~igAsx>;fHA-e$y|fSG8Rn z#z6YII+HzOL%cNv>F+Xdpt1hrYwoO?`85>1foEVygBjlEjllv?paM91R+kH%{d9VX z+ob9uq(7eKmv6;hHRD*rJn6s7VIap2KF1y-<2diyA&eWX)DtUTcaJLil2ZNRHt^yQ)tZG7mMZk0yA0Usj5 zqH;L_o`+JLA|Orz!XXPfjs@)iMv|qy;ZlNNFrTkbh8Uv(QqYXsWJ=F#2EwffZ8`;u zN6<%zFbzc+B8XFY%&zkf{I!B@0;wn zJOYIcLq!Lx_W?zt;x|vEVw(Jxd3lCqXbB3GBDMvW_kFzdwHNqXr}!>vL9Os#=hrqs z5~!n*;Px>*R0?3M#usoj8dmwLHEb%)xf_-os$l!Zl9#YO3alLKyPG_vF9i!I5|IC){3-B^~Aop)YT_hP^cBbPuEE102^$ zV=>przGUX_KKkJl^3t=P(3;UGDdG*Sq8@RCsVPK?mPUZ9m7^cN~0MYbt1z2*P8 zv4reXHrT?zGaOPtvGfO1QA0c;xzyOK{T**_b87bz8$wVuhyoxpcHXYQt#AGmH&jJ|;&GUV0Q)Sd*zxNSnTIIv$^}udk_{qa(tK4_cdGj)G5%pO zIBgOmgHr`SMczXldVWMAW(|CMO_~zWY~gT)evS=?-bOt%{bDTgmQH58^%~Qw`bvS~ za+v4yIIe@(hF_xax<1r#m3hTHMm%KW?66gkvFam{b{$ou!)771R;sM*9*{9)xFMo{ z(D8}zn`4vlo`#UMIgE35N6>GE@N-p8auk=A9P9vYsN)$?agSHPl>W+aH9BoeOy{`p zMOW`(ENdEU)~(9n!hANj<0@mU`oK0g;zratou+B4FJi)?aN~-1eGD*`S@caq8 z*3SsU*>(8(TzvgU1=GqMPtWVj-VQxVRCa0(02U$7@LNF3P!NB2@6i)PuF}mt%hBqB zwgzn^L9=1QCLENU=E9S5Z<2ul&UAGr^eAgp+yu{G4IpDkguF}ZysP^=nj$5t17bAN z>00nKbwHLzZPwlF$6~m%?Xh^V^w`GIu#F~n^|SsU>q?0T;z~tXH|$>xB>8bf;M${} z+f8X_roI^bu%pd~?uEE(ttV(A^M13P6=+eMj`6)6F8_?}tJ|GUMNSuxHPtqcY!fF9 z^6f|(<$MfI-d1n}UXj#WAgSLOt9NeBCKXEYzdZ^&m1bWx1YXWM@l$55-w9R*8wnDk|x0Y1t$u&;?Z(f&J( z$H;?^GaZI`b|h_dp2a*cm4xBOp%Wxn@uMeKkxUIC!~&4kj?=6M7Jn9OTqms&%868- zQv+x_ri>$|x#idQ`zh3M0HS}P{&vXb=}y!; z@l_`*IboLko{I`XaF+a^fcUTI#`EtCcrGXg#h#CQX578P#Yve=VzlMaSJvMw##cJnJhH;-AdSYUxnWB7NJ-7%( z0~+}OX`1t^OrG2#p7l^3xO}=;ZX($rJPEdwojeGIIkcFn`WC;`9A_$`j7Yv zGREU`MM8`(ja%4}TlhKBQLxG;f=zDp)<0HOe~5ywB$7K*Q=stw0HbCgN*+|WDLyAs zB)=^C3x(1|k=Q8w(2KM-Y-dnPnm%d1Lq;!vxLXMj;JDPG%Xgk)V{@7hh;29iZibR` zgMZ_6Aecf}1F0&M;*Y|Qd3vu79N1nnI7;rZ`i7T*?wv*m1*FAbjgb z=_hY_hgKCIPmQ0$yY*-f!~pKon~w;j2Z{T6tz9uaVZ^M{cf;||5+v9tc)Tao z*_*bsK$NW5@;yZa$_%JZ_R&`$*!uLDk!VK*2CD+#);>B{@D@QSuo|tL!Fw%05nF1O zM>Apym4sWv2NO|I+Y4bbPfA4kqSl+7=+S3u9#mw_ELwN z{I3}y82cgY!&*qyI$x=$mt)D{{mW3wJIk*l5OR2`KSa3@Ddm|}sk%G_?sDU5 z&79~uq4|l2Hf}K2UHlT2yf*>vJi0uS!{CAOPi;zwY~9J1_A#` zpE{ehDGi5*h`esz5itvA7;t}lMTGXn?Ym}z+BpIU->9mwfz~vyPDE((coK<}sR7Y= zYf;6sy^tKbZM#X7d&K?+O>U@m_6HUc>&xf&pbQ4i6C+^{9)-ZcbuYJ_SBUkrM)PUD%9w!<^VdfRnH26;tpeAX_XprB^}lo15vXZb zlKkI6^*^HTW9(oe2DKbI2){s}k_4HtP5?mti`jsk0Gtpf8VQx*=h+JhPoYaHGD=wG z3qO-^7FVN4F+2L(GNkk!V9}#`qbp|^=MC<`DF0-KMtW>vq4C=xt9v<0%ei$3u(?;} zM=iE^x>4k8gV$p{6;6qPbU(g~afX)56v;0xRRI<;UNV{_fbjz7%3pRdd>9hl7c=*_ zAVKZieAg$eE$!$hT@GU}rm&^EeuFBPo0U&YF4Z(F*Nb-!;F 
z7vp?x!ogIxqd}9TQ~a_$m0!iDS4M@K7hex5w-I_yR^+u_uy#kmqGl(-k~@{2v9Zsz z!6szTF(Z8RDK*y>!4u2q+-y^zI4wR@wp~wg$F@Gk z;2HRn!SFJIe2n*Xj$EasM=&9)h#$YMLC$z2r@qZ9AL{JoGx;9FH1ifuXf6ZCTh6rg z!`H0+HPR_QCrs1w3w>@H903=A-^T4~3s%|QI#@72M@ zywB4@cNmmw$A>!{;q!=ipqxVgm(shFfJa&&&{@}{5^BOLGC&oc(%fZqa;9J#&cr_P zBlx&7a=a;W`4^VEABY0j8gdkXXg*X9uJ%7CKD#)t!>RxICtXhjz**KM!f2j*RtPI>jBO8hKXuV7VsnjXD6%be$4EXJD5^}@+1&KUH~ z5EzLk&fl?aljK*Z#+D z0i&fPRe4&ty1`8%Z7=&W@@a^LNhcTtpnr1(p3}pX-~|NF;5?>&P5oejHHcwFi>`sE z=pFof#>m05g+z+z_eZ`0`<3fG=yl1_?1k^IWQG2Er+ZLVkj|Dv6OljD-A`7ot3Xm{qGg&APi|e9v%_e(!yk zxtnch;Rghm#UorsWW;a5i)uS@9e}@T925ol|{u&R5NBB?X-aS+G$tDK)fNoypS1m{IumWd2`y}5s1iW>V$dcQ1 zIYUGwkHMJXt|lm|#6>P7QuD}}+l>sJikdUnLf&+PwnvKPdlWZlLNhvpQcIGqn<61+zm)~=L3mTcBOZrC7*sYl~Wm2ZSgLVuY z0U|1x8R!Tzf@C#-z7toisGd7mTp4`(VeuO;^0Vo7ZjExJ z)-TlQZp4~&MNTg}1tVYWD*ooftDd9>JCQTS7&}7j5R0wmhbEArPgunVrR9wj8N6?% zi5xr=AK%AyRbIup!ZMBA@N;wMT218ISQ&zk{<;QQr+xjcd9i7lX)*;Giq^TL^VWD8 z^5|fv(olxr!t$e^KMGK)wSkyplH93;SL(3H!&KEcNKKKZuE^1&QK-#O#ipi}E59rD zNV*sr$-11XsfpQCnYrqE!rO#rH>in{L@6T~0xbG|hEHy7nv#T64j`?b(_p)4V!eJ& z)U7iNub&OLtJ78J1PF~8FQ30u=t7G8G(06Z0$$WckAuZOD?UG}2@&%L`M7ud5ij~5<3v=mI-S)c zs#%jyl!}4m({UpdMdEX0D55mw4IK-USs;`W1cl~kKi8eS>c)${P3ijETu&ETyAX&w z=r75jM5)|;YUFLljeZ@i-1Fu<{bfbv^1dqeS~+gePD>h@Bfi{roeZ0t%rk8irG32v z&EXzvxaavGEV}2h>Q4p1PI|*RFf7tQ5!5QqouXT1Seb-E$LTbbj@5XbS1wP}mW^=Uqr@Z97x z*>7g0o;!^11;%S@k0SHhNS>QjpLH+D4jg~P+|Bkeyh!Yw7uM_Qd9F4h`jLp>rBG~+ z(Jx)>I|Yy6$a}(s>aDU@(B}o@FfZO-L0&|^4Z6&AzjKD zzRS@dh-3W|_4=w|kd?rcrMWsZ*MaG10Bnl?SYIa|S)zvS?PraB{EhGBPLH3-B-owI zAp=&HM-EeI!>B%E{T+e&xmD+d#4IC{v+**jTKT8gDA+=>OT(0LI(aZ*5oEHJ(dLPc4 zH`559pCGY-K`m9YLOAF)&Hq%M()7(Kx-Vt-$hJq^i`H*2&PI>B@GXU~de2|a>5VT> zvtJsnWZeFqh3MX75fZ9!E3>eUfYATR1MHh3toAM$Z%IeI%FKmPdqPuP-8l`;Q=XrP z>Y1p%>mPw(C!&di;R9tZtSh$@6Jqz)w2WW7t?+P>*U+X0CnKgt`pUSJ20S5aHXi4MZiB0vA^6%qgjx4N4O=~vfvVn;zV zUujX|9!dp7SS8-`02&Sq4X%lIXP<@O7M8_DVhVFO@XsdiTiPO65G7?9JrT>HSohHO zEz0X7m$fSnxUa+fZsuD)vFHAA6|5#~f+Val@ZVcOZ}kH!{s;(YiAEit4i6nzv9_1E z=O8N0Jrv;yyjmwRxtd1=v!9Ga!Mi?}gN^Gy&-I{u#8jz*j%x9$-BBzPa6*wmi7J;Rj6BE#|(aXIH zs4NW@f8z?rO~yPw!M8$y`DA!B%F=%-ic0rFQq#~Cy-R9|FSYT7WT}+I9-7}Amm{lQ z`aqkBT4U0|>VKGKU-0ngHe~lbEpa*XHUZijNfU}(EQY^`u!ND(>882Bk;}HcP-i5w z$dayK}D`lpE; z@lJ0=NTqv|oW(a#!1m9Qym4;G{(KcPc%WHEl*)Z%AWy$ zd(>OHyze~=_Djj3jbQ~@*q_iP|Nqf|DnG>>#o7Mq13BAcfO9B4tQcc zUqv6D2SjH0-cxZ=-pZps&2Zg)N(nujdbp>;H@&?QeZY96LI)o^Hc`oLDjulL=PyGr z2+S*$5PdeNFA;8)P}Py<10Ui8*8iALd^A#8i%0kO*jb6_ru={NE#{3`!AbI~hY?JC zDGidhBKjA)Hc9@QIRYDU>6`dW3OeI$-@&tD)$rm6KzeiUh% z`7hgQ1yoXuPJGx)j1lksQxSQgy8raPDR?#M{p~@8<&k!ydhaiHnhFd=cPP%|6y!gy z{174$x*phZ|Dwo4-e48eFST}HeZ+sY7|CzYRdCDAc8pW3Px9D*iYP}=%g3Torex_f zunD@Db%o!sEq09ULhAz)#mZx*7j1RQe|Z!gY>$0&GdieTS;gC+ff@$D=vX?9?E+?| z#_cjjgd-o8TElW)ex7w*Wibbp*=bNq)K*?xV%-#Kv^+tz{k6NJ z81Hihd{$y09-ceUleca;;{u5f=*oShG$jr`co>$8QrY;v8>%kj=l0))mJ1WJOxS$^ ztfUEfj5-c^BlydoR-h3(0+LLqOOCI}=!pHFgk#ck%%4!8DRAzXYQWxBEGVvbcUI{( z^s`WC;Ebj9RY=9AW-l`d*W6BWAgfX)Mj?l#ZpbqGF! zA{M^Pw*+awFr98Fk4)C_6;bGTrt^)yu0C*i3}+j`4;v$=hakEF8r-b$hMr3ngieH$ zezfg^hoa*Ob7w``jUGq_>lxnYE*L)Ris7S6s69focN&B?doUMFB|p@n&6!CUh*6$5iA+6I z&12X`bw_y6jz0K(%jTbH_7UjUKA|rB#W6rij%tssFViNp>~1$sv)i{^j-VH7UQ6rB zZ)Hv@ec_#b=&Vd(U+F)%Zq{GdCl7x0*l2)Jeq}NOLWvdy&=>m{C|&D*iG-BT_)w7! 
z8T1l=b%B%WnxJ=uiK#@1#VN^&nP3a#FRfs`EpA8*=qo6X-wj^w7p6aWY%RWSURWoW z_FC$qF6~)LQvy-a_xP>0BZc{zsE^4M+6OaW-RI0>=-%V}0N5L4R8+XlG(jf+4$R2h zv*u;eK?99Nn>@!(7@DzM@C{rS2_g$r<&|*x+t^?&dN}~v&3qYCZb_y^(jAc!YsG;k zgXmTWdq9*MDY(PXEaf@9swLF-O;W}Fs05OUMv28n`J)BRUiX7^yKCUb)8QBTEdIv3 z+)0Nth~X+U3M1<6)fIBxT!`TLNW5(;8NsAN2(l`j^o z1V^lKIuGwFJGRwKIv_D=qLKxL?`{r80!Y!H(Lk3JeE0zu$$4lGhGkjR;s}PCkkDpJ z#5AIaHkb;3ia6hu^U#|Gc%rP)IfTUc`P+se7XTo$Wagth(c^z-AFAa6DD-~T; zHqzrnbc)RKKZJm`V=4koP7a$H4$%@I(~vJoMB@_Ll|#~oLLrp~-jWWv{r7&|l&Lsh zBm56;#C;!}pAX!(297HhKX#9k9=AYLig~rkhkGmYKdAM=NGEVLi#EDy1DcJwfOpgl zAkB4731jPPM1F@#`|>V|B0# zhk0Ac%rkk~9g>YU$Mv99sVHXJ1?yQ&npx{)NEe!h!#3_%4NGC?HdQ}%IdR!~TUrs# zCgCz{%cb1+g~R4@(?*-vZ=uu(@;i4(*{y~WR0NOsz4`r96uX;E4@cvg1MBtvyHDNJ z^j!h5Zx)v<&Of}U*bz?l(g+&x*SeYogUQqs1rhv2$T)ovWwY9hMN9F)4rRJ~C? zbH<`m-R}#a!!1y&A)yEK4V^p&>4?S@Gb3(~dYC`5Fdx6W*QE-_<4vC+nwNSm zOr!F@zVElpb4JjjXMCrr}wpQEU~j*$NDlF{b>_Zw~sDgBuc1g z8ttP}(YT5=OE)~XI}%|QG|6>+j)Bf}9hl1tkl}1ZMC`H(jnvHi66oeU8@-Id?9%7S zIw|VmK*?}Ns#l2H?JrWqFy25NaTjHdmO#N95w(;=2`_l!d?ItUd65TCY+!b(b@I+o z-7HR(9n8m(sd)h@genjCt_A~{ibKclu=zhkG`N<#kUxswacNPmr=4hodPxaI-^cBl*WksJgk$lg3A{+Ikh2rh$0bx7ZRtgHypY`@Z!D08 z5hrv7I9lSNhgXF)Gwu1hVEGlDN|~*2-KDMITl=TyY6(iiWpN0!8hci+>=+xgh)^8B^~s zjMhg7s)@|P4HJRnmn$q8=P_H;=ovNJl+r9)iOnNAHnB^8!gTKU0RX2~-?1d?< zZbCg^vSgfIFol-^OZ>+4XRH;y^Z3Z_eW1t*`L=KS(<2KhMEk|?06a%ln~!X8J5lB` z;lr0_o7Lja4;8*LxVJB2mf@jU@m{HVyNS&~UZuFE#K5CE{H;nWyp?F|IIr_-S3mdo zZE_0iav)l%V6jsT=wi0Od&#-syTg7cTI!bR>`BmS)6|XO<+*50(g6)=I?SMYt0f)d z9py{qu-h4BC(xptoNw8PdDChenNGVp`r6m7d{*F-0nG&^!I-aGv{nNFiQcME{Sq>g zhgxv?{Bwb~IO6v}*~^$*p%z4A4qTJt*FEc*rmBnSIX7=QAHX-eI}(@9Hpi&#c#p$* zL)Yu1KK@=)99%d&=&tdS{T1@3V6NKTXIgTC(w`h~Yub}`BkE6@`>Nb80f|s^Y;_`- zVYwi>r{ww!qJ2ap(?2L{clfQcfcA?yBC1{#ggW9pHF=gLh=8g?S(QTDxT!%7*R2IM zY4~VciGl2Er-&A-qq`Z_beJOD5i&0LdE58llo?KodWq0;wv%=IR2HyXY_PJv>)%)5 zy(OOKzg&r0K)k9eWlWGpcU-EIaYP6yOp7w*|KRPd!s2M6ci#{K0|5pNI!GW19^BnE zXo3a{?lQQ$2PZ&qhu{+227&~4cNyH>Pv`sZy)VvlZuhz48M>-#y1RO+)>`lHeQP)m zfh=mj%&)on7)p+;JJ?xOMBwt3YM3lQAGF3gWP-bgpbKyHG71BGge5poqJ0{0Xf#;3!4*P&69fwSLtH;i z>DSiK(GlQ-Fh=}Hjt|1AIJYC;L+e_*`F;kJ2~A3xJ9!IjY;pA%mJ}>PK&L~{hnxJ< zAu_krE_7c&Pm<8(&AJ_x-<+Okx4mu;Y54GQfKw9MQQ1cL-R-`7b@R3>dHl0wC85D# z)lI=FHyOq2kCJf!1|r#adv>9jS8SE6r!MK7z%CN5-#wD9+|SwG&K=(NKpT8`&HrL# zM^)=?jB(sF+S4^qS}5rRZOukn`Gh-Rkpv=*FE_D?sxlX!7qij31z*3jj$S z!qgM5I$&1~B*&2ibt+ii`@9_AlHS}wE$86wnmlIYX zkkgt>b``04d= z^vpd6%7O#Q=S<&jVpU4UgS^IasTG~%@8)rB;{0%7byHo_K(qbLV*?xHU|c}4HbXo-b624EgQ+^b+0u{^eZA=We99aQ)-H{ySds536(?{Cj9X}Q*N{N3;4(#L27O87Nby}iKXm^A=19?`RYHg{SKH?#AMR#XIZ@odqx-a{vbY?P8KRPR}1J&T`%ewF1v+D|O!KSVy)q3TY_ zN5>Z&v|iETr$8PG;hfA76B6m(LeY$J-omB}lOjZxPXwbzo6cWx4mToK6!l@$Uk-b# z;8KavCI|#cV;_#HCtKs}%`pVm5bsK-n7?ub!;;{uf>E}5!Pz={ z*|kG?CzgHDYr=VEQHv_9 z#D%_a9<02Sx^%^2RqsuEoko0XCG6NQ&`N`AFgocd(igd{-rTy!G8D!)K{lxBLHC5f z(``0#9A4;t*ukkh)7iL?T)#=}=()|IW4EO`)L*9^t|zYmtXn1`eio%$FY=JD84J+OS)TB&rM!xL}U#zn#y zsq(xx4sAF35DL5~*zcS3SiU77MD!RQ$_s_l3ZY9b?J2IY3<*ZR`Q>NAHl99gG@8Ar zN`QGxOAn{WyQKTh!yUZ5k7_y?&XH{l^f0b$bk%aj=2>x^2#sFx-{KQ^5>SdtO@IuepCo;jN^O0 zFRRg)m}2RLTw`Y|c%a*cNF?3ZKXuMcE#*&R6h?5Tuz*5>|61a4mq0zRb6ii@7FR0T zTbPjEIE5EBEkU75-~i0J^*|1Qrp=Iz(r67aKPPel5eIerhrwdgvh{T(pEW z9(%=YW#=AuD!VwYVDgITmcs>R1MqP78lz5nw4>8IsJ<8Oqw~(SY7*>6p~L(DAlf8f z2eJ9-EJZVE{JzF?TeyuUe8_l!bZJ#Mr=A0I-ILZ)4Ch@& zeDrVYUWhn8{5~Z2?@L_s^jm>bzkr8~E1FxBJP&s*K~A(C5~P28CI(_95mT@$sYkuP z6*%Ld0leVUt0)$o)MySA24Z~MW*7cpegD-Wb8A2I#apYtU;IFAcU1*;1w~&E}>0&c~{FuTzlETLlatn_B@#h zePrhEy~f-PRHjtT7UKj(Pp2`oP?HU2Jb(B!>RvoxUtVk zrrCtN<*Y0e%B&!yQ$J^{x2ESP32GS|+)3L&&0f(2p3aPNg(48>4ZzDnb7YIujtO90 z$z(z?_EB~5Hv%VkjO${yQ`-U}fqH 
z<2qPu&IZ6Pqo59GyBCii5qVo)_7y&xPvxC)qz%@{x}6@a*Ry1eBHq`L>m2H8!H}9Jxf@WAoCKfk;<-{y5C}uDfs%pj$FO2s~1% zDLQH#kD|9j&mAu^?n^u-fI1Js%h4~)z}2oi=SZe%e-p7slw{`&Y1kp2g1A;+)FA%i zv13giHv?zv)Q;(ktBe)XV4lg6(q;zSrx5U|V~X25FUQCV&7V2I zhha<&=mVvKUPI{QADCGObWU4zjCZa4hVV5JABg7Di4ZSmndR84&odDmWx^!+8R6dj zx%A%M2R#o>T`Bt$Aj5@=Pw6bmBku_rik#8#NPWwCSaW&wMb=Bwi4mdJL|SH0PX6Ag zY&N~Ca#ATM#YZ+n@KZw_*0K5BZX>N*wkpIqV_U{^yvkgXD*v{fzhtTO+@_WW6l=zI7WlvVYLz0I8Wo zvb%wgAdu{45Iq~6>F;}zlW-n=Kc7*)yWRL`!5LJsY2A)$0fI_q`wu_c-bEw0mp>cK z_}z~3Pr_|WgLvOgf1%ql98lJiZ5$k)F)EJuLaQ_barWBhg!8Z=AHBJamlCparJf|++Yn*?v6L@iQ zW~(9C&V}ZAwcXK+${`EdiX}RyA7H4 zrS1oSdf|r*l@pG3>MOn}s~+0I>rH(yVRY*QwtG}Wm&?i-MMdKb^7&WbrgwTUoRq*% zc8LW>q06YozB%%MC<4bbq0Qh5K`(84D%!jACv+H}alCzwplkuR3s;BnqazM3$?OE< z!fwEqo+isXl)Ym14so#wA*rO%-LNyWvqOf17-amt{l|W9eIHHY5WTw08rzXyJlrfu zm$#x{rdfB|TyYD|O*Qzx*3|jGQN(xqo)VNOjXUgmlFqYwg5W7cY(l0(>fEgL)eOylN?VCNmbZ_dnQw9b#yTc|fV?8T5ZU%4cYL_oZ~P;kC}n zXJpaobc=z8WWFI0&b|gOq{8cpKOHWcp64G4M9+A9R{Mi+XIX1%9;OEauC#im5kUX~ zp!&Pl8`*7m$2kjVc@=(1&t2&S=9MlZx4{RRT-%H6fy3&&$}`KItOV>LG@+};EkGNw zy9$juV^EuU_ww}ICx=WnT` zo{j;tNRgyX^g>4TjO2jQuH|Y>y-xpImZEHaDRs2Bw`l|IZ#DP8Yh`Z_++VkTf1>S~ zMsx{1`r8Jn)oVx#CR0HRc~cRU$VL~((?@qb&Mf}gM6HEmBd#Heyb*6k

e;I0D1p zfs~*y=VQe_Ah+&;xFR_9f5zYG)PU=vq#5ICg$Fe17}+jJq4RdcD(c?i;4=;6-b39kPDAnLoQ;YDcUP`Y+)cKl zLZ(0^5!zou!be7YK6(1jxCEMG2KiNkE@`sF;ZcZjkV5@mC45@B*cEfb%_0gJ?f#CY zhP2$6-H=5oQR*$m$BFL>GkTRsL=_I$xpxw|U4v|3W0iGE=TFgfvS)e&u)STgaTC@8 z9wX63V98$^YOzyHI?>TAa`m;N+AFvUZyTIVLXOgh3nw%|$z!-B9^ugO-D+dzPxVgd z;oS7_!fH!x`b>uT_7qpE4c|`+qY)R_+e{CM*Ep-P3>etyywu3kQq_mdT(9KK@M{1}i3`e^Xz_DQGk`cFKIJ+$Mv>$YFhQXJG&Q^Zb^UF}g*l*uX((7YDl)<@9{?1k+A5_|tKr}!0+NHRp#~-bN zT9OJ#*KrX_##cA<;j+PEE}qy{*Vb7dH}d4CD-@o49i2 zggJ*f5bL*7V2Ib|407F7A<0Z=%hC%9-VJ|!lt-klc?0+AU0Kc){yl2ey}_L}m)2ImI{K7Jg|Vpgg=(O=Y?(=I*?TGh*mQQ-nCw ztSl;ncHTUS3u_^v-A()SPH~3A_CdjfUK$bmR$#XmfA(lM{Z?Zm-ecZ3x-&DToO<)G zh3jMPN7r_54t|5*t5xgDJMDkH6@jXYYH({JIltx8FS9=cvK9;#QdnBmwBD?xNUN4n z^4@a1T;w9UtkoR667@YK@8>R0ov`mpU8I*c0TOrcw|)+H#7}r9Up(DxU7r~kSOg?awBcP{k>mJ~j-ZvpR@oC0ZIq^2WLGbcof)!ey z{XjP*)aJhM>(fpXZs$eX{MUJo#e^BV4RM(1d6n>k_?OzRW;WJH6;+j(op}$$YJqBI z42+APn_p%+VTYTpq({#6;3LKIR5HFaer6y&2?Odam1G%-6mH*;u-^k?X4sAJ&we1% zICbs*FfJTyz@rr57^aG4)hKBdx-$@bW`j;amt5^vb5PbOvPm~*H zAoHi;v--}2Q6CnGRBK3kd9Q-pF2v`LGi~rGH75?kMQv#8Q^6L zAsL8Ll$d({yoEmlu9rO&n_SiHt_^={*&o8K4PH?GcTvJ<8nAdsSV26|<4lS5L@(4g zrJ>Bu2O4Wbp$!g)ynrplGY(Ja^X)eI!?6&+QPIABdg~v_2NWAVozJw3gV%2f22c$1 ze^6p$&o;Z*4zq#2N=1`(|6tT+b~*YQ3E{zcyuX&ZK3W>9GL)*dUchJ6t`k3p4<*k3 zk6Q;a7!s;hrj@PJ?0j>x5j|F-&V)|Pj#5=srBQ8!dUY_j!KIqjU-BQHQ~I6CFaYiT z&*ej!7l5LlJOD@V|Mk&drhf6BOZo<=-fjFvj1##_`J%z|K9aG?>fI&!6>7XR z46qEz3n{#JvD@l@6ShwjB~*5O?7G`~pS_PCHy<)^o>(@;auisdvgxzb9+>d=P`TAN z(TnUbHu?Lm!8KF+YAe+ES6qMb@C z<@8zdfdd8+X!`(E%%+erS@*@=8A&Zi4*2fVTx6HZ0L;KDgCxRv>a6S6Bg|N68D;?X z)g}Su>NesOjoAoBsod8*cWW0PE%rW&i7C(hDB!e&Q=HGcrHwR1TS9+Q+LCRp(4`=% z!~k%czlzmdW!fdt9d8YMfbs$B2$8bURW?wb@EB@jc^b&A+5l|TbjP+=Ect&%L||-l z5XHO;YQ*`rbkS%^_FclM_%;dXYT+!0NB^0Z4`4$f6$222ha8SE&liOE*o>T59tqav zj+Au(9VZU+AIY2bpJ|KG&Z}?qr{snT^zV{)q3yh+u>iHnU4Wu);&8te}*0z2<%=Ti#NOyar6nI<#OJKQ*2sZ&kl;e1bU!SM_rZjLs-Y6O^iG}nJd#KPjG5g33k z^1RdOD82_z*4Agl{ode+Nfj+}J+U!bp-UTR5jB9ycq#a8I{nSljfn}QcW>6Y=tpLB zKqY3(Ab+Z_w-Y7%1Z`QU8D2zs{icJ=^#t}z8S$CrQTNYw;ydd}HP)S{_j$w0P`(Dj z7`JCEcOUya#0 zd!k09R$C~HsZ)%L_6CG|%QIWqC=$i)0aeq9sS_DO?2z}fmr^t{--e!FE=Mngmnb+V z#i3r7gs6D_2*)#;>(M4TmFVuZ-mc)dLLuv$dVf0d7*U^LnA!a1mTIUMfP3pM=tEyZM}Gac#f- z&${DX`>vXbN1&w-;d@c#>?I>B=uG5lmOO-19W8sBH8dkWTrRFx-_|>114~B|Bc>(j z8`b@;x+9B>{<}?mFH1w)1Oo&i_OGb@KY6G3HZ-ISni2zdP`s_kTBsb6{Y(Xc#6ELF z-J6X({fgpkKo&wKVdX^6HzT}Mvnpw#K;K=-%?5gC3fIEc6+$kF9Do7Ja`z&Vs#3z}0pf8(2+xrjD%(dSsAWO+Qm>RFb?zJH9Vjqu$)ii^=&tR}{?OMM ziL8zNZ76BRdUyK$G-&*)(cq7t`+Sl^K7rf_P zYE`<)8p)a*08)<~U8CKyl`5SI*;o}Y6Eq`m`%OL)!8%q`b)RDue7wRF+3=BR z?(Ch$z(r?p>hQ4Ge?AS;-k(1Vf3ylh0T>&B950>jx?xsgok6>?3`Ge&>U>CiMmCbi zc9*jTW{Fv|0fXo#KW+ytp;*Sv!Lm1WbtD!oLZyW?HYu@)Mg}sIls zUf<#IxlBUVQh9t7>g2r>C09c~ZwnrsYq$CkiqXvl*xk?C7m*SBNwN&Y{Padr2ix`} z?6BUg>bO&=U_7mMZq6jGYJsVu2rKL`k@3|>u*1yN54KW>h*ZuW%L1oq%;Y{?ypjCP z6r{zHYDe=v80m|#$1Wjcwb{f*8~}iwYISe;tvM2*rBX&alvXa&+C|Q2L9v2%cIfPrY2sQWrKEiJyFHuU9-fVK^CGu*>s3Y8t@!8c7-1L@;r+RbWlTH zDTWneYp`K@Rqr109@wapp=enzrUYnf%q9qsSh6upbEcR%Ilk*J>D3>x#q;o1u!xiP z2C((xVsR}!qNK$sx{ug(xLVnn6M{AzX6odPdPJe{zMvH4pBoaQ?P-3~5N>n$wycFM zpJXGpRN{9@^XM&4X9*P#_3aHdwgiVOcqz$-{)DsVeVKvl2d)n-ZjWJ68qS&zkn*o{ zkrlvNb7-Z2Yu|q)Y_`9yep3x~n755d|7_plg`v1YNCCo&6sf`uuMFZQZJduNl&M#c zLtF`%BCb)_Jk^*cMdF{A*908O)rvtR?^iz_r!a$|;V0wzrq0$)!HZf7Oe%Y*Iktw>+yKQewLChq_pHRiu9 zNLO=CO7II(-g5)^HMvwu3a$88(;t6u4I)%Fx+JR%ua;5IAEBAfFOnd+c?p^+yRN*v zv$3-PPa=d-$_!oZUCM}jp|vZKhY+hQxA4UHmsa!8H<5>6wtfR&=-{mzHgba0eafki zE6_*T#kz`jf!{=gx4?SE7mM7LV5ZE!s}zymL#{PzGS*mctunFLO7>eL+H{=4pPV>% zMnO)Hr;>rk0yioW@aFlXpp2}|yLZvB=MY2B^|Eu?9hcP-?5gt1z0Txnk1*-zE^se0 
zhIs83o7lpZ^z1vp^Oggq#c>%yjF3;!^8Iw&WZ6Fq>bJSvn!&f{f! zbzAaV>|k%Bug{URQh*gSzmGHRMy_?3`(WHPHm&Lonq?nhNB8fMphm7K5D9 zru;(9@)a?W1F`E05C^$SyHXkU9ev%ChAhYm>NHFNs48xZc05u3;;;GtE%+q(xh40c zkeZY}l-SyaPp1Y9?UX5?)mBryhqRa~nS9{O(e<+gf3}auijZtVNIjEB_@Z7YHSce; z>*wJp04@>cAJIh5j4SOQ}9Z?Bx5P%9}Rj#I3APYI^JWp6#zU zEE`I9zMcmtw$k>>Ue%$QiDZK`SIZkn!Vv zK9d3|<`>F(6HLeaiU>&wiB(rdU0IUXFs(T9ArX~(8V1L`G%zLH&G=ci>gVcT#2WXn z$Y{c-!wz_IM1);qUb#TPar)^*38T+fteo6WClpnmC zFOwHS`)|_rkwJ*Y-BH%v9y~=zovdG3iq}$%_>~UkR{gx_8OCFSdeZbzGuVL znv0&IgUPINLK({ZP{eFSh$U`s19QDPIeO2Hj$p#YTfX%zgnVDmV98_JN^PDrf&B>ovGk62pjuytT|CY+gM{#dhmO3^y>p?RSLTG zN?R(-WBl?L><6Yd7ZHIK4HxNu z@#q&+Pg!x7jyXPKNwSK?hj@_<;iuUP{t!!DQsSP94I_#l8?TzQbnp+e0pa}31W=u4 z5pnZX`k6wa?M1QG>-tF_bqLFpkCA@M>tfi4P$-!@9gn{2@)cmrE*iDXwQ&?8TkQT# zA5!&8(P_z1q)mV9R~JLp5?py5;Yc?gdf-yWxcWywxPZ-rO0nGm!^N$(!G%QqD@2YL z?`Z8R(J@cKyg=KoU#m6x^Y%|i3eAVxzO94h{fn-Igf10i0dLZ!n?WS!r)A5EFvO_< zU^0%dNCm$3(N1^%3|X&*XE~;@R-XL`p{3;>thugw(as>Lpw!K#R$f(6ls6G3^ZT>` z?#cSVmKGS`^U@`WeI}36r1uvGzSI+(yqDuEPs?kT%S(7LN#tDfgwsfGLG#}o-N|Jf zg8&_ub`JGZF7UU$^Q&nj8wy_P<;Kbb@pBLhs?Z!WU9z6 z66y=B%>7ec)Kw++Uk^Xny*1z6fLSY5q_x;Kx?3~Pekzbya`nvvsbu)W>duqIS~&w) zjB8SVf1EAsF4XhR@%*v1=jE?)=rEUJ8arC{fxg7gIlg|beEXvIT1ll<5dotxKO%rl z@5$9pbuwnJYQ&?`TV_w2Pbyc_&71V`E5p*@JH~tgjq{dI%g4dqG@2?tWm6sM0|_9Rp<_h8lz1nh?Z%PpYkCz89SG*OM5HTFDJOvJE+lK)Qh z&y7(5csc(UmD&Aw)_-20|93yi`l?9|pt{QnN%&?fC=`aCH=$uR*QsD`~!2A6wN zH@DlF03wXg^LokUW<7%GWX?1*8wfRVJMP4&#Q!&1m3o_{(Hc|5i`@y3p7-thV00q? z(~Vw3Ac{0wYmPgdArKKkz(n`Ewd8zzg#I}{X=V@>{)YbKpk}1=TUEp6#*X62%AikJ zTF=V!Y6S=|glZ?kd<0p7k%mc6u)k=}I3h@Pr|Oim4c?%Jc#%!Kf+4_^g1dzSHbo(7+iwZa z0vuh(yM@mBs0f*zIeVDqe2M;gRGBT%GOYL@$%IrGUXE#Zc8I+a1QfW~_WvW&bs*bj zdL}R(eh!W|wOQ*6vz+~XjW=yOACZ5$KiWLPHORmD_ZCi}*XG%|D!OR}NT8|K|JNS` z`^hq^OiTah#Q=zynGy^LoB@9+JAe`SKQ>kW=fM93IR3wV$W1Ub$(z9Dn&aR3dQy0H zo)5WOo?*S6ta>cAr}*M{jA5u#7NbDKx$Ws-;di}!{Oii9Y5o&1SMI+?h512%5+PsO%DmyFE# zclfIwFK*-?wjq&8k+7XpC~3cEkOexg%V7Z3ZadokCf=PV%d6Y-5QF*BH=PbX0ISYI zzvS^QmD__wOdU@DnN6kCRR$v8Oj8=KI$JG((JjXxqS)khRKHyq^zYneLkK<-TV+TO z=WvmzR79WAjT*w;w1>ruBt=1@U}BCH zydT?mCPR9R0$dUAF_F6?ai>q;^IhrU{6yrbOpNotNYcaXsl?v!8m`Usi5*1I!}JAx zR!&drjGU8G&Js?(cxWRRRM6;6I#GN5PX?aL0pjmC8TokvSstgHKVrJj5q5`mthl=|F?CDc`V>Hv-4t@x zJU%@>J=CT(+7aw?9=-jiJMXbUTGksDvR!t_S~`;ZQ6b2Hq`sW~mTrW$=o+StCBF3} zW(#&b_a6I}pZ9bY(SBFhp5{U|S^6fddW(+g@N;mbf`VW4)T*C2Nf9}6k4=z_DqLn} zYvjpc`5SDd!Ob$L{eGwQQ$3kizK3R&!?{=^!M{uZ@mO*|2==CH5;FOp>wZgHKo$JD z<#KCnCB#sF4c26JOQ*v2EFpy4fX~P$V@T?7=PT~?AfAOG{aG8!b)upDG zdy0s|^FY={%KA17IJA9eL@&ATX~75f9aDaDdvF)-8(-_5YUC0hEc+uIZ?bd7 z8RmsHbivEFjC z71*oRBm;GMoFT01S9SL z(f_k)Oeo}rFX-tPZ37nDOnR{G;TlqknW}1GhJ@u9@_!bG%@E~0wg&BZ4r3ixV9?DO zVrXfdcc>Hz3X+pY)3U8&FO;WU2=wduPku_TlN(&4VCw06TkO} zsA1SS9|;CMu?^Cs89e&{j4j3UcA~;W-a?HOdmr1M0h6&66o8CA!DfdVgAQisUf-G2yQNMX%qv!{ z+Aj1A`d$!KTi2=iTUHxzdwo$=gn0nW;fZCBSJS)`m3kO=_r1BYWF}xrWc{+HL%0LG z$2&}2au1^VV~gu+{QAmDFDLd?Z{(S;kmt=DFwFulnK_sLK-RyjvEJK4=a zr!I|OILY9-d6S!(ARHgYb+|B%-h2MwrcM}Pq}{r)b6t%TYj`}r@0REhdaQ>QtyxO5 z{h!P}D&>d&C4qj~P+zEj*_`$Ap93C=g#+WBj<_#Z3N`^b z_4PPbgrAAr_L%+{WF`m5=MC@po0c80SvCk;VKuPOq=DWhB2#=E^42l=?rWB+ZZPsb zUFVu;=$ywz^9kLl>{U3==(Or-29LDFvn&UTei!1RTAhyGC~-716l9kd20>e`p0m)pe)g8<=Iv3tIUVOtMjsM052DaqAE z(t~*)q~^N9Z*`O3J2j(!7XqzUBgca%EZ$?S4IqT=x6%0VyD$si9bAfZsTLgckX29?IS)?jStn*026d zL$KN(XHXmNEx*b|h&uD{XlIHIbOzjMwodfUlRFwfW)4!v0Dcw?zx2B6qw}NmrxxVE zar@gDF_eYS_I`dqn}O|x`$wp2%t{=EM1@Paa}L7#Su)L#S;5^0i0kf>S4_9)`5ee= z7qC}zg#5VQcWnT@x*S~JUI_2?R@wXck>5CN%MTS@St5$k>Vh*h{;b$-Y@kAxQVX)p zs)R0I(r1(0#MKu)E)RbMV@^}2xUJ}lRzx2PnJm#oKSVL?_;wBw@-J|XQTO=bu56eG zp7{&_eLm1wJr4=}1FgS|@{raQKu#2!^1;fcSuv%L3v>z~Ur9l9vqG@-K5CZ%wCdLJ 
zN)nIjvUpw8sQ&Jv?8w>1wV)o~_{6M1X~k~p?%y*cU*W>U+8sy5oiP`spdLArv2tI* z**1YOL2nqTbHL~)VifafLOXs=WI8GQby7j*3Fo0Xc985fOvc1+IHDg3Cq zGgRh$xpDbMKq;o4ZZ1O^SDwdq+R6;Qvv)|TueV2dYjfn7s^aSBUr4e#b4oyhwY}M$ z3QaRPeK;3cm`#~(LN7&S`@ltiIv~b|ub#XVKQm2j=Yog!f2uyH89z%>6H9sO4RY${ z?}*D_>bu83if}84MChIHwK7#9HCfYsIC$LPOx0@&7fzWrGslYtIQ_KXtdKRv zuj^ZY0aod8Xc>LB25m7?_U90ckESfMe`S?^pz1=iyo0kWHszIaBZ?mI9B5IoEd@jwd9}d0^UW@j zuWtJSYvtK_FjOWIl_>omodY+9s-(USe>-^%a5c)am_fa2IkD`R3;vX?&cr>0|9NPi z2}Q;Lp7aXVnkO3of$k=VpP+3q%h8~b)_eCm^&E?+g*b5&J|%XIoYD$FECZ|_qyA~% z`-%|eCbAfC0ZP0vU`k=@l-OgSs@6*aNsiKbVEirq&H{f-kQ!Qs&1sm0F327wf_0di zLmVK?4D!^XdQMZ&!%d1f`3^vB7va&;JeY!7KQA>=%{TVfu&-=$jnbvXXEU&Kg_+(p zVAMGqQegg14;%0MAVxYceJ~uOsL1mMaAio&8XgWS%6iyv4Nn1uoKf^d@^HKQMz>`~ z9_?jy4CgX`JF{Szcg64s9J+8}*|UD)7a$_n-$XE}+mgf3KE3*b)_;1?kovlq3wh7E z23Z87F9X?_2r%6`JYg9m;?-~uCcPVhr8&7Q)&4TNv{v>r(UK$EG$s5^Sr8=WB_$rQ|Trxt3+J$`38=J?=Q12kSv@sg&`sZR5kpV>I{MQE30@s-khOcHrUkcFEk$Vo&h7wFShB8^uInhPIL=A5n z<&wE6UXQ?4_cTOLt;%d!+--+P5?99|2Md1Q#|fQ=;{r8YH`J6TwFc~MmCr%~*vcvd z3iR^KOTD(MC*N9zqOr%(4Z+V5LtRu(+Ze^|gdXDeELJ_jNxj0|x9n1Uny7uwkRcdg zcCmt@C^sS=@kX&;{=!cN8$=K$XUg*1i-SK~SFX)Y9B+NYj(ui|QW-0c?G%dE$d`3G zf{AWy!V@|ocSgQGe&flAb~GR;hOms2ol!52uEuK~TSLwi{@R7krgPMwq zZ}-{9;G^Mr_8tM;R2f5j(Vy@8ZD;nSvLhW7!eXwH-?5Vcek@u=xQ1Ds=%UYiD73RE zR6%UC)i^PS02LorNQD(w@siT5WMJWqm5E$WxOzw?E?KWPX@+hDcf;2fN4!V+?caLp z5v~V6UQRN~f2T8R|KJBau-63aY9@q}q+V}!gB?^$N&Ip6%@Rn1F8S?rWksQY`uyp@ z8t@MlVrhpe2dWX%up`p}3qj$xmPC$nH(hIYUkq0T05P*^2C76Ko6~5XwK}`o_v`o) z<1{)r!>RGRnTMYh`;tqQgaQQLhJDuCqHenS0dHFgq6JHHvsPkavKp$*lqW{R1 zxFt+B1`5;kWLv%%LK|o4?}o%j`FJUVplmP7ZN!s8djUq}8xt%oA&mWVJ*wSmmzBl$ zk?<_xS01Rf>vKQL#>&M=CTYGsw@Kq21>NY?uq{o1QvyPUW!Y6$;sA9uz-pQL@!ltd z(yF5vDbil-q=+164E<;70mk`?9xh*(uat?!ynCtCkN9Qzkv1aLw}B|H`I}LGsFa)$7t;u_7iJ z&klU5rfJG@e6r{oq3=yH6`HO1>;E&ioE!Ob*gf}JUbEOh7C&;IMeAKDh5x;}QD4VyA`9ln$s#`+nM>M?rFGvw6D`8NY6rRpeRy{_Ih zmuHqgpPIL52{Y{|qN;0S8b~K;#bmZ~dUF@O8j5JodW+U@Wr`g(CYAnba{oHr(Z$Tb zDY+OY$ru;VBAK*0$P<_U8VMsQ^=}yvXRObSh z=^ZU&ell!aX&&RZ_L3auat)th0^nli_1nyWaD8L+@KX;}mSJvgZxULS4;=%u4%~dG zQE7mgC3A8d59y7mL3_EWd*0qP8WwO(4Lo0E!i$l=g*n`Ke!s6Ghhx7=Qk6R?{{kIm zrK*%%IPjuHEapd}*iKFrJTKKBuvQ`wV=pjru@cb1t<8UWNuIAi3>d;Hp)aWuA-^jp z`-A?2bJWOGErbfN33lep#J3PCwC{mkM$F_h_(GP4vX^E^0AVX&4)EncL!De{CH*SZ zFiDaE2)SS*R`!hc!)m0?U_a`3kD1QA61z${Wu zIp5kZ1vS873}v=}RSRHrZiGk5NwyU;^i85xH3_#vF7th`UP4!^8MXn>zMfUM-aoTZ zS7P#^IBjaH($vry_L*)UAh}LJy0Nyt3^@T_kx(U@QChL|gH=L}yr5Y2?ehwr|Ey*o zSq?sH)4YotTcbU!smz(P6484U>kZ_&al&+$G~3r0bMOQ90H}XHk7kVgCsWTNlRRjO z6q7{@`ZK=lDodhNcF4RU=XI&u4QSkDa>C8`HTX6N)To=U0h=^oU>S$Jl|R-Du!^ie zKOoH^35JC%HXR-`3>Y8r$|eC<$PIO9*5-fis}`eh zSNvyL9{K^OH41Y?JYhX+I$j4nsYJ=2c|%=dBWskEwm5L1>>Tqrbj z{AQ}knRVGDL6iacgC2RCjX65cmf^X^l0+p{(&?{3roY{s zP2VIRQZzdU&6aB&h>6L_@QL%1{WYG}9m zZ+IKZ=h6E(<&*`!VFM-ChBi(XfIv?ME5TXkEbdj-YB^MO28>(H#0RyDU2iUx3cv&| zxikRYkZFKftsVzB<9Ob!L8$iO+bQRM8R`a9(PixB{T#l>w?KWyay4Ww)00C?DqF@D zsZg~r@EG?!&RHg!TTc&Nt!m|{GXVpyXQs(k?5Yw|jr|M{Z=2R-u_!aJ6XzkPtR`TA z2UIJc=A(6!F(ncm^6uWzML!P<#~EPZWjkq&*ly5!ev=pz$CBpDM$LzvH;KgmM|x5R zMmrnn;FFc{zbUk<7UYEI&v)OcQUuP|LqMXoUD+@2U zrRA6d_9s?n2cVZv#5r~*8@XTTD@s>vu~lO)tG@Q7Edd2c7F*&!qI>5F?N9(Ek#4C= z%CMA>a40PXq#sfI!@dt8>qDj2B}JzZs}Y6o@h7XClhUOrRLx5(t|KF7ge0H*eWkIgOG$TPOvhochhA4n4bk+7OzY($HaQE@kB(sY1@AXiqa)GrY+v zz^AzNV)gywKF;wFGbPft%RL->ZADwwDr=EKUDVf6EnJFyZU??!=dcg;-PT>)$OgnN zdJZd;oR(sgMK93DGX$A(0vA@W>+5DDuPrX}hss1s!YW8y>-MF1M1Wgx-lvW3DX&C| zUFVv04jzhiR*p-`N4<4F!)LvEQ}A842ym0U6#LYr@o`v{R(?M*Wx`nZ!G(T8o3n{H zZF>i_Yi1BZT{i=jmTGm3(Ds=dVk|>?5HL{<67Ko6l$$yFJs*rbShe6py#c8x#$T=zTpTNWKE{NVe1$7|(OP*ie|5P?zz%A8$93)WUP3r1 
zDc!bMEP4=_izM*zrcVZ@GC^U;B*K0O}KOAA4Fb&^?@#f?(;tn!OS<5vXHfLqA325ic;3U;Y=gb zh&MPi8GoAn^KiPBr_5K2e=!!SSsJAK=IWc^RRlVXHK7K7jK1BT!xvB~;v-H<& z0kXpP#BsQYLF&l)A)fa*hTH0I8*Nq-f9LXrC}L+iulFTVtikv!$=H0s#vo*6C0J3_ z4hEV0;d?7WAj9-GID-=oHB735!&*WC#*b$ejkIa(fCbUBf~k+-J{I%EcDdNPnltIo z;Lm5z&?eTxCa-#x_9p+t6z(VV?;iATE*oxh8e*n?Z&XqN0}o8tUOu;f{qQSUr)E7_ zjzF*-cI9yl{XWYqB_}AFNTf+~&E{B8qXsroGO4CSnOTd3&AQ(cEXV;utJ$UDDkm?z z6J>Nu>8JGt@gIF7V-qL8u#LRqr(DShy8hdJS}rRofr4WppELqg?FXw5(IB&Yl-}LA zDb5&A9A6Y9s{9a7*{i$v6>cC&OMP~ae`DXDT>P|2b7&J#ULB(133;^EuBwRYUXXRb zQC6xTF|n^ii0IYQKMKO}c4#ux<^99lxzGJ;Y`7lQ#wva8MBqDFoz@38|Eh=UdOI*< zNCp}AF0&#BW>SjaHz8f`_z_Lt3Tm-$2rTI^I9N@nzEXa2jry6`O2Q1c%jI-Q5;!ncM~*S)i1~ z0Nm746aLAO2o80b-H#T!Wond;KGs~#0aeQj!Pm|hljBg9GAp$07Y%+)aOukJcP}>m zB)Dfo7yXd&V~CwB-FQqmsj_`WBtIPL(Yw;%ogtY?c%)V?5aTB*=kEI+gOG5rRS2 zp}O!)L6|TZ6@AUaT3>S0MTj_HEmG9Rh45(EkTU<7c9C3VeYM8C;hRK&ehJTnSUysz z>dJjr-{r~nY8Co{kaPD)|JwbkipIg&ym5?ZO!YkJN$6*hWO>MvAofJ$j#OuzZ|xl~ za&)g($;Ec&%{8G}^ZSD!Uek95YiRH9QOm+UU3LUaAFp@Jq(mGYXw9~9i(g4^WG@Ce z$$kBS?Z4)B%iuS$;q@^fz?kbe3vbjZ33hSf3KQJt$aNl zkSiEYyhQGHK2wqr|05(Ogxw<0n=;lNnkOfE_;xzIp{2~whv-?yvwW=!hn?;9xuJE@ z+mJ8^X`?T{w>o9QV)du}w*2;CuqS{|Cw$&k7VM}L;@%GjTF3ZxXM;}56;WmttNS)U z08X`hcO>M)yod7;GgQETeQM+|4F(2nKgY)uT-RS*k9WJK_Koe%X;sLPNW{r&L7X#Q z;|nA<*>VC-OJ+O+#(o>wd4o<6-Js_bbtFybJIR-iQ79o>$<_Ky!8-yw>QjQOQv(|s z*zZSQR>K6BG1P9#O{v~|?ttOeNfZ_p(Bi0rPlvI8Pr4^+isDt_AVS@`+jqaJ@Hm!g9XobeZ=s0D}b7w$;6XKh4Ivs*7ppU?Zep4o#RQ= zv}^|~%b5N8^q`cqlY>?iGjLYqIFvEcdKye;O*}-FD22ffj}oPyV+F#iP}L93%3=xC zdv6mG=AzWNn_b9y7@wyX&f9DjGDAmEbP!I}RyCw?V$^d|hPS%qsWm!?bou@c$wK|% z%F2Z$@2cRcH6SWeV?wO{o9DlgMEMB~Wn4K7#-3w2#pgixB5RH=q~hE0fGq#K?#GJ< zN?D%Uyff!Il?S_~upVB*VW^)N@HPuHV*K~cYCGmrrJ`e!`UO;6@7l5|^O#d8wE~w? zOUDFc{;i%o{lR)G;)>y!=ov)>OlVy-+iyPx`+m{sRQ0Gt%td6yOKgm|y+RCAJ5T0> zAwy#M^PNWfLh_#WfIt96X^v9I$qr0BDMU1sHu-dCtE|A0{Esc?Bw=0cOwtDZv;<%( za<MvLRlt;6rznE!v zh$8lc?U&Kd&J&vL#2M#THwwMnxKqEE_r8`W`88S?MrRPRpvqS0UOHMGZp#Qpd?RVY zvgr{vg!4h?0M!>kMa(6Luy}J<$M#b9>7Xf2^b9cJ{kgzwI71IfwVmMy^vRqs8;*Wq zuMQS(MWdY_9go&t9!G>;l72jBpMNTy%|mWW>BOtwtg68deamBe%s_0yHNPM787CM{ z^t>NO*9(eik>LbfPjNmhFOW^wDE9TW&|G_E)s89cZIELmS$QdHyI@)M`ucX~tZrHV zVr8M~GkBd<8Bc6H8+1EyOykpP8+J4lII6u#YV2%sxD||9C>tsVY*Bi+X$H1S$b7Zm znFdsPGy3UdL`pE12tOcsZWc>jecu2KDVDB%_$=^K&N`!x0w3B5O~APp%`XL=uG?EG zjG5JnVp#DGvpjpE9mF*KCuV!1H6(BL+ZS_bFk8m|VO$1*`Z15cKt&QXFtO_a8s&dV z-g*GJW%4{SXY}asAmoI4;D_G-i@(4o^DR0b*#l#X<0DB{=LcA#)R#B;{c*_CtrEGG z>=K6Lp3%bDEnrN(c=n1$<8Nx8DTxjRQG88BV6R$(Jq4=hLMjE`+8_#Gt^UU00H4SI zS7-k{5Fo<;Dm|2b<9Gg-GC$s1y#K2b@&C^Q#G_JI8qlod%xL-FYmTob&@*0vo>ra+ zB+cLOqy0TnzTEuAlM|46-!@(aTt;n_ zcdz*|3hJ_8(I%hp!CCD3TBp`JZf6m=^y*jySZ|ME5#qyV1j+Fzc-MH+#=v+BXG&8o znh^d?tK%TBMzEw?L2PC(TY0Z|l-K2u#F)qsC>9f}NoLeWf`JDQc67bE(%!%wa-8gf{rgXAXXNqQxs9<*WOijTo3oz-t z0VY#8EM%*4&p;M;%okOJN5&fPQHG`Paz?+OfBd~~@I1#umAHW>{Q0`fMUhxpYL_nr zr?O2BK5QM?Kem9sQ_{MkVb5r!j&cMqiB<(eA*1~am*3?tV>XNb3-mWW^tll_m!oMy z&&!>QqpKOG=Rtj&rB0f=aVY<&efxr!jgGiKANKbvLAhXJa)3PAxa~eGHs^6{sovb* zqm~m$0X*(m3wS{020zGsZv>m~qe2ygzfF!z1e6Mv{HnQ-;dkD!4~@?16P9RsV|+Z^ zq0#IFm7A>B6E13}4u1b^`}ch!S8Cbqg|DO<55h5S0VW>oIM+zc;{?5szx@4hV4LaA zwDZp{WJ-GQBEKV(#Wnw|IZigCLtjvq$+2|Pr&>Q+sKLP=IZ~x8?PI&DrXg`c??W+Xvc85ay z{uIN5H%NHSb3@{%P0o1B^@PvQq4m~=FTP50X+1iNl9RZ1`-|XI^HMTSeK>anq0Nwn z1DcyXt(Mioey9#UqU1aXw=XX;S~+U*sPoQRkfz=<0PqpxX6@YFGn3?idy3!)!j=cp zR!>}%l+1m1d|6nOkd~OBK(C&-MTi6O;6}Gm#%MF;MU6w72zojUw!f13j$j4d*R@cQ zwy=w*x9+c?3>d9jvTkl8c|<=8_*N+Mwa*J8CFe}Uq2Dx78Pc_5*JJD2A>1JAWM#sT z?{AGl(D~5VLE>(C{zHo4>{28s01h_TM{(VCTRR0KIdqP*)RAJu3L5KVW_l)My1EwA zm+R>fs=sg}^9+@q6@M1)d3)<&%O9^|Pl#|1X(IdMLbi>2a|W#H!*6b>3!M4;9Q~^^ 
zjY-H)M;sr@ycSoNt@1Kg_C!-N!7iE$;>UD7q@~9(B~5lxS$Zo6MQVC(3eStUJyRtd z;sHFpR1SL7j&?Vguc5((M$>zr3-~wLn)e6jYD#vqI{}Ax7Y%8DMxn$;(DS%5-_lH5 zn3Z0tFbD{~`8p+6C|wuT00T(jPS#jHbCp7^h%DqsLT&jKUU?2PJwh*HOhP?7*HiA3 zpSUgYjGM3L8sefNUh#T?b6aCdoulxmH+@%%X5xSZO`|>3IZcgEdXveF zAVH?W@rI2;MB;Ln-+%+R%?z^Tu6fMiKw=}e9WM_LR*Y2>FN2oKS{mv)Cxb)6cLlO% zI=-nKp$DP)A5S|Wdm}nN_VJsdZ(p(39R+5MP0#Y{H9^;HONa=+l->>)`q)Zw#zTZ%WLFwTBav!<+L^UkZ(N9x-OTOFtLViOGXfGC!%-}i8}w_L zJ|+62bE4uDdb+<%OL#wQwo8<$@3X_WBUX2AjVT*kelniGcY@=T095)&&KcruEGfjC z#2Uib@CV*YrF+^Q!kHtlJ1cX=-#02Aefp7Tg+rfdGy9aom3#l3D?EVJdJ|@^d*vjR z@U4jDUG{t^?j>Hkuz6d!X>>2by;tyC@ddp!(}OHuqD3c{W4zjOMk#r`=KP&+Vw5Zd zG%zs&vEiO&ot5OGL((c-1@Vk~fk=To<7wiMxN)(Br7%P-g@&^(X7H%V2L<~%ImuQ& z1ZpJ3f&38zg@bPWULfDQIK9ja_FU4;hV7=ZKkDnh5qqz$N{8R|ArSI3aH?S^=koi+ z8>Em9uU6K1p3hJQ@9&y@lHE}385iIs*iHiv&>KXfv2lFR#V+VcOSYfxXt{=k_+u6=0pHHuN0Xwm(pGW?|;d78NR|h;TecjU}wLWxDol!vcd&~uSXEl3a>Tb zwN49<`3Uh|TeD`796hgw+UvrSevfS zaiJ>QTFab`KlLC|b=eV{INR~rso$e8ilqAOIiStLJ;#&f6F3@lg2W!Zh%L{^x{@y7 z^q4GK4`S1`bJU~ByD7&#U2EY?ch$vD#ot@B&1GqcgUJK8Hy=!orf2p~crPsu>!&a1 z!1KH3%F^_aKQX#pQyte`ZK%`!+_AO2Cf1Gpw}M_M9iKpfePDUv6Wz~O5)h7-FL|kH7RDb-LMYPRb$*Zeaa5TF-oIM!c0N5` zT_pZ-$5ZDx*J<7b*k81d2O-x-nzZ?3i6D$SuXH|TmiY=TXb@A(7n%1zF&6FGk}5oz zB!Uua#&&uBTK1S^$LIylN4Wx0cJ4nZD+G?w2u(q(fpQE$fS`}_x<2H}cG+Rj>uNWRiqg&755(vq4)+d&?BRd9=EpAbx2`5{T-L*!(k4H(NNSPMVKT8BEK2k6ud|lOP zb}l|UCW-HKpa-YGsNk-q2T@B38Ce76@CI)d&c}^)xwOf@P4qK@URh^IyUQGQ3BC{k z1omBKs6iqs6lL0q=LPI{ukcnL|sF@l2 zvY8SV;H7KuE@LO8$l%D(@$dg}vQb*USt zJv`?>oepFwS%@*O04|#6Az~t(eM2LOw%oa51q zQX8$BuK~VDVw_!bqaMUPg8#%Mx(`7sM=)kjdvB<(Yi5cx_QXp`igB3O(zJy>%y>NQ4lDY$HZ?Mpt7^lb!5q$7SaW^dCV5CB!BFn^A&~O)90kC&Y;Jk zIQfY`_v%g7-g-IIY1ds+#7Bs8c~9sp2y-_=t|Jw5eY=&^Pd9!o3JvtlBE;83Rc||6 z18?5Zef8G+Oc61sIxztD#Lc`S_$}v`nSkXm<9|!cT$F21*;?DN=T3sO_7nx?I?qbS zyNw0K=-e5?NxC7-$K_%=kk_+krUf@Y!W`q7pnWi0A1`3?G<#0}AMn2>7d!oYHb3j130{m>RP5W6R_)2Jby z+#fqOJKXCnmSl(s7o(yyZ;2ViKk}{dT!_l&$gsYtD90MeDXz$Z8S9Qm8RggHc}pM= zy~j<3yW~kpUODUsMkgVN<7MaS7HsD!hq{#RLke0{M5EW+JuOrD0?YafeY7Pn!f-n7 zv>fw8SEo}9fMpsf>;}VIkbe|C4&Hn1Gh&T8R3M8 zs9Zm5lBI&dktmnt5g}scjyXVBs|vKL;VoSA=c^y>N%<&A;ONZ3oz(Qz#ITBaj!FP4 z%=0cY<8*k0qEy!TQtOZ;sG0l(`c5k`J_47n1AKlaJ$z8>IG8<)&|G$w(8A{*`dJfu zC8O`=KX}6_E7(svK&7$o?DLJ`N=cr@HZ;46z!Su226%vk zFx>Jw%u|94%GkCzkvD(XcY!f`#aUQdbX>i4*9RC7hD_kkZ9x{^_XD{47~DMvm^&=% zW~C@mqWg$yC|tIDSAmdt`4Q}iirN_5_J@Tr)jd8$j+o!~-73o~FHlg_i{bnRC(4zL zn)*~JUyv&Pk|5rC0^FPX^8560O%Oe9do4%5ohlm9+?&)~B{mNVu0u{2s?)mX^91WY ztyzT$xO^=2U`V4qO@z@6!D`Whzvgkz(OZSWqs>c~Fe6SNrRd^^4&UY#)xAo)9Sjkh zu15a*{SyB`pWV1a%p@MuO{sMs@&E_jJ-2w-wAt+YKA#l{W$+1e!g)Ec&?P&YV}h0S zCnHK63j>BgQ2pfNUxoC7o8k&88QX)Ndm6fz`7cq23+EM607A=nrbl5(q{k{S?9C2M8Gf_|~h(mx=X^&oDO`=a3#n#F@z+Bs%p1w86hd z%4@{m+TKV1OOy;B#fy??6k;9~Hmrq}r1e+j)YqM&Y2Kcc@5)aMusM#!k~ese6bv*q zMctI#2-7bP;l4a2kGe~O!`G6JHNfGL+l?6Lx-FpyD6*BV_ z9ZzxDJlPr){=usj1Qdf&vsWU4=|lk_x>1qC`hObfbZlk2l597FZ6M2lX6+p2O4sAS z7^Ugw*ic>63MG`4y76`ji_b3NH8u~}W8zRBaPaLgrU zH#2ojAQz}RO;us|{S70;h)v!26O-q$1`DHI70tKHC3g_CFS|0HM)sLNLVC^2RH&Ta zqAaPJ_PM*;ybFm*{VUP!C;#F2UKT>Jxp)u^@$MobtKXpZs0@X3!L*Qyc1mEm%*8NW zWt~Tx8g;wJ@!~5EJLIxyO!Y0(n6sYIx`8(OU<@K6vY!%_K-HIuB>RWn)4iQomESyq z_9u?eQZ&~njVQ9(SJ@0FZ-EH=_nK?JR>h>6Rti7nD(3{@;Vs3Dl-*_9m?6zX4|vO$ zse_BX1}@SojE`&A7PqJU!^R(g7sblYYW#}nS2k32+3lEkt2(o*Gdt7Srj+>aCdRt;=nX82 z!`$UX3FuS{X+ZCalkFeU2T(>tOm(tjG3KO87II2KE|Z&f;U3Rk zP#aHwg16WNwe47v+I7#7-}8A0 zXYS`@Q?CO@_=mF^^Z?MW3> C1Uhm6 literal 0 HcmV?d00001 From af0dc665b966c1f7c3fb8394ccf3c5b617c03bcd Mon Sep 17 00:00:00 2001 From: mbarak Date: Fri, 6 Sep 2024 11:05:54 +0200 Subject: [PATCH 2/3] Rever mamba --- content/posts/from-mamba-to-mamba2.md | 278 
 1 file changed, 142 insertions(+), 136 deletions(-)
diff --git a/content/posts/from-mamba-to-mamba2.md b/content/posts/from-mamba-to-mamba2.md
index 26136b0..af7c829 100644
--- a/content/posts/from-mamba-to-mamba2.md
+++ b/content/posts/from-mamba-to-mamba2.md
@@ -11,8 +11,14 @@ externalLink = "" series = ["Awesome State Space Models"] +++
-## Abstract
-This is not my first gig where I write about State Space Models. I already mentioned them [here](o Transformers, we may never find anything better.
+# Abstract
+This is not my first gig where I write about State Space Models. I already mentioned them [here]({{< relref "posts/hungry-hungry-hippos.md" >}}) and [here]({{< relref "posts/butterflies-monarchs-hyenas-and-lightning-fast-bert.md" >}}). Now what is the deal with this Mamba(2) thing? They are proving to be an alternative to the strong Transformer++ architecture (Transformer++ models like LLaMa are based on Rotary Embedding, SwiGLU, MLP, RMSNorm, without linear bias, sometimes with grouped query attention and/or sliding window attention). Hold on, if these Transformer++ models work so well, why do we need alternatives? There are multiple reasons:
+
+1. **Performance**: Self-attention with a causal mask has a quadratic bottleneck, and as the sequence length becomes longer, this becomes a problem. Resolving this issue is a field of active research. One possible solution is to use Linear Attention, which we will cover since it is one of the basics Mamba-2 builds upon. Another possibility is to use Sliding Window Attention, which constrains the context for the next token generation to the past N tokens, where N is the window size. This alleviates the memory requirements, though it makes the model less capable. Technically speaking, State Space Models scale linearly in terms of sequence length (quadratically with the state size, but in general, this is fixed).
+
+2. **State**: Attention is stateless; there is no hidden state that is sequentially updated. This is both a good and a bad thing. It is good because if the model needs to look something up, it will take into account everything it has seen before. This is super important as it enables in-context learning. It is bad because it has to keep track of everything it has seen before. With state space models, we have a hidden state that is updated every time we have a new input. Because of this, we can view the hidden state as a compressed representation of everything it has observed before. Again, this is both good and bad. It is good because this compressed representation is smaller than the whole sequence, making it more efficient. It is bad because the hidden state has to be large enough to store everything that is important and at the same time remain relatively small to be efficient, AND (it is capitalized for a reason!) the mechanism that updates the state has to do it in a meaningful way (this is something we are going to explore in more detail).
+
+3. **Alternatives**: I get it, this is subjective, but if we only do research into Transformers, we may never find anything better.
 Before we dive into the details of Mamba-1 and Mamba-2, let me give you a brief summary:
@@ -20,87 +26,87 @@ Before we dive into the details of Mamba-1 and Mamba-2, let me give you a brief
 **Mamba-2**: Mamba-2 is generally just a simplification of Mamba, with stronger constraints on the structure of the hidden space update matrix and moving some projections to the beginning of the layer.
This enables the usage of common scaling strategies used in transformers, like tensor and sequence parallelism, and the ability to split input sequences across multiple GPUs. Also, the authors build a solid theoretical foundation behind SSMs and Semi-Separable Matrices and prove they have a primal-dual relationship with Linear Attention.
-## Mamba
-### Structured State Space Models (S4)
+# Mamba
+## Structured State Space Models (S4)
-Structured State Space model is defined as a one-dimensional function of sequences {% katex inline %} x(t) \in R \rightarrow y(t) \in R {% endkatex %} mapped trough a hidden state {% katex inline %} h(t) \in R^N {% endkatex %}.
+Structured State Space model is defined as a one-dimensional function of sequences $x(t) \in R \rightarrow y(t) \in R$ mapped through a hidden state $h(t) \in R^N$.
-The actual model consist of four parameters {% katex inline %} \Delta, A, B, C {% endkatex %} and we can express it as:
+The actual model consists of four parameters $\Delta, A, B, C$ and we can express it as:
-{% katex %} h(t) = Ah(t) + Bx(t) {% endkatex %}
-{% katex %} y(t) = Ch(t) {% endkatex %}
+$$h(t) = Ah(t) + Bx(t) $$
+$$y(t) = Ch(t)$$
-- {% katex inline %} A \in R^{N \times N} {% endkatex %} is contained to be diagonal
-- {% katex inline %} B \in R^{N \times 1} {% endkatex %}
-- {% katex inline %} C \in R^{1 \times N} {% endkatex %}
+- $A \in R^{N \times N}$ is constrained to be diagonal
+- $B \in R^{N \times 1}$
+- $C \in R^{1 \times N}$
-Because of the constraints, we can represent all matrices with N numbers. To generalize to a multi-dimensional input, we apply the SSM independently to each channel, making the total memory requirements {% katex inline %} O(BLDN) {% endkatex %}.
+Because of the constraints, we can represent all matrices with N numbers. To generalize to a multi-dimensional input, we apply the SSM independently to each channel, making the total memory requirements $O(BLDN)$.
Since we work with continuous time but process discrete data, we need to discretize the model:
-{% katex %} h_t = \bar{A} h_{t-1} + \bar{B}x_t {% endkatex %}
-{% katex %} y_t = Ch_t {% endkatex %}
+$$ h_t = \bar{A} h_{t-1} + \bar{B}x_t $$
+$$ y_t = Ch_t $$
-- {% katex inline %} \bar{A} = f_A(\Delta, A) {% endkatex %}
-- {% katex inline %} \bar{B} = f_B(\Delta, A, B) {% endkatex %}
-- with {% katex inline %} f_A, f_B {% endkatex %} being discretization rules. For example we can use [Zero-Order hold](https://en.wikipedia.org/wiki/Zero-order_hold)
+- $\bar{A} = f_A(\Delta, A)$
+- $\bar{B} = f_B(\Delta, A, B)$
+- with $f_A, f_B$ being discretization rules. For example, we can use [Zero-Order hold](https://en.wikipedia.org/wiki/Zero-order_hold)
To actually compute this model, we use global convolution:
-{% katex %} y = x * \bar{K} {% endkatex %}
+$$ y = x * \bar{K} $$
- K is our kernel that is implicitly parametrized by an SSM
-{% katex %} \bar{K} = (C\bar{B}, C\bar{AB}, \cdots, C\bar{A}^k\bar{C}, \cdots) {% endkatex %}
+$$ \bar{K} = (C\bar{B}, C\bar{AB}, \cdots, C\bar{A}^k\bar{B}, \cdots)$$
-The benefit of this is that we can use Fast Fourier Transform to compute the convolution in {% katex inline %} O(N \log N){% endkatex %} time.
+The benefit of this is that we can use Fast Fourier Transform to compute the convolution in $O(N \log N)$ time.
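To make the discretization and the convolutional view concrete, here is a minimal sketch for a single input channel with a diagonal (and nonzero) $A$. This is my own illustration rather than code from the S4/Mamba repositories; the helper names `zoh_discretize` and `lti_ssm_conv` are made up, and only PyTorch is assumed.

```python
import torch

def zoh_discretize(A_diag, B, delta):
    # Zero-Order Hold for diagonal A: A_bar = exp(delta * A),
    # B_bar = (exp(delta * A) - 1) / A * B, elementwise over the N state dims.
    A_bar = torch.exp(delta * A_diag)
    B_bar = (A_bar - 1.0) / A_diag * B
    return A_bar, B_bar

def lti_ssm_conv(x, A_bar, B_bar, C):
    # Kernel entries K_k = C . (A_bar^k * B_bar); then y = x * K as a causal conv via FFT.
    L = x.shape[0]
    k = torch.arange(L, dtype=x.dtype)
    powers = A_bar[None, :] ** k[:, None]          # (L, N), row k holds A_bar^k
    K = (powers * B_bar[None, :]) @ C              # (L,)
    n = 2 * L                                      # zero-pad so the circular FFT conv acts like a linear one
    y = torch.fft.irfft(torch.fft.rfft(x, n) * torch.fft.rfft(K, n), n)[:L]
    return y

# Tiny usage example
N, L = 4, 8
A_diag, B, C = -torch.rand(N) - 0.1, torch.randn(N), torch.randn(N)
A_bar, B_bar = zoh_discretize(A_diag, B, delta=0.1)
y = lti_ssm_conv(torch.randn(L), A_bar, B_bar, C)
```

Since the parameters here do not depend on the input, this shortcut only works for the time-invariant case; once they become input-dependent, as in the selective SSM below, the convolutional view no longer applies.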
-#### Linear Time Invariance (LTI) +### Linear Time Invariance (LTI) -Just from the definition above, we can see that the {% katex inline %} (\Delta, A, B, C){% endkatex %} do not depend on {% katex inline %} x{% endkatex %} nor {% katex inline %} t{% endkatex %}. This is one of the main drawbacks and the reason why State Space Models were struggling with in-context learning. +Just from the definition above, we can see that the $(\Delta, A, B, C)$ do not depend on $x$ nor $t$. This is one of the main drawbacks and the reason why State Space Models were struggling with in-context learning. -### Selective State Space Models (S6) +## Selective State Space Models (S6) -![S6](https://n1o.github.io/images/selective_state_space_models.png) +![S6](/images/selective_state_space_models.png) -One easy fix is to take the same model as above but make the parameters {% katex inline %} \Delta, A, B{% endkatex %} functions of the input: +One easy fix is to take the same model as above but make the parameters $\Delta, A, B$ functions of the input: -#### Algorithm +### Algorithm -- we have input {% katex inline %} x: (B,L,D){% endkatex %} (Batch, Length, Dimension) -- output {% katex inline %} y: (B, L, D{% endkatex %}) +- we have input $x: (B,L,D)$ (Batch, Length, Dimension) +- output $y: (B, L, D)$ -1. {% katex inline %} A: (D,N) \leftarrow \text{Parameters} {% endkatex %} -2. {% katex inline %} B: (B,L,D) \leftarrow s_B(x) {% endkatex %} -3. {% katex inline %} C: (B,L,D) \leftarrow s_C(x) {% endkatex %} -4. {% katex inline %} \Delta: (B,L,N) \leftarrow \tau_{\Delta}(\text{Parameter} + s_{\Delta}(x)) {% endkatex %} -5. {% katex inline %} \bar{A}, \bar{B}: (B, L, D, N) \leftarrow \text{discretize}(A,B) {% endkatex %} -6: {% katex inline %} y \leftarrow \text{SSM}(\bar{A}, \bar{B}, C)(x) {% endkatex %} +1. $A: (D,N) \leftarrow \text{Parameters}$ +2. $B: (B,L,D) \leftarrow s_B(x)$ +3. $C: (B,L,D) \leftarrow s_C(x)$ +4. $\Delta: (B,L,N) \leftarrow \tau_{\Delta}(\text{Parameter} + s_{\Delta}(x))$ +5. $\bar{A}, \bar{B}: (B, L, D, N) \leftarrow \text{discretize}(A,B)$ +6: $y \leftarrow \text{SSM}(\bar{A}, \bar{B}, C)(x)$ -- {% katex inline %} A is still diagonal {% endkatex %} -- {% katex inline %} s_B(x) = \text{Linear}_N(x) {% endkatex %} -- {% katex inline %} s_C(x) = \text{Linear}_N(x) {% endkatex %} -- {% katex inline %} s_{\Delta} = \text{Broadcast}_D(\text{Linear}_1(x)){% endkatex %} (we choose this due to a connection to Recurrent Neural Networks) -- {% katex inline %} \tau_{\Delta} = \text{softplus} {% endkatex %} (we choose this due to a connection to Recurrent Neural Networks) -- {% katex inline %} \text{Linear}_d {% endkatex %} is parametrized projection to dimension d +- $A$ is still diagonal +- $s_B(x) = \text{Linear}_N(x)$ +- $s_C(x) = \text{Linear}_N(x)$ +- $s_{\Delta} = \text{Broadcast}_D(\text{Linear}_1(x))$ (we choose this due to a connection to Recurrent Neural Networks) +- $\tau_{\Delta} = \text{softplus}$ (we choose this due to a connection to Recurrent Neural Networks) +- $\text{Linear}_d$ is parametrized projection to dimension d -#### Selective Scan +### Selective Scan -Since the dynamics of the model are dynamic, we cannot use global convolution anymore. Because of this, we define selective scan, which is a hardware-aware algorithm. The actual implementation is rather [involved](https://github.com/state-spaces/mamba/blob/62db608da60f6fc790b8ed9f4b3225e95ca15fde/csrc/selective_scan/selective_scan_fwd_kernel.cuh). 
The main idea is that we load the parameters {% katex inline %} \Delta, A, B, C {% endkatex %} from HBM to SRAM, perform the discretization and recurrence in SRAM, and write the final output of size (B, L, D) back to main memory (HBM). To reduce memory requirements, the intermediate steps are not stored but recomputed during the backward pass.
+Since the dynamics of the model are dynamic, we cannot use global convolution anymore. Because of this, we define selective scan, which is a hardware-aware algorithm. The actual implementation is rather [involved](https://github.com/state-spaces/mamba/blob/62db608da60f6fc790b8ed9f4b3225e95ca15fde/csrc/selective_scan/selective_scan_fwd_kernel.cuh). The main idea is that we load the parameters $\Delta, A, B, C$ from HBM to SRAM, perform the discretization and recurrence in SRAM, and write the final output of size (B, L, D) back to main memory (HBM). To reduce memory requirements, the intermediate steps are not stored but recomputed during the backward pass.
-#### Benefits of (Natural) Selection
+### Benefits of (Natural) Selection
Because of the selection mechanism, the model can choose what to store (or not) in its hidden state based on what it currently sees. It may also choose to reset its hidden state and start over. Selection enables the model to have strong in-context learning capabilities.
-### Mamba Layer
+## Mamba Layer
The core of the Mamba architecture is the Mamba layer:
-![Mamba](https://n1o.github.io/images/mamba_layer.png)
+![Mamba](/images/mamba_layer.png)
We are already familiar with what is happening inside the SSM (Selective Scan) part of Mamba. Prior to it we have two projections that expand the dimensionality of the input; then we perform a short convolution, as in M2 Bert, with [torch.nn.Conv1d](https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html) on one branch, while on the other branch we apply just a SiLU non-linearity (this is the same as the gated approach found in other LLMs). After that we perform an additional projection, and we have all the inputs prepared for the SSM block. The output of the SSM block is then multiplied with the residual gate branch, and finally we project the dimension back to match the input dimension.
-## Mamba-2
+# Mamba-2
Mamba is a cool innovation, and it has led to multiple cool models, especially attention-SSM hybrid models like [Samba](https://github.com/microsoft/Samba) and [Zamba](https://github.com/Zyphra/transformers_zamba). However, the authors recognize some of its shortcomings. Its biggest weak point compared to Transformers is the lack of research in terms of scaling. For Transformers, we have multiple system optimizations on how to split up a model or how to split up processing long sequences into more GPUs. Here are two of them:
@@ -109,190 +115,190 @@ Mamba is a cool innovation, and it has led to multiple cool models, especially a
Mamba-2 is designed in a way that allows for Sequence Parallelism by passing the recurrent state between multiple GPUs. Tensor Parallelism is possible because of independent parallel projections of A, B, C, and X inputs of its SSM part.
-### Semi-Separable Matrices
+## Semi-Separable Matrices
-This is a special structured matrix.
We say that a lower triangular matrix $M$ is N-semi separable if every submatrix contained in the lower triangular part has rank at most N. Here, we are more interested in a special representation of N-semi separable called Sequentially Semi Separable (SSS). -#### Sequentially Semi Separable (N-SSS) +### Sequentially Semi Separable (N-SSS) -A lower triangular matrix {% katex inline %} M \in R^{(T,T)} {% endkatex %} has an N-sequentially semiseparable representation if we can write it as: +A lower triangular matrix $M \in R^{(T,T)}$ has an N-sequentially semiseparable representation if we can write it as: -{% katex %} M_{ij} = C_j^TA_j \cdots A_{i+1}B_i {% endkatex %} +$$ M_{ij} = C_j^TA_j \cdots A_{i+1}B_i$$ -- {% katex inline %} B_0, \cdots, B_{T - 1}, C_0, \cdots, C_{T-1} \in R^N {% endkatex %} are vectors -- {% katex inline %} A_0, \cdots, A_{T-1} \in R^{(N,N)} {% endkatex %} +- $B_0, \cdots, B_{T - 1}, C_0, \cdots, C_{T-1} \in R^N$ are vectors +- $A_0, \cdots, A_{T-1} \in R^{(N,N)} $ To express it in matrix form we define the SSS operator: -{% katex %} M = SSS(A_{0:T}, B_{0:T}, C_{0:T}) {% endkatex %} +$$ M = SSS(A_{0:T}, B_{0:T}, C_{0:T})$$ -It turns out that every N-semiseparable matrix M is also an N-sequentially semiseparable matrix. The main const of N-SSS representation that we can compress down the parameters to {% katex inline %} O(NT) {% endkatex %} +It turns out that every N-semiseparable matrix M is also an N-sequentially semiseparable matrix. The main const of N-SSS representation that we can compress down the parameters to $O(NT)$ -### State Space Duality +## State Space Duality Let's start by exploring a special case of 1-semiseparable (1-SS or just 1SS). This can be written in the Sequentially Semi-Separable form as: -{% katex %} SSS(a,b,c) = \text{diag}(c) \cdot M \cdot \text{diag}(b) {% endkatex %} -- {% katex inline %} M_{ij} = \prod_{t=j}^i a_t = a_{j:i}^{\times} {% endkatex %} +$$SSS(a,b,c) = \text{diag}(c) \cdot M \cdot \text{diag}(b) $$ +- $M_{ij} = \prod_{t=j}^i a_t = a_{j:i}^{\times}$ M is an 1-SS -{% katex %} M = 1SS(a_{0:T}) = \begin{bmatrix} 1 \\\ a_1 && 1 \\\ a_{2}a_1 && a_2 && 1 \\\ \vdots && \vdots && \ddots && \ddots \\\ a_{T-1}\cdots a_1 && a_{T-1}a_2 && \cdots && a_{T-1} && 1 \end{bmatrix} {% endkatex %} +$$M = 1SS(a_{0:T}) = \begin{bmatrix} 1 \\\ a_1 && 1 \\\ a_{2}a_1 && a_2 && 1 \\\ \vdots && \vdots && \ddots && \ddots \\\ a_{T-1}\cdots a_1 && a_{T-1}a_2 && \cdots && a_{T-1} && 1 \end{bmatrix}$$ -#### State Space Models are Separable Matrices +### State Space Models are Separable Matrices -We make a special assumption that we have a State Space Model without projections (no B, C) and the state dimension {% katex inline %} N = 1 {% endkatex %}. Then we can express the multiplication {% katex inline %} y = Mx {% endkatex %} as a recurrence: +We make a special assumption that we have a State Space Model without projections (no B, C) and the state dimension $N = 1$. 
Then we can express the multiplication $y = Mx$ as a recurrence:
-{% katex %} y_t = a_{t:0}x_0 + \cdots + a_{t:t}x_t {% endkatex %}
-{% katex %} y_t = a_t(a_{t-1:0}x_0 \cdots a_{t-1:t-1}x_{t-1} + a_{t:t}x_t {% endkatex %}
-{% katex %} y_t = a_t y_{t-1} + x_t {% endkatex %}
+$$y_t = a_{t:0}x_0 + \cdots + a_{t:t}x_t $$
+$$y_t = a_t(a_{t-1:0}x_0 + \cdots + a_{t-1:t-1}x_{t-1}) + a_{t:t}x_t $$
+$$y_t = a_t y_{t-1} + x_t$$
We can generalize this further by expressing any State Space Model as matrix multiplication by an N-semiseparable matrix in a sequentially semiseparable form:
-{% katex %} y = SSM(A,B,C)(x) = SSS(A,B,C) \cdot x {% endkatex %}
+$$y = SSM(A,B,C)(x) = SSS(A,B,C) \cdot x $$
-### Linear(Recurrent) and Dual(Quadratic) form
+## Linear(Recurrent) and Dual(Quadratic) form
We already know we can express a State Space model as a matrix multiplication by an N-separable matrix in a sequentially semiseparable form:
-{% katex %} y = SSS(A,B,C) \cdot x {% endkatex %}
+$$ y = SSS(A,B,C) \cdot x $$
-However, if we naively first compute the {% katex inline %} SSS {% endkatex %} part and then multiply by {% katex inline %} x {% endkatex %}, we end up with an {% katex inline %} O(T^2) {% endkatex %} complexity. There is a more efficient recurrent way. However, let's break down the quadratic form first, since it has a tight connection to Attention.
+However, if we naively first compute the $SSS$ part and then multiply by $x$, we end up with an $O(T^2)$ complexity. There is a more efficient recurrent way. However, let's break down the quadratic form first, since it has a tight connection to Attention.
-#### Dual (Quadratic) Form
+### Dual (Quadratic) Form
Here, we take a small detour from SSMs and look into Linear Attention. We can express the attention mechanism as:
-{% katex %} Y = \text{softmax}(QK^T) V {% endkatex %}
+$$Y = \text{softmax}(QK^T) V $$
This is the most common form of attention, called Softmax Attention. By applying a causal mask, we get the following:
-{% katex %} Y = (L \circ \text{softmax}(QK^T)) \cdot V {% endkatex %}
+$$Y = (L \circ \text{softmax}(QK^T)) \cdot V $$
-- {% katex inline %} L {% endkatex %} is an lower triangular matrix with ones on and below the main diagonal
+- $L$ is a lower triangular matrix with ones on and below the main diagonal
In linear attention we drop the softmax to get:
-{% katex %} Y = (L \circ (QK^T)) \cdot V {% endkatex %}
+$$Y = (L \circ (QK^T)) \cdot V $$
This form is way nicer and we can rewrite it using einsum as:
-{% katex %} Y = \text{einsum}(TN,SN,SP, TS \rightarrow TP)(Q,K,V,L) {% endkatex %}
+$$Y = \text{einsum}(TN,SN,SP, TS \rightarrow TP)(Q,K,V,L)$$
Or we can express it as pairwise matrix multiplication:
-1. {% katex inline %} G = \text{einsum}(TN,SN \rightarrow TS)(Q,K) {% endkatex %} resulting shape (T,S)
-2. {% katex inline %} M = \text{einsum}(TS,TS \rightarrow TS)(G,L) {% endkatex %} resulting shape (T,S)
-3. {% katex inline %} Y = \text{einsum}(TS,SP \rightarrow TP)(M,V) {% endkatex %} resulting shape (T,P)
+1. $G = \text{einsum}(TN,SN \rightarrow TS)(Q,K)$ resulting shape (T,S)
+2. $M = \text{einsum}(TS,TS \rightarrow TS)(G,L)$ resulting shape (T,S)
+3. $Y = \text{einsum}(TS,SP \rightarrow TP)(M,V)$ resulting shape (T,P)
- T, S are the target and source dimensions; for autoregressive self-attention they are the same
- P is the head dimensionality
-#### Linear (Recurrent) Form
+### Linear (Recurrent) Form
Until now, we have just removed the softmax operation.
However, we can go further by changing the order of matrix association, resulting in the following:
-{% katex %} (QK^T)V = Q(K^TV) {% endkatex %}
+$$(QK^T)V = Q(K^TV) $$
-With this, we can re-express the definition of {% katex inline %} Y {% endkatex %} as:
+With this, we can re-express the definition of $Y$ as:
-{% katex %} Y = Q \cdot \text{cumsum}(K^TV) {% endkatex %}
+$$ Y = Q \cdot \text{cumsum}(K^TV)$$
- cumsum is just the cumulative sum
It may seem that we got rid of the causal mask. This is technically not true, since the cumsum operation is a causal operation, and we just hid it. To make this clearer, we can express the same equation using einsum:
-1. {% katex inline %} Z = \text{einsum}(SP,SN \rightarrow SPN)(V,K) {% endkatex %} resulting shape (S,P,N)
-2. {% katex inline %} H = \text{einsum}(TS,SPN \rightarrow TPN)(V,K) {% endkatex %} resulting shape (T,P,N) this being optimized with subquadratic matrix multiplication
-3. {% katex inline %} Y = \text{einsum}(TN,TPN \rightarrow TP)(V,K) {% endkatex %} resulting shape (T,P)
+1. $Z = \text{einsum}(SP,SN \rightarrow SPN)(V,K)$ resulting shape (S,P,N)
+2. $H = \text{einsum}(TS,SPN \rightarrow TPN)(L,Z)$ resulting shape (T,P,N); this can be optimized with subquadratic matrix multiplication
+3. $Y = \text{einsum}(TN,TPN \rightarrow TP)(Q,H)$ resulting shape (T,P)
Let's break down the equations:
1. Expands the dimensionality by a factor N
2. Uses the mask matrix L explicitly; we flatten the dimensions of (P,N), resulting in multiplying a lower triangular matrix with a vector. This is just a cumulative sum operation:
-{% katex %} y = \begin{bmatrix} 1 \\\ \cdots && \ddots \\\ 1 && \cdots && 1 \end{bmatrix}x \Leftrightarrow \begin{matrix} y_0 = x_0 \\\ y_t = y_{t-1} + x_t\end{matrix} {% endkatex %}
+$$ y = \begin{bmatrix} 1 \\\ \cdots && \ddots \\\ 1 && \cdots && 1 \end{bmatrix}x \Leftrightarrow \begin{matrix} y_0 = x_0 \\\ y_t = y_{t-1} + x_t\end{matrix} $$
3. Contracts the dimensionality back to P
-### State Space Models and Recurrent Linear Attention
+## State Space Models and Recurrent Linear Attention
The hints that there should be a connection between the recurrent form of Linear Attention and the State Space Model should be obvious. Let's remind ourselves of the definition of the State Space Model using SSS:
-{% katex %} Y = SSS(A,B,C) \cdot x {% endkatex %}
+$$ Y = SSS(A,B,C) \cdot x $$
The SSS matrix M is defined as:
-- {% katex inline %} M_{ji} = C_j^TA_{j:i}B_i{% endkatex %}
+- $M_{ji} = C_j^TA_{j:i}B_i$
-By constraining the A matrix to be diagonal {% katex inline %} A = aI {% endkatex %} we can rearrange the terms a bit to get:
+By constraining the A matrix to be diagonal $A = aI$ we can rearrange the terms a bit to get:
-{% katex %} M_{ji} = A_{j:i} \cdot (C_j^TB_i) {% endkatex %}
+$$ M_{ji} = A_{j:i} \cdot (C_j^TB_i)$$
The equation for M in matrix form becomes:
-{% katex %} L = 1SS(a) {% endkatex %}
-{% katex %} M = L \circ (CB^T) {% endkatex %}
+$$L = 1SS(a)$$
+$$M = L \circ (CB^T)$$
-- {% katex inline %} B,C \in R^{(T,N) {% endkatex %}}
+- $B,C \in R^{(T,N)}$
-Now we can compute {% katex inline %} Y = MX {% endkatex %} using einsum as:
+Now we can compute $Y = MX$ using einsum as:
-1. {% katex inline %} G = \text{einsum}(TN,SN \rightarrow TS)(C,B) {% endkatex %} resulting shape (T,S)
-2. {% katex inline %} M = \text{einsum}(TS,TS \rightarrow TS)(G,L) {% endkatex %} resulting shape (T,S)
-3. {% katex inline %} Y = \text{einsum}(TS,SP \rightarrow TP)(M,X) {% endkatex %} resulting shape (T,P)
+1.
$G = \text{einsum}(TN,SN \rightarrow TS)(C,B)$ resulting shape (T,S) +2. $M = \text{einsum}(TS,TS \rightarrow TS)(G,L)$ resulting shape (T,S) +3. $Y = \text{einsum}(TS,SP \rightarrow TP)(M,X)$ resulting shape (T,P) If we assume that S = T, we end up with the same equations as in the Recurrent form of Linear Attention. And that is it, we have our duality. -### Mamba-2 Layer +## Mamba-2 Layer -At the beginning, I mentioned that there are few differences between Mamba and Mamba-2. One of them is a stronger constraint on the matrix A, for Mamba-2 it is {% katex inline %} A = aI {% endkatex %} in Mamba it was {% katex inline %} A = \text{diag}(a) {% endkatex %}. The reason to constrain to {% katex inline %} A = aI {% endkatex %} is that we can express the SSM as a matrix multiplication of an 1-SS matrix, which is more efficient to compute. +At the beginning, I mentioned that there are few differences between Mamba and Mamba-2. One of them is a stronger constraint on the matrix A, for Mamba-2 it is $A = aI$ in Mamba it was $A = \text{diag}(a)$. The reason to constrain to $A = aI$ is that we can express the SSM as a matrix multiplication of an 1-SS matrix, which is more efficient to compute. -![S6](https://n1o.github.io/images/mamba_2_architecture.png) +![S6](/images/mamba_2_architecture.png) -In the image above, we can see the differences between Mamba and Mamba-2. While the idea of Mamba was to have a function {% katex inline %} X \rightarrow Y {% endkatex %}, in Mamba-2, we instead think of a mapping of {% katex inline %} A, B, C, X \rightarrow Y {% endkatex %}. Because of this, we can parallelize the computation of the projections at the beginning of the block. This enables tensor parallelism and reduces the number of parameters. This is also analogous to Attention, where {% katex inline %} X, B, C {% endkatex %} correspond to {% katex inline %} Q, K, V {% endkatex %}. +In the image above, we can see the differences between Mamba and Mamba-2. While the idea of Mamba was to have a function $X \rightarrow Y$, in Mamba-2, we instead think of a mapping of $A, B, C, X \rightarrow Y$. Because of this, we can parallelize the computation of the projections at the beginning of the block. This enables tensor parallelism and reduces the number of parameters. This is also analogous to Attention, where $X, B, C$ correspond to $Q, K, V$. -Additionally, Mamba-2 introduces a larger head dimension {% katex inline %} P {% endkatex %}. While Mamba leverages {% katex inline %} P =1 {% endkatex %}, Mamba-2 leverages {% katex inline %} P = \{64, 128\} {% endkatex %}. Again, this is similar to conventions in Transformer Architecture. What does this head dimension in Mamba mean? If we have a head dimension of 1, we are computing an SSM for each channel independently. By increasing the head dimension, we achieve a sort of weight-tying where we share SSMs across multiple channels. +Additionally, Mamba-2 introduces a larger head dimension $P$. While Mamba leverages $P =1 $, Mamba-2 leverages $P = \{64, 128\}$. Again, this is similar to conventions in Transformer Architecture. What does this head dimension in Mamba mean? If we have a head dimension of 1, we are computing an SSM for each channel independently. By increasing the head dimension, we achieve a sort of weight-tying where we share SSMs across multiple channels. -Overall, it may seem that Mamba-2 is less expressive than Mamba. 
However, due to optimizations, we are able to train Mamba-2 models with much larger state dimensions (in Mamba-1 we had {% katex inline %} N=16 {% endkatex %}, whereas in Mamba-2 we can go up to {% katex inline %} N=256 {% endkatex %} or more), while also being much faster during training. +Overall, it may seem that Mamba-2 is less expressive than Mamba. However, due to optimizations, we are able to train Mamba-2 models with much larger state dimensions (in Mamba-1 we had $N=16$, whereas in Mamba-2 we can go up to $N=256$ or more), while also being much faster during training. The model also adds an additional normalization layer, which improves the stability of larger models. There is nothing more to say about Mamba-2; it is simply a more efficient version of Mamba, incorporating many lessons learned from Transformers and the strong theoretical foundation behind SSMs and Semi-Separable Matrices. -#### Algorithm +### Algorithm -As with Mamba-1, we cannot use Global Convolution. For Mamba-2, we need an efficient way to compute the matrix {% katex inline %} M {% endkatex %}. Luckily, the computation is much simpler than for Mamba-1, and we do not need to implement a low-level GPU kernel. The algorithm consists mostly of matrix multiplications. +As with Mamba-1, we cannot use Global Convolution. For Mamba-2, we need an efficient way to compute the matrix $M$. Luckily, the computation is much simpler than for Mamba-1, and we do not need to implement a low-level GPU kernel. The algorithm consists mostly of matrix multiplications. -![Mamba-2 Blocks](https://n1o.github.io/images/mamba_2_diagonal_off_diagonal_blocks.png) +![Mamba-2 Blocks](/images/mamba_2_diagonal_off_diagonal_blocks.png) -This is an example for {% katex inline %} T=9 {% endkatex %} where we decompose it into chunks of length {% katex inline %} Q = 3 {% endkatex %}, we can generalize it as: +This is an example for $T=9$ where we decompose it into chunks of length $Q = 3$, we can generalize it as: -1. {% katex inline %} M^{(j,j)} = SSM(A_{jQ:(j+1)Q},B_{jQ(j+1)Q},C_{jQ:(j+1)Q} {% endkatex %} for the diagonal blocks -2. {% katex inline %} M^{(i,j)} = \begin{bmatrix}C_{jQ}^TA_{jQ:jQ-1} \\ \vdots \\ C^T_{(j+1)Q-1}A_{(j+1)Q-1:jQ-1}\end{bmatrix}A_{jQ-1:(i+1)Q-1} \begin{bmatrix}B_{iQ}^TA_{(i+1)Q-1:iQ} \\ \vdots \\ B_{(i+1)Q-1}^T A_{(i+1)Q-1:(i+1)Q-1}\end{bmatrix}^T {% endkatex %} for the off-diagonal low rank blocks +1. $M^{(j,j)} = SSM(A_{jQ:(j+1)Q},B_{jQ(j+1)Q},C_{jQ:(j+1)Q}$ for the diagonal blocks +2. $M^{(i,j)} = \begin{bmatrix}C_{jQ}^TA_{jQ:jQ-1} \\ \vdots \\ C^T_{(j+1)Q-1}A_{(j+1)Q-1:jQ-1}\end{bmatrix}A_{jQ-1:(i+1)Q-1} \begin{bmatrix}B_{iQ}^TA_{(i+1)Q-1:iQ} \\ \vdots \\ B_{(i+1)Q-1}^T A_{(i+1)Q-1:(i+1)Q-1}\end{bmatrix}^T$ for the off-diagonal low rank blocks -##### Diagonal Blocks +#### Diagonal Blocks -The general idea is that {% katex inline %} Q {% endkatex %} is rather small. Because of this, we can use the dual quadratic form of Structured Masked Attention (more on this later) and perform the computation for each block in parallel. +The general idea is that $Q$ is rather small. Because of this, we can use the dual quadratic form of Structured Masked Attention (more on this later) and perform the computation for each block in parallel. -##### Low Rank Blocks +#### Low Rank Blocks Here, we have three parts (the following example is the breakdown of the leftmost bottom block from the image above): -1. 
{% katex inline %} \begin{bmatrix} C_6^T A_{6:5} \\\ C_7^TA_{7:5} \\\ C_8^TA_{8:5} \end{bmatrix}^T {% endkatex %} this are the left factors (C-block factors) -2. {% katex inline %} A_{5:2} {% endkatex %} this are the center factors (A-block factors) -3. {% katex inline %} \begin{bmatrix} B_0^T A_{2:0} \\\ B_1^TA_{2:1} \\\ B_2^TA_{2:2} \end{bmatrix}^T {% endkatex %} this are the right factors (B-block factors) +1. $\begin{bmatrix} C_6^T A_{6:5} \\\ C_7^TA_{7:5} \\\ C_8^TA_{8:5} \end{bmatrix}^T$ this are the left factors (C-block factors) +2. $A_{5:2}$ this are the center factors (A-block factors) +3. $\begin{bmatrix} B_0^T A_{2:0} \\\ B_1^TA_{2:1} \\\ B_2^TA_{2:2} \end{bmatrix}^T$ this are the right factors (B-block factors) -##### Pytorch +#### Pytorch Compared to Mamba-1's selective scan the implementation is way more straight forward: @@ -320,76 +326,76 @@ def ssd(X, A, B, C, block_len=64, initial_states=None): """ assert X.dtype == A.dtype == B.dtype == C.dtype assert X.shape[1] % block_len == 0 - ## Rearrange into blocks/chunks + # Rearrange into blocks/chunks X, A, B, C = [rearrange(x, "b (c l) ... -> b c l ...", l=block_len) for x in (X, A, B, C)] A = rearrange(A, "b c l h -> b h c l") A_cumsum = torch.cumsum(A, dim=-1) - ## 1. Compute the output for each intra-chunk (diagonal blocks) + # 1. Compute the output for each intra-chunk (diagonal blocks) L = torch.exp(segsum(A)) Y_diag = torch.einsum("bclhn,bcshn,bhcls,bcshp->bclhp", C, B, L, X) - ## 2. Compute the state for each intra-chunk - ## (right term of low-rank factorization of off-diagonal blocks; B terms) + # 2. Compute the state for each intra-chunk + # (right term of low-rank factorization of off-diagonal blocks; B terms) decay_states = torch.exp((A_cumsum[:, :, :, -1:] - A_cumsum)) states = torch.einsum("bclhn,bhcl,bclhp->bchpn", B, decay_states, X) - ## 3. Compute the inter-chunk SSM recurrence; produces correct SSM states at chunk boundaries - ## (middle term of factorization of off-diag blocks; A terms) + # 3. Compute the inter-chunk SSM recurrence; produces correct SSM states at chunk boundaries + # (middle term of factorization of off-diag blocks; A terms) if initial_states is None: initial_states = torch.zeros_like(states[:, :1]) states = torch.cat([initial_states, states], dim=1) decay_chunk = torch.exp(segsum(F.pad(A_cumsum[:, :, :, -1], (1, 0)))) new_states = torch.einsum("bhzc,bchpn->bzhpn", decay_chunk, states) states, final_state = new_states[:, :-1], new_states[:, -1] - ## 4. Compute state -> output conversion per chunk - ## (left term of low-rank factorization of off-diagonal blocks; C terms) + # 4. Compute state -> output conversion per chunk + # (left term of low-rank factorization of off-diagonal blocks; C terms) state_decay_out = torch.exp(A_cumsum) Y_off = torch.einsum('bclhn,bchpn,bhcl->bclhp', C, states, state_decay_out) - ## Add output of intra-chunk and inter-chunk terms (diagonal and off-diagonal blocks) + # Add output of intra-chunk and inter-chunk terms (diagonal and off-diagonal blocks) Y = rearrange(Y_diag+Y_off, "b c l h p -> b (c l) h p") return Y, final_state ``` -##### Performance +#### Performance Mamba-2, like Mamba-1, with hidden state size \( N \), has the same training speed \( O(TN^2) \) and inference speed \( O(N^2) \). However, the biggest improvement is the use of matrix multiplication in Mamba-2, which is much more efficient than the selective scan in Mamba-1. 
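One note on the `ssd` listing above: it is not fully self-contained. It assumes the usual imports (`torch`, `torch.nn.functional as F`, `einops.rearrange`/`repeat`) and it calls a `segsum` helper that is not shown. Below is a minimal reconstruction of what that helper has to compute (my own sketch; the reference implementation may differ in details): a masked cumulative sum such that `torch.exp(segsum(A))` yields the lower triangular 1-SS decay matrix used as the mask L.

```python
import torch
from einops import repeat

def segsum(x):
    # x: (..., T) of per-step log-decays. Returns (..., T, T) where
    # out[..., i, j] = x[..., j+1] + ... + x[..., i] for i >= j (0 on the diagonal)
    # and -inf above the diagonal, so exp(out) is the causal 1-SS mask.
    T = x.size(-1)
    x = repeat(x, "... t -> ... t s", s=T)
    lower = torch.tril(torch.ones(T, T, dtype=torch.bool, device=x.device), diagonal=-1)
    x = x.masked_fill(~lower, 0)
    out = torch.cumsum(x, dim=-2)
    keep = torch.tril(torch.ones(T, T, dtype=torch.bool, device=x.device), diagonal=0)
    return out.masked_fill(~keep, float("-inf"))
```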
-## State Space Duality Additional Notes
+# State Space Duality Additional Notes
Overall, the State Space Duality paper introduces many concepts; here are arguably the most important ones:
-### Structured Masked Attention
+## Structured Masked Attention
This builds upon the notion of linear attention, where we expressed the causal mask matrix L as a cumulative sum. However, we can generalize the mask matrix L to any matrix that supports fast matrix multiplication.
-![[https://n1o.github.io/images/structured_attention.png]]
+![Structured Masked Attention](/images/structured_attention.png)
In this case, we view the attention mechanism through the following equations (this is also the quadratic form mentioned earlier):
-{% katex %} Y = MV {% endkatex %}
+$$Y = MV$$
-{% katex %} M = QK^T \circ L {% endkatex %}
+$$M = QK^T \circ L$$
Where L is our mask matrix, which we can choose as we like. In the context of State Space duality, we choose it to be a 1-semiseparable matrix.
-### Multi patterns for SSMs
+## Multi patterns for SSMs
Again, this builds upon analogies to Attention, where multihead attention involves applying self-attention multiple times and concatenating the results. We can achieve something similar by applying the SSD algorithm and broadcasting it across multiple dimensions.
-#### Multi-Contract SSM
+### Multi-Contract SSM
This is analogous to Multi-Query Attention, where we share K and V across all the heads of Q. For attention, this makes a lot of sense since we cache K and V pairs. In SSMs, this is equivalent to sharing X and B across multiple heads of the SSM, and having C (parameters that control the contraction) be independent per head.
-#### Multi-Expand SSM
+### Multi-Expand SSM
Here, we share C and X across multiple heads, and B (controls expansion) is independent per head.
-#### Multi-Input SSM
+### Multi-Input SSM
Here, we share B and C across multiple heads, and X is independent. For an SSM like Mamba, we consider X as the input. Because of this, it is a better fit to have a unique X per head. Technically, we can view the S6 layer introduced in Mamba as having Head Dimension P = 1, which means that each channel has independent SSM dynamics A, and it is a Multi-Input SSM where we share B and C matrices across all channels.
-## TLDR;
+# TLDR;
This was probably a lot to take in, so to sum it up, we introduced Mamba. Mamba is a State Space model whose dynamics are dependent on the input, which improves its ability for In-Context learning. Because we want efficient computation, we need to derive a hardware-efficient algorithm, and to do that, we need to enforce structure on the matrices used by Mamba. Mamba-2 tackles the efficiency problem by enforcing even more constraints, making the model more constrained but easier to scale and allowing for larger hidden dimensions.
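As a final sanity check of the duality discussed above, here is a tiny numerical sketch: for a single channel with scalar decays $A_t = a_t I$ (and discretization already folded into the parameters), the quadratic "masked attention" form $Y = (L \circ CB^T)x$ and the recurrent SSM form produce the same output. This is my own illustration; all names and sizes are made up for the example.

```python
import torch

torch.manual_seed(0)
T, N = 6, 4
a = torch.rand(T)            # per-step scalar decay, A_t = a_t * I
B = torch.randn(T, N)        # input projections B_t
C = torch.randn(T, N)        # output projections C_t
x = torch.randn(T)           # one input channel

# Quadratic (dual) form: L[i, j] = a_{j+1} * ... * a_i for i >= j, else 0
L = torch.zeros(T, T)
for i in range(T):
    for j in range(i + 1):
        L[i, j] = a[j + 1:i + 1].prod()   # empty product on the diagonal gives 1
y_quad = (L * (C @ B.T)) @ x

# Linear (recurrent) form: h_t = a_t * h_{t-1} + B_t * x_t, y_t = C_t . h_t
h = torch.zeros(N)
y_rec = torch.zeros(T)
for t in range(T):
    h = a[t] * h + B[t] * x[t]
    y_rec[t] = C[t] @ h

print(torch.allclose(y_quad, y_rec, atol=1e-5))  # True
```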
From 8bfb4aab2b388c60f6d7c3fd163f21b8179f7975 Mon Sep 17 00:00:00 2001 From: mbarak Date: Fri, 6 Sep 2024 11:13:12 +0200 Subject: [PATCH 3/3] Enable aswesome ssm --- config.toml | 9 +++++++-- content/awesome-SSM.md | 2 +- 2 files changed, 8 insertions(+), 3 deletions(-) diff --git a/config.toml b/config.toml index 340bd63..b2da5e2 100644 --- a/config.toml +++ b/config.toml @@ -115,11 +115,16 @@ disqusShortname = "mbarak-io" url = "awesome-t5/" [[languages.en.menu.main]] - name = "Projects" + name = "Awesome SSM" weight = 4 + url = "awesome-ssm/" + + [[languages.en.menu.main]] + name = "Projects" + weight = 5 url = "projects/" [[languages.en.menu.main]] name = "Contact me" - weight = 5 + weight = 6 url = "contact/" diff --git a/content/awesome-SSM.md b/content/awesome-SSM.md index ca3d879..cea210f 100644 --- a/content/awesome-SSM.md +++ b/content/awesome-SSM.md @@ -4,7 +4,7 @@ date: 2024-09-05T10:39:29+01:00 draft: false --- -Similarly to the [Awesome T5]({{< relref "posts/awesome-t5.md" >}}) series, this series will cover a bunch of posts about State Space Models, their extensions and applications. +This series will cover a bunch of posts about State Space Models, their extensions and applications. # Basics - [Mamba, Mamba2]({{< relref "posts/from-mamba-to-mamba2.md" >}})