fixup docker config/lib, clear cache
Closes: #1
matatonic committed Sep 18, 2024
1 parent daa7483 commit 09857c9
Showing 3 changed files with 19 additions and 4 deletions.
4 changes: 2 additions & 2 deletions Dockerfile
```diff
@@ -6,11 +6,11 @@ RUN apt-get update && apt-get install --no-install-recommends -y \
   && apt-get clean && rm -rf /var/lib/apt/lists/*
 
 WORKDIR /app
-RUN mkdir config models lora
+RUN mkdir -p config/lib models lora
 COPY requirements.txt .
 #RUN --mount=type=cache,target=/root/.cache/pip pip install -r requirements.txt
 RUN pip --no-cache install -r requirements.txt
-COPY config/lib /app/config/
+COPY config/lib /app/config/lib
 COPY *.py *.json LICENSE /app/
 
 ENV CLI_COMMAND="python images.py"
```
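The Dockerfile fix above hinges on a `COPY` detail: when the source is a directory, Docker copies the directory's contents, not the directory itself, so the old destination dropped the `lib` level. A minimal illustration (the `settings.json` file name is hypothetical):

```dockerfile
# Assumes a build context containing config/lib/settings.json (hypothetical file).
# COPY of a directory copies its *contents*, not the directory itself.
# This produces /app/config/settings.json (the lib/ level is lost):
COPY config/lib /app/config/
# This produces /app/config/lib/settings.json, as intended:
COPY config/lib /app/config/lib
```

Naming the destination directory explicitly, as the commit does, keeps the path stable regardless of this flattening behavior.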
13 changes: 13 additions & 0 deletions README.md
```diff
@@ -27,6 +27,19 @@ An OpenAI API compatible image generation server for the FLUX.1 family of models
 - [ ] **Easy to setup and use**: Maybe?
 
 
+## Recent Updates
+
+#### 2024-09-18
+- fixup Docker config/lib path, thanks @[nicolaschan](https://github.com/nicolaschan)
+
+<details>
+<summary>Click to expand for older updates.</summary>
+
+#### 2024-09-05
+- Initial release
+
+</details>
+
 ## Quickstart
 
 > This is brand new software, if you find any problems or have suggestions please open a [new issue](https://github.com/matatonic/openedai-images-flux/issues) on GitHub!
```
6 changes: 4 additions & 2 deletions images.py
```diff
@@ -267,8 +267,10 @@ async def generate_images(pipe, **generation_kwargs) -> list:
 
     generation_kwargs['generator'] = torch.Generator("cpu").manual_seed(seed)
 
-    return pipe(**generation_kwargs).images, seed
-
+    try:
+        return pipe(**generation_kwargs).images, seed
+    finally:
+        torch.cuda.empty_cache()
 
 async def enhance_prompt(prompt: str, **enhancer) -> str:
     enhancer['messages'].extend([{'role': 'user', 'content': prompt }])
```
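The change above wraps the pipeline call in `try`/`finally` so the cache is cleared on every exit path. A minimal sketch (not the project's code) of that pattern: in the real server the cleanup is `torch.cuda.empty_cache()`, which releases cached GPU memory; here a plain counter stands in so the sketch runs without a GPU.

```python
cleanup_calls = 0

def release_cache():
    """Stand-in for torch.cuda.empty_cache()."""
    global cleanup_calls
    cleanup_calls += 1

def generate_images(fail: bool):
    try:
        if fail:
            raise RuntimeError("generation failed")
        # The return value is computed first, then the finally block runs.
        return ["image-0"]
    finally:
        release_cache()  # always executes, on both the return and raise paths

images = generate_images(fail=False)   # normal exit: cleanup runs once
try:
    generate_images(fail=True)         # exception exit: cleanup still runs
except RuntimeError:
    pass
```

Because `finally` fires even when the function returns its result, the cache is cleared after every generation without needing a temporary variable to hold the images.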
