Update deferred semaphore docstring #1375

Closed · wants to merge 2 commits
Changes from 1 commit
truss/templates/server/model_wrapper.py (14 changes: 7 additions & 7 deletions)

@@ -84,21 +84,21 @@ async def deferred_semaphore_and_span(
"""
await semaphore.acquire()
trace.use_span(span, end_on_exit=False)
deferred = False
released = False

def release_and_end() -> None:
nonlocal released
released = True
semaphore.release()
span.end()

def defer() -> Callable[[], None]:
nonlocal deferred
deferred = True
return release_and_end

try:
yield defer
finally:
if not deferred:
if not released:
Collaborator:
Is this correct? Regardless of whether "defer" is called or not, it looks like we still call "release_and_end".

In the case that defer is called, wouldn't we want to not immediately release the semaphore? I feel like we're losing the deferred behavior, unless I'm missing something.

Contributor Author:
Wow, you're 100% right, great catch! I didn't realize we could get both the option to defer cleanup and automatic cleanup if we choose not to invoke it. The original behavior makes sense to me, so I've just adjusted the docstring to reflect the current behavior.

             release_and_end()
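For context on the discussion above, the pre-PR pattern works roughly as sketched below. This is a minimal, self-contained approximation, not the exact Truss source: the span handling is omitted and the helper name deferred_semaphore is made up here. If the caller never invokes the yielded defer function, cleanup runs automatically on exit; if the caller does invoke it, the caller takes ownership of the returned cleanup function and must call it later.

import asyncio
from contextlib import asynccontextmanager
from typing import Callable

@asynccontextmanager
async def deferred_semaphore(semaphore: asyncio.Semaphore):
    await semaphore.acquire()
    deferred = False

    def release() -> None:
        semaphore.release()

    def defer() -> Callable[[], None]:
        # Calling defer() hands ownership of `release` to the caller and
        # suppresses the automatic release in the finally block.
        nonlocal deferred
        deferred = True
        return release

    try:
        yield defer
    finally:
        if not deferred:
            release()

With the first commit's flag change, released only becomes True once release_and_end has actually run, so the finally block releases the semaphore as soon as the async with block exits even when defer was called, which is the lost behavior the reviewer points out.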


@@ -738,13 +738,13 @@ async def _handle_generator_response(
         generator: Union[Generator[bytes, None, None], AsyncGenerator[bytes, None]],
         span: trace.Span,
         trace_ctx: trace.Context,
-        get_cleanup_fn: Callable[[], Callable[[], None]] = lambda: lambda: None,
+        cleanup_fn: Callable[[], None] = lambda: None,
     ):
         if self._should_gather_generator(request):
             return await _gather_generator(generator)
         else:
             return await self._stream_with_background_task(
-                generator, span, trace_ctx, cleanup_fn=get_cleanup_fn()
+                generator, span, trace_ctx, cleanup_fn=cleanup_fn
             )

     async def completions(
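A side note on the signature change above: the original get_cleanup_fn parameter is a factory whose invocation marks cleanup as deferred and returns the real cleanup function, which is why its default is the slightly cryptic lambda: lambda: None. A tiny standalone illustration of the two defaults, with names taken from the diff:

from typing import Callable

# Original parameter: a factory returning the cleanup function.
get_cleanup_fn: Callable[[], Callable[[], None]] = lambda: lambda: None
cleanup_fn = get_cleanup_fn()  # produces a no-op cleanup function
cleanup_fn()                   # does nothing

# First-commit parameter: the cleanup function itself.
direct_cleanup_fn: Callable[[], None] = lambda: None
direct_cleanup_fn()            # does nothing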
@@ -824,7 +824,7 @@ async def __call__(
                 predict_result,
                 span_predict,
                 detached_ctx,
-                get_cleanup_fn=get_defer_fn,
+                cleanup_fn=get_defer_fn(),
             )

             if isinstance(predict_result, starlette.responses.Response):
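Tying the two hunks together: in the original flow, the factory get_defer_fn is passed through and only invoked on the streaming branch, so non-streaming requests still get the automatic release, while streaming hands the release off to a background task. The sketch below illustrates that call pattern; handle_request, _stream_with_cleanup, and the streaming flag are hypothetical stand-ins rather than Truss names, and deferred_semaphore refers to the earlier sketch.

import asyncio
from typing import AsyncGenerator, Callable, Optional

async def _stream_with_cleanup(
    generator: AsyncGenerator[bytes, None],
    cleanup_fn: Callable[[], None],
) -> None:
    # Hypothetical stand-in for _stream_with_background_task: the deferred
    # semaphore release runs only after streaming has finished.
    try:
        async for _chunk in generator:
            pass  # forward each chunk to the client here
    finally:
        cleanup_fn()

async def handle_request(
    semaphore: asyncio.Semaphore,
    generator: AsyncGenerator[bytes, None],
    streaming: bool,
) -> Optional[asyncio.Task]:
    async with deferred_semaphore(semaphore) as get_defer_fn:
        if streaming:
            # Invoking the factory defers the release: the background task
            # keeps the semaphore held after the `async with` block exits.
            return asyncio.create_task(
                _stream_with_cleanup(generator, cleanup_fn=get_defer_fn())
            )
        # Non-streaming path: the factory is never invoked, so the context
        # manager releases the semaphore automatically on exit.
        async for _chunk in generator:
            pass
        return None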