Update about.rst #13

Merged
merged 1 commit into from Dec 4, 2023
6 changes: 3 additions & 3 deletions docs/source/about.rst
@@ -12,8 +12,8 @@ An API for transparent science on black-box AI
 models, but they do not let you see model internals.
 
 The nnsight library is different: it gives you full access to all the neural network internals.
-When used together with a remote service like the National Deep Inference Facility (NDIF),
-it lets you run expriments on huge open models easily, with full transparent access.
+When used together with a remote service like the `National Deep Inference Facility <https://ndif.us/>`_ (NDIF),
+it lets you run experiments on huge open models easily, with full transparent access.
 The nnsight library is also terrific for studying smaller local models.
 
 .. figure:: _static/images/remote_execution.png
@@ -25,7 +25,7 @@ How you use nnsight
 
 Nnsight is built on pytorch.
 
-Running inference on a huge remote model with nnsight is very similar to running a neural network locally on your own workstataion. In fact, with nnsight, the same code for running experiments locally on small models can also be used on large models, just by changing a few arguments.
+Running inference on a huge remote model with nnsight is very similar to running a neural network locally on your own workstation. In fact, with nnsight, the same code for running experiments locally on small models can also be used on large models, just by changing a few arguments.
 
 The difference between nnsight and normal inference is that when you use nnsight, you do not treat the model as an opaque black box.
 Instead, you set up a python ``with`` context that enables you to get direct access to model internals while the neural network runs.
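
For readers new to the library, here is a minimal sketch of the ``with`` context described in the changed text. It assumes the nnsight ``LanguageModel`` wrapper, a GPT-2 checkpoint, and the GPT-2 module path ``transformer.h[5]``; the ``remote`` flag name is also an assumption and may differ between nnsight releases::

    from nnsight import LanguageModel

    # Assumptions: the LanguageModel wrapper, a GPT-2 checkpoint, and the
    # GPT-2 module path transformer.h[5]; adjust for other architectures.
    model = LanguageModel("openai-community/gpt2", device_map="auto")

    # The same trace can be sent to a remote backend such as NDIF by passing
    # remote=True (flag name assumed from current nnsight releases).
    with model.trace("The Eiffel Tower is in the city of"):
        # Grab the hidden states leaving transformer block 5 and the final
        # logits; .save() keeps them available after the context exits.
        hidden = model.transformer.h[5].output[0].save()
        logits = model.lm_head.output.save()

    # Depending on the nnsight version, the saved proxies behave like tensors
    # directly or expose the underlying tensor via .value.
    print(hidden.shape)
    print(logits.argmax(dim=-1))

The same block, with only the model name and a remote argument changed, is what the docs describe as scaling from a small local model to a large hosted one.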