Develop a canonical form for scenarios for notebook testing #39

Open
cmickeyb opened this issue Apr 29, 2024 · 1 comment
cmickeyb (Contributor) commented Apr 29, 2024

Automating notebook testing will require significant effort. Until that becomes necessary, we need a more systematic way of testing the notebooks. My proposal is that we develop a set of "scenarios", each of which tests one or more aspects of the notebooks. Each scenario describes what we expect to work. A scenario would be described by a markdown file in the directory /docs/tests with a name of the form {test_order}_{test_name}.md, where test_order is a number that increases with the complexity of the tests and test_name is a short name for the test. Each document would contain sections for a short description of the test, the expected setup (e.g. keys, services and service groups), and then a series of textually described tests. Each test must include a description of the expected outcome.
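To make the convention concrete, a skeleton for such a file (say, /docs/tests/01_verify_token_transfer.md; the file name and section headings here are only illustrative) might look like this:

```markdown
# Verify Token Transfer

Short description of what the scenario exercises and why.

## Setup

Keys, services, and service groups that must exist before the tests run.

## Tests

1. First test, described textually, ending with its expected outcome.
2. Second test, described textually, ending with its expected outcome.
```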

This is an example of what I've been developing as a scenario for multi-user testing.

Verify Token Transfer

Use token ownership transfer to test multi-user capabilities.

Setup

Prepare the multiuser Jupyter docker test. To start the services, run make -C docker test_multiuser_jupyter from the PDO contracts root directory. When the services start, the two Jupyter containers (container_1 and container_2) will show their access URLs. Open each URL in a separate browser window. For the rest of this discussion, we will refer to the windows as container_1 (for the user) and container_2 (for the issuer).

Container_1: Create the keys for the token user. Open the notebook documents/getting_started.ipynb and run all cells. Navigate to "Create Keys" and create a new key pair for the identity "blue_user". Refresh the list of public keys and click on the "blue_user" key file to download the "blue_user" public key.

Container_2: Create the keys for the token issuer. Open the notebook documents/getting_started.ipynb and run all cells. Navigate to "Create Keys" and create a new key pair for the identity "blue_issuer". Refresh the public and private key lists to verify that the keys have been created.

Container_2: Import the "blue_user" public key into the issuer. Navigate to the "Create Keys" section and import the "blue_user" public key that was downloaded earlier. Refresh the public key list to verify that the key was uploaded successfully.

Container_2: Create the service groups. Navigate to "Create Service Group" and create three new service groups, one each for the enclave, provisioning and storage services (select each of the tabs separately). Select the first three services of each type for each group. Call each of the groups "issuer_group". For the storage service group, set the required duration to 3600. Refresh each of the group lists to verify that the groups have been created.

Test

Container_2: Create the token issuer and the tokens. From the launcher, open the notebook exchange/factories/token_issuer.ipynb. Run all cells. Respond to the prompts as follows:

  • Identity: blue_issuer
  • Name of the token class: blue_widget
  • Token description: my token
  • Count: 5
  • Service group: issuer_group

This will create a link to a token issuer notebook. Click on the link to open the token issuer notebook. Run all cells. You can verify that the tokens were correctly created in the "Mint the Tokens" section. The last line in the output should say "5 tokens minted".

In the "Create Token Notebooks" section, edit the code block and replace %%skip True with %%skip False. Then run the cell to generate a new notebook for each of the tokens. There should be five links in the output section, each labelled with the token's name.

Click on "Token token_1" to open the token_1 notebook. Run all cells. To verify that the notebook is correct, in the "Invoke Echo Operation" section, change %%skip True to %%skip False and run the cell. The output should contain the string parameter from the invocation.

In the "Transfer Ownership" section, change %%skip True to %%skip False and change user2 to blue_user. Run the cell. The output should indicate that the token has been transferred.

In the "Share Contract" section, change %%skip True to %%skip False and run the cell. This should create a "Download Contract" button. Click the button and save the contract collection file.

Shell: Move the contract collection file. Move the contract collection file downloaded in the previous step into the docker/xfer/client directory where you are running the contracts docker containers. The name of the file will be something like blue_widget_<ID>.zip, where ID is a unique identifier for the contract.
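For example, assuming the browser saved the file to ~/Downloads and the PDO contracts repository is checked out at ~/pdo-contracts (both paths are assumptions; adjust them for your environment):

```bash
# Hypothetical paths; <ID> stands for the identifier in the downloaded file name.
mv ~/Downloads/blue_widget_<ID>.zip ~/pdo-contracts/docker/xfer/client/
```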

Container_1: Import and run the token contract. From the launcher, open the notebook exchange/factories/token_import.ipynb. Run all cells. Respond to the prompts as follows:

  • Identity: blue_user
  • Name of the token class: blue_widget
  • Name of the token: token_1
  • Name of the token import file: /project/pdo/xfer/client/blue_widget_<ID>.zip

This will create a link to a token notebook. Click on the link to open the notebook. Run all cells. To verify that the notebook is correct, in the "Invoke Echo Operation" section, change %%skip True to %%skip False and run the cell. The output should contain the string parameter from the invocation.

@cmickeyb cmickeyb added the testing Support additional testing label Apr 29, 2024
@cmickeyb cmickeyb self-assigned this Apr 29, 2024
g2flyer commented Apr 29, 2024

Having scenarios would be very useful. I wonder, though, whether we could make them a bit less verbose and more checklist-like? I think that would probably make (repeated) testing a bit faster and less error-prone.
