Merge pull request #782 from reportportal/develop
Release 24.2 docs
  • Loading branch information
AmsterGet authored Sep 30, 2024
2 parents 0f801c6 + 2a1d156 commit 3cf34e5
Showing 167 changed files with 1,687 additions and 350 deletions.
140 changes: 94 additions & 46 deletions docs/analysis/UniqueErrorAnalysis.mdx
sidebar_label: Unique Error Analysis

# Unique Error Analysis

You can analyze test failures from two perspectives: quantitative (How many tests have failed?) and qualitative (Why did they fail?). For example, if 1,000 test cases fail, then:

1. They could all fail for the same reason.

2. They could fail for various reasons.

While failed tests show you what is broken, the Unique Error analysis functionality, introduced in version 5.7, shows you why it broke. Moreover, the system automatically groups tests by identical errors: when you expand an error, you see a list of tests where it occurred.

<MediaViewer src="https://youtu.be/mOssRH7ZxUM" alt="Video of Unique Error functionality" type="video" />
:::important
The main advantage of this solution is that a list of all unique errors of the launch is presented in one place.
:::

Thanks to the Unique Error analysis functionality, ReportPortal provides the following benefits:

1. A list of unique errors for the launch, with grouping.
2. Easier test results analysis.
3. Machine learning (ML) suggestions for groups of tests.
4. The [‘New Errors’](/quality-gates/QualityRulesConfiguration/#new-errors-in-the-run) Quality Gates rule.

## How to run Unique Error analysis

You can run Unique Error analysis automatically or manually.

To run Unique Error analysis **automatically**:

1. Go to the Project Settings.
2. Open the Analyzer tab.
3. Open the 'Unique Errors' settings.
4. Check the 'Auto-Unique Error' checkbox.

Now, automatic Unique Error analysis will start after a launch finishes.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError1.png')} alt="Automatic Unique Error analysis is ON" />

5. Run a launch.
6. After the launch finishes, open any item level in the launch.
7. Click the 'Unique errors' tab.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError2.png')} alt="Unique Error analysis on Suite level" />

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError3.png')} alt="Unique Error analysis on Test level" />

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError4.png')} alt="Unique Error analysis on Step level" />

:::note
Automatic Unique Error analysis is activated by default.
:::

If automatic Unique Error analysis is turned off, there are two ways to run it manually:

1. Run Unique Error analysis manually from any item level.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError5.png')} alt="Run Unique error analysis manually from item level" />

2. Run Unique Error analysis from the menu next to a particular launch.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError6.png')} alt="Run Unique Error analysis for the launch manually" />

After Unique Error analysis is finished, open any item level in the launch and click the ‘Unique errors’ tab.

## How to see Unique Errors

Finally, you can see the list of Unique Errors.

The tab shows a list of unique error logs along with the number of failed tests related to each. The number of matches per unique error is displayed in the 'MATCHED TESTS' column. By default, unique errors are sorted starting from the one with the highest number of failed tests; clicking the column name toggles the sort order.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError7.png')} alt="MATCHED TESTS column" />

You can expand a unique error to see which tests share it, which can give you a hint during error analysis and defect assignment. Groups are formed from unique error logs; similar small error logs may be merged and displayed as one error.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError8.png')} alt="Expanding unique error" />

## How to get ML suggestions for the group of tests

In addition, you can get ML suggestions for a group of tests. This significantly speeds up analyzing failed tests and assigning defect types.

To do that:

1. Open any item level in the launch.
2. Click the 'Unique Errors' tab.
3. Expand a cluster with a unique error.
4. Select some failed tests.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError9.png')} alt="Some failed tests" />

5. Open the 'Make decision' modal.
6. Check the Analyzer suggestions.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError10.png')} alt="Analyzer suggestions" />

<MediaViewer src="https://youtu.be/fVdpuDWI3vU" alt="Video of Unique Error Analysis in our test results reporting service" type="video" />

## 'Include/exclude numbers' settings

There are 2 settings to decide whether numbers in error logs have significant value for analysis or not:

1. Include numbers to analyzed logs.

2. Exclude numbers from analyzed logs.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError11.png')} alt="Include/exclude numbers to analyzed logs" />

Let’s consider some examples with the same Unique Errors but with different include/exclude numbers settings.

We have two errors with identical text; the only difference is a numeric value in the first error.

1. An example with the 'Include numbers to analyzed logs' setting. The error with the numeric value is displayed:

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError12.png')} alt="Example with 'Include numbers to analyzed logs' setting" />

2. An example with the 'Exclude numbers from analyzed logs' setting. As you can see, the error with the numeric value is not displayed:

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError13.png')} alt="Example with 'Exclude numbers from analyzed logs' setting" />
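Conceptually, excluding numbers normalizes digits out of error messages before grouping, so logs that differ only in numeric values fall into the same cluster. The sketch below illustrates that idea in Python; it is not ReportPortal's actual implementation, and the function names are hypothetical:

```python
import re
from collections import defaultdict

def normalize(message: str, exclude_numbers: bool) -> str:
    """Build a grouping key: optionally mask all digit runs."""
    return re.sub(r"\d+", "<NUM>", message) if exclude_numbers else message

def group_errors(messages, exclude_numbers=True):
    """Group error messages by their normalized form."""
    groups = defaultdict(list)
    for m in messages:
        groups[normalize(m, exclude_numbers)].append(m)
    return dict(groups)

logs = [
    "AssertionError: expected 200 but got 500",
    "AssertionError: expected 200 but got 404",
    "TimeoutError: page did not load",
]
# With numbers excluded, the two assertion errors merge into one group.
print(len(group_errors(logs, exclude_numbers=True)))   # 2
print(len(group_errors(logs, exclude_numbers=False)))  # 3
```

With numbers excluded, both assertion errors share the key `AssertionError: expected <NUM> but got <NUM>`, which mirrors how the second screenshot shows only one merged error.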

## How to extend Unique Error analysis

Unique Error analysis functionality can be extended by the [ReportPortal premium feature](https://reportportal.io/pricing/saas) [Quality Gates](/quality-gates/).

If you have the Quality Gate plugin installed, you can create a Quality Gate for the launch with the 'New Errors' rule. This rule identifies new unique errors in the current launch by comparing it to another specified launch.

If there is a Quality Gate with the 'New Errors' rule for the launch, the 'ALL UNIQUE ERRORS' dropdown is visible and enabled. The available options are: All, New Errors, and Known Errors.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError14.png')} alt="Enabled ALL UNIQUE ERRORS dropdown" />

If there is no Quality Gate with the 'New Errors' rule for the launch, the 'ALL UNIQUE ERRORS' dropdown is visible but disabled. Hovering over the disabled dropdown shows the tooltip: 'No Quality Gate with 'New Error' rule for the launch.'

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError15.png')} alt="Disabled ALL UNIQUE ERRORS dropdown" />

If the Quality Gate plugin is not installed or is disabled, the 'ALL UNIQUE ERRORS' dropdown is hidden.

<MediaViewer src={require('./img/UniqueErrorAnalysis/UniqueError16.png')} alt="Invisible ALL UNIQUE ERRORS dropdown" />
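The three dropdown states described above (enabled, disabled, hidden) can be summarized as a small state function. This is an illustrative sketch only, not actual ReportPortal UI code; the function name is hypothetical:

```python
def unique_errors_dropdown(plugin_active: bool, has_new_errors_gate: bool) -> str:
    """Return the state of the 'ALL UNIQUE ERRORS' dropdown.

    plugin_active       -- Quality Gate plugin installed and enabled
    has_new_errors_gate -- a Quality Gate with the 'New Errors' rule
                           exists for the launch
    """
    if not plugin_active:
        return "hidden"            # dropdown is not rendered at all
    if has_new_errors_gate:
        return "enabled"           # options: All, New Errors, Known Errors
    return "disabled"              # visible, with an explanatory tooltip

print(unique_errors_dropdown(True, True))   # enabled
print(unique_errors_dropdown(True, False))  # disabled
print(unique_errors_dropdown(False, True))  # hidden
```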
Binary file modified docs/analysis/img/UniqueErrorAnalysis/UniqueError1.png
Binary file modified docs/analysis/img/UniqueErrorAnalysis/UniqueError10.png
Binary file modified docs/analysis/img/UniqueErrorAnalysis/UniqueError2.png
Binary file modified docs/analysis/img/UniqueErrorAnalysis/UniqueError3.png
Binary file modified docs/analysis/img/UniqueErrorAnalysis/UniqueError4.png
Binary file modified docs/analysis/img/UniqueErrorAnalysis/UniqueError5.png
Binary file modified docs/analysis/img/UniqueErrorAnalysis/UniqueError6.png
Binary file modified docs/analysis/img/UniqueErrorAnalysis/UniqueError7.png
Binary file modified docs/analysis/img/UniqueErrorAnalysis/UniqueError8.png
Binary file modified docs/analysis/img/UniqueErrorAnalysis/UniqueError9.png
4 changes: 4 additions & 0 deletions docs/configuration/IntegrationViaPlugin.mdx
Users can enhance ReportPortal by adding additional integrations with:
* [Rally](/plugins/Rally)
* [Azure DevOps](/plugins/AzureDevOps)
* [GitLab](/plugins/GitLab)
* [Monday](/plugins/Monday)
* [Sauce Labs](/plugins/SauceLabs)
* [E-mail server](/plugins/EmailServer)
* [Slack](/plugins/Slack)
* [LDAP](/plugins/authorization/LDAP)
* [Active Directory](/plugins/authorization/ActiveDirectory)
* [Okta SAML](/plugins/authorization/SamlProviders/OktaSaml)
* [Azure SAML](/plugins/authorization/SamlProviders/AzureSaml)
* [JUnit](/plugins/JUnit)
* [RobotFramework](/plugins/RobotFramework)

<MediaViewer src={require('./img/Integrations.png')} alt="Integrations with our test automation results dashboard" />

119 changes: 119 additions & 0 deletions docs/configuration/LaunchesRetentionPolicy.mdx
# Launches retention policy

Retention policies play a pivotal role in ensuring that companies maintain control over their data. In ReportPortal, the retention policy on the General tab of Project Settings defines how long regular launches and their descendants (logs, attachments) are kept and when they are deleted.

You can also mark some launches as important, meaning they are generally not subject to deletion by the time-based retention policy. These launches can only be deleted manually or if [storage is overused](/configuration/LaunchesRetentionPolicy#important-launch-deletion-on-full-storage) (in this case new launches will displace old ones, even if they are marked as important).

You can mark a launch as important either through the application's UI or during test reporting using special system attributes.

## Mark launch as important on UI

1. Go to the 'Launches' page.
2. Open the burger menu for the launch.
3. Click on the 'Mark as Important' option on the menu.
4. Confirm your decision by clicking the 'Mark as Important' button on the 'Mark as Important' modal window.

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy1.png')} alt="Mark launch as important on UI in our test automation dashboard" />

:::note
An Administrator can mark any launch as important.<br />
A Project Manager can mark any launch within their own project as important.<br />
Customers and Members can mark only their own launches as important.
:::

## Unmark important launches on UI

1. Go to the 'Launches' page.
2. Open the burger menu for the launch.
3. Click on the 'Unmark as Important' option in the menu.
4. Click on the 'Unmark' button on the 'Unmark as Important' modal window.

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy2.png')} alt="Unmark important launch on UI" />

:::note
An Administrator can unmark any important launch.<br />
A Project Manager can unmark any important launch within their own project.<br />
Customers and Members can unmark only their own important launches.
:::

<MediaViewer src="https://youtu.be/XMSEe3VAOBY" alt="Mark launch as important in our test report dashboard" type="video" />

## Report launch as important

You can also mark a launch as important during reporting using the following system attributes:

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy3.png')} alt="Report launch as important" />

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy4.png')} alt="Launch which is reported as important" />

If a launch is reported with both system attributes **retentionPolicy : regular** and **retentionPolicy : important**, the **retentionPolicy : important** attribute takes precedence when defining the field value.

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy5.png')} alt="Launch with retentionPolicy : regular and retentionPolicy : important attributes" />

If a launch is reported without the **retentionPolicy** system attribute, or with an arbitrary value such as **retentionPolicy : qwerty**, it behaves as a launch with the system attribute **retentionPolicy : regular**.

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy6.png')} alt="Launch without retentionPolicy attribute" />

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy7.png')} alt="Launch with random retentionPolicy attribute" />

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy8.png')} alt="Launch with retentionPolicy : regular attribute" />
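The attribute rules above (important wins over regular; missing or unrecognized values fall back to regular) can be sketched as a small resolution function. This is an illustrative sketch of the documented behavior, not ReportPortal's implementation; the function name is hypothetical:

```python
def resolve_retention_policy(attributes) -> str:
    """Resolve a launch's retention policy from reported attributes.

    attributes -- iterable of (key, value) pairs reported with the launch
    """
    values = {value for key, value in attributes if key == "retentionPolicy"}
    # 'important' takes precedence; anything else falls back to 'regular'
    return "important" if "important" in values else "regular"

print(resolve_retention_policy(
    [("retentionPolicy", "regular"), ("retentionPolicy", "important")]
))  # important
print(resolve_retention_policy([("retentionPolicy", "qwerty")]))  # regular
print(resolve_retention_policy([]))  # regular
```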

<MediaViewer src="https://youtu.be/O1Ojy63M1Pw" alt="Report launch as important in our test reporting tool" type="video" />

## Delete important launch on UI

1. Go to the 'Launches' page.
2. Select important launch.
3. Click Actions -> Delete.
4. Click the 'Delete Important Launch' button on the 'Delete Launch' modal window.

OR

1. Go to the 'Launches' page.
2. Open the burger menu for the important launch.
3. Click on the 'Delete' option on the menu.
4. Click the 'Delete Important Launch' button on the 'Delete Launch' modal window.

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy9.png')} alt="Delete important launch on UI" />

:::note
An Administrator can delete important launches of all users.<br />
A Project Manager can delete the launches of users within their own project.<br />
Customers and Members can delete only their own launches.
:::

## Bulk delete important launches

1. Go to the 'Launches' page.
2. Select at least two important launches.
3. Click Actions -> Delete.
4. Click the 'Delete Important Launches' button on the 'Delete Launches' modal window.

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy10.png')} alt="Bulk delete important launches" />

:::note
An Administrator can bulk delete important launches of all users.<br />
A Project Manager can bulk delete the launches of users within their own project.<br />
Customers and Members can bulk delete only their own launches.
:::

## Bulk delete important and regular launches

1. Go to the 'Launches' page.
2. Select at least one important launch and one regular launch.
3. Click Actions -> Delete.
4. Click the 'Delete with Important Launches' button on the 'Delete Launches' modal window.

<MediaViewer src={require('./img/launches-retention-policy/LaunchesRetentionPolicy11.png')} alt="Bulk delete important and regular launches" />

<MediaViewer src="https://youtu.be/Ha6G82dPKDg" alt="Delete important launches" type="video" />

## Important launch deletion on full storage

If all your storage is used and it contains both regular and important launches, then as soon as a new launch arrives, regular launches and all their related descendants will be deleted, starting from the oldest. The amount of regular-launch data deleted is proportional to the size of the new launches being reported.

If no regular launches remain and all your storage is occupied by important launches, then important launches and their descendants will be deleted, starting from the oldest, until the storage no longer exceeds capacity. Again, the amount of deleted data is proportional to the size of the new launches being reported.

The same logic applies when attachments overuse the storage: attachments of regular launches are deleted first. If no attachments from regular launches remain, attachments from important launches are deleted, starting from the oldest, until the storage no longer exceeds capacity.
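The eviction order described above (regular launches first, then important ones, oldest first in each group, until the storage fits) can be sketched as follows. This is an illustrative model of the documented behavior, not ReportPortal's implementation; names and the size units are hypothetical, and the proportional-weight nuance is omitted:

```python
def evict_until_fits(launches, capacity):
    """Return names of launches that would be deleted, in deletion order.

    launches -- oldest-first list of dicts with 'name', 'important', 'size'
    capacity -- total storage available
    """
    used = sum(l["size"] for l in launches)
    # regular launches (important=False) sort before important ones;
    # ties keep the oldest-first order via the index
    order = sorted(range(len(launches)),
                   key=lambda i: (launches[i]["important"], i))
    deleted = []
    for i in order:
        if used <= capacity:
            break
        used -= launches[i]["size"]
        deleted.append(launches[i]["name"])
    return deleted

# launches listed oldest-first
launches = [
    {"name": "old-important", "important": True, "size": 4},
    {"name": "old-regular", "important": False, "size": 3},
    {"name": "new-regular", "important": False, "size": 2},
]
print(evict_until_fits(launches, capacity=5))  # ['old-regular', 'new-regular']
```

Note that the oldest launch survives because it is marked as important: both regular launches are evicted first, and by then the storage fits.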

Thus, storing and managing data within ReportPortal is now even more convenient. The updates to the launches retention policy introduce the option to mark launches as important, providing greater flexibility and ensuring that crucial data is kept as long as needed.
