
Commit

Merge pull request #2107 from sdgilley/sdg-rebrand-azure
[BULK UPDATE] - add Azure to AI Foundry
JamesJBarnett authored Dec 23, 2024
2 parents 8a7bf4c + f8428c1 commit 6df060c
Showing 129 changed files with 359 additions and 359 deletions.
6 changes: 3 additions & 3 deletions articles/ai-services/agents/concepts/tracing.md
@@ -25,9 +25,9 @@ Tracing solves this by allowing you to clearly see the inputs and outputs of eac

Tracing lets you analyze your agent's performance and behavior by using OpenTelemetry and adding an Application Insights resource to your Azure AI Foundry project.

-To add an Application Insights resource, navigate to the **Tracing** tab in the [AI Foundry portal](https://ai.azure.com/), and create a new resource if you don't already have one.
+To add an Application Insights resource, navigate to the **Tracing** tab in the [Azure AI Foundry portal](https://ai.azure.com/), and create a new resource if you don't already have one.

-:::image type="content" source="../media/ai-foundry-tracing.png" alt-text="A screenshot of the tracing screen in the AI Foundry portal." lightbox="../media/ai-foundry-tracing.png":::
+:::image type="content" source="../media/ai-foundry-tracing.png" alt-text="A screenshot of the tracing screen in the Azure AI Foundry portal." lightbox="../media/ai-foundry-tracing.png":::

Once created, you can get an Application Insights connection string, configure your agents, and observe the full execution path of your agent through Azure Monitor. Typically you want to enable tracing before you create an agent.

@@ -46,7 +46,7 @@ You will also need an exporter to send results to your observability backend. Yo
pip install opentelemetry-exporter-otlp
```

-Once you have the packages installed, you can use one of the following Python samples to implement tracing with your agents. Samples that use console tracing display the results locally in the console. Samples that use Azure Monitor send the traces to Azure Monitor, which you can view in the [AI Foundry portal](https://ai.azure.com/) on the **Tracing** tab in the left navigation menu.
+Once you have the packages installed, you can use one of the following Python samples to implement tracing with your agents. Samples that use console tracing display the results locally in the console. Samples that use Azure Monitor send the traces to Azure Monitor, which you can view in the [Azure AI Foundry portal](https://ai.azure.com/) on the **Tracing** tab in the left navigation menu.

> [!NOTE]
> There is a known bug in the agents tracing functionality. The bug causes the agent's function tool call related information (function names and parameter values, which could contain sensitive information) to be included in the traces even when content recording is not enabled.
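This hunk refers to Python samples for console tracing and Azure Monitor without showing them. Here is a minimal sketch of the console-tracing idea using plain OpenTelemetry rather than the agents SDK's own helpers; the span name and attribute are illustrative assumptions, not values from the samples the article links to.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Send all spans to the console so the execution path is visible locally.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("agent-run") as span:  # illustrative span name
    span.set_attribute("gen_ai.system", "azure_openai")  # illustrative attribute
    # ... invoke your agent here; nested spans are printed to the console
```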
@@ -49,7 +49,7 @@ from azure.ai.projects.models import FilePurpose
from azure.identity import DefaultAzureCredential
from pathlib import Path

-# Create an Azure AI Client from a connection string, copied from your AI Foundry project.
+# Create an Azure AI Client from a connection string, copied from your Azure AI Foundry project.
# At the moment, it should be in the format "<HostName>;<AzureSubscriptionId>;<ResourceGroup>;<HubName>"
# Customer needs to login to Azure subscription via Azure CLI and set the environment variables
project_client = AIProjectClient.from_connection_string(
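The hunk is cut off mid-call above. A hedged sketch of how such a call is typically completed follows; the environment variable name is an assumption, and the keyword arguments reflect the preview `azure-ai-projects` package rather than anything shown in this commit.

```python
import os

from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

# Assumed completion: the connection string, in the
# "<HostName>;<AzureSubscriptionId>;<ResourceGroup>;<HubName>" format, is read from an env var.
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)
```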
@@ -8,5 +8,5 @@ ms.date: 12/11/2024
---

> [!TIP]
-> You can also find your connection string in the **overview** for your project in the [AI Foundry portal](https://ai.azure.com/), under **Project details** > **Project connection string**.
+> You can also find your connection string in the **overview** for your project in the [Azure AI Foundry portal](https://ai.azure.com/), under **Project details** > **Project connection string**.
> :::image type="content" source="../media/quickstart/portal-connection-string.png" alt-text="A screenshot showing the connection string in the Azure AI Foundry portal." lightbox="../media/quickstart/portal-connection-string.png":::
@@ -11,4 +11,4 @@ ms.author: eur

> [!div class="checklist"]
> - Azure subscription - [Create one for free](https://azure.microsoft.com/free/cognitive-services).
-> - Some Azure AI services features are free to try in the Azure AI Foundry portal. For access to all capabilities described in this article, you need to [connect AI services in AI Foundry](../../../ai-studio/ai-services/how-to/connect-ai-services.md).
+> - Some Azure AI services features are free to try in the Azure AI Foundry portal. For access to all capabilities described in this article, you need to [connect AI services in Azure AI Foundry](../../../ai-studio/ai-services/how-to/connect-ai-services.md).
@@ -11,4 +11,4 @@ ms.custom: include, ignite-2024
---

> [!TIP]
-> You can use [**AI Foundry**](../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code.
+> You can use [**Azure AI Foundry**](../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code.
@@ -9,4 +9,4 @@ ms.custom: include, ignite-2024
---

> [!TIP]
-> You can use [**AI Foundry**](../../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code.
+> You can use [**Azure AI Foundry**](../../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code.
@@ -25,7 +25,7 @@ The Conversational PII detection models (both version `2024-11-01-preview` and `
As of June 2024, we provide General Availability support for the Conversational PII service (English-language only). Customers can now redact transcripts, chats, and other text written in a conversational style (that is, text with “um”s, “ah”s, multiple speakers, and words spelled out for clarity) with better confidence in AI quality, Azure SLA support, production environment support, and enterprise-grade security.

> [!TIP]
-> Try out PII detection [in AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md)
+> Try out PII detection [in Azure AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new Azure AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md)
* [**Quickstarts**](quickstart.md) are getting-started instructions to guide you through making requests to the service.
* [**How-to guides**](how-to-call.md) contain instructions for using the service in more specific or customized ways.
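As a programmatic counterpart to the portal tip above, here is a hedged sketch using the text PII API from the `azure-ai-textanalytics` package; note this is the text API, not the conversational PII endpoint the hunk describes, and the endpoint, key, and sample text are placeholders.

```python
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-language-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                               # placeholder
)

documents = ["Um, hi, this is Casey. You can reach me at 555-0100."]
results = client.recognize_pii_entities(documents, language="en")

for doc in results:
    if not doc.is_error:
        print(doc.redacted_text)  # input text with detected PII masked
        for entity in doc.entities:
            print(entity.text, entity.category, entity.confidence_score)
```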
@@ -11,4 +11,4 @@ ms.custom: include, build-2024, ignite-2024
---

> [!TIP]
-> You can use [**AI Foundry**](../../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code.
+> You can use [**Azure AI Foundry**](../../../../ai-studio/what-is-ai-studio.md) to try summarization without needing to write code.
@@ -22,7 +22,7 @@ Use this article to learn more about this feature, and how to use it in your app
Out of the box, the service provides summarization solutions for three types of content: plain text, conversations, and native documents. Text summarization accepts only plain text blocks; conversation summarization accepts conversational input, including various speech audio signals, so the model can effectively segment and summarize; and native document summarization works directly on documents in their native formats, such as Word and PDF.

> [!TIP]
-> Try out Summarization [in AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md) in order to use this service.
+> Try out Summarization [in Azure AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new Azure AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md) in order to use this service.
# [Text summarization](#tab/text-summarization)

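Here is a minimal sketch of the text summarization path described above (the extractive variant) using the `azure-ai-textanalytics` package; it assumes version 5.3.0 or later, and the endpoint, key, and sample text are placeholders.

```python
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-language-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                               # placeholder
)

documents = [
    "Azure AI Language offers summarization for plain text, conversations, and native documents. "
    "Extractive summarization returns the most important sentences from the input."
]

# Long-running operation: poll until the extractive summary is ready.
poller = client.begin_extract_summary(documents, max_sentence_count=2)
for result in poller.result():
    if not result.is_error:
        print(" ".join(sentence.text for sentence in result.sentences))
```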
@@ -19,7 +19,7 @@ ms.custom: language-service-health, ignite-2024
Text Analytics for health is one of the prebuilt features offered by [Azure AI Language](../overview.md). It is a cloud-based API service that applies machine-learning intelligence to extract and label relevant medical information from a variety of unstructured texts such as doctor's notes, discharge summaries, clinical documents, and electronic health records.

> [!TIP]
-> Try out Text Analytics for health [in AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md) in order to use this service.
+> Try out Text Analytics for health [in Azure AI Foundry portal](https://ai.azure.com/explore/language), where you can [utilize a currently existing Language Studio resource or create a new Azure AI Foundry resource](../../../ai-studio/ai-services/connect-ai-services.md) in order to use this service.
This documentation contains the following types of articles:
* The [**quickstart article**](quickstart.md) provides a short tutorial that guides you with making your first request to the service.
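Similarly, a hedged sketch of calling Text Analytics for health from the `azure-ai-textanalytics` package; the endpoint, key, and sample note are placeholders.

```python
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-language-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                               # placeholder
)

documents = ["Patient reports taking 100 mg of ibuprofen twice daily for knee pain."]

poller = client.begin_analyze_healthcare_entities(documents)
for doc in poller.result():
    if not doc.is_error:
        for entity in doc.entities:
            print(entity.text, entity.category, entity.confidence_score)
```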
2 changes: 1 addition & 1 deletion articles/ai-services/openai/assistants-quickstart.md
@@ -20,7 +20,7 @@ Azure OpenAI Assistants (Preview) allows you to create AI assistants tailored to

::: zone pivot="ai-foundry-portal"

-[!INCLUDE [AI Foundry portal](includes/assistants-ai-studio.md)]
+[!INCLUDE [Azure AI Foundry portal](includes/assistants-ai-studio.md)]

::: zone-end

@@ -54,7 +54,7 @@ To help with simplifying the sizing effort, the following table outlines the TPM
|Max Output TPM per PTU| 833|12,333|
|Latency Target Value |25 Tokens Per Second|33 Tokens Per Second|

-For a full list see the [AOAI Foundry calculator](https://oai.azure.com/portal/calculator).
+For a full list see the [AOAI in Azure AI Foundry calculator](https://oai.azure.com/portal/calculator).
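To make the partial table above concrete, here is a back-of-the-envelope sizing sketch. The workload figure is invented, and the 833 value is the "Max Output TPM per PTU" figure from the first column of the rows shown (the table's header row isn't visible in this hunk), so treat the result as illustrative only.

```python
import math

required_output_tpm = 25_000   # hypothetical peak output tokens per minute for a workload
output_tpm_per_ptu = 833       # "Max Output TPM per PTU" from the table row above

# Round up: you can't deploy a fractional PTU.
ptus_needed = math.ceil(required_output_tpm / output_tpm_per_ptu)
print(ptus_needed)  # 31
```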


> [!NOTE]
@@ -116,7 +116,7 @@ Azure OpenAI is a highly sought-after service where customer demand might exceed

#### Regional capacity guidance

-To find the capacity needed for your deployments, use the capacity API or the AI Foundry deployment experience to provide real-time information on capacity availability.
+To find the capacity needed for your deployments, use the capacity API or the Azure AI Foundry deployment experience to provide real-time information on capacity availability.

In Azure AI Foundry, the deployment experience identifies when a region lacks the capacity needed to deploy the model. This looks at the desired model, version, and number of PTUs. If capacity is unavailable, the experience directs users to select an alternative region.

@@ -128,7 +128,7 @@ If an acceptable region isn't available to support the desire model, version and

- Attempt the deployment with a smaller number of PTUs.
- Attempt the deployment at a different time. Capacity availability changes dynamically based on customer demand and more capacity might become available later.
-- Ensure that quota is available in all acceptable regions. The [model capacities API](/rest/api/aiservices/accountmanagement/model-capacities/list?view=rest-aiservices-accountmanagement-2024-04-01-preview&tabs=HTTP&preserve-view=true) and AI Foundry experience consider quota availability in returning alternative regions for creating a deployment.
+- Ensure that quota is available in all acceptable regions. The [model capacities API](/rest/api/aiservices/accountmanagement/model-capacities/list?view=rest-aiservices-accountmanagement-2024-04-01-preview&tabs=HTTP&preserve-view=true) and Azure AI Foundry experience consider quota availability in returning alternative regions for creating a deployment.

### Determining the number of PTUs needed for a workload

@@ -35,7 +35,7 @@ Below are examples of recommended system message components you can include to p
The following steps show how to leverage safety system messages in Azure AI Foundry portal.

1. Go to Azure AI Foundry and navigate to Azure OpenAI and the Chat playground.
-:::image type="content" source="../media/navigate-chat-playground.PNG" alt-text="Screenshot of the AI Foundry portal selection.":::
+:::image type="content" source="../media/navigate-chat-playground.PNG" alt-text="Screenshot of the Azure AI Foundry portal selection.":::
1. Navigate to the default safety system messages integrated in the studio.
:::image type="content" source="../media/navigate-system-message.PNG" alt-text="Screenshot of the system message navigation.":::
1. Select the system message(s) that are applicable to your scenario.
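The steps above cover only the portal flow. Here is a hedged sketch of supplying a safety system message programmatically with the `openai` Python package against Azure OpenAI; the deployment name, API version, and message text are assumptions, not the portal's built-in defaults.

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",  # assumed API version
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

safety_system_message = (
    "You must not generate content that may be harmful to someone physically or emotionally. "
    "If the user requests such content, respectfully decline."
)

response = client.chat.completions.create(
    model="gpt-4o",  # your chat deployment name (assumption)
    messages=[
        {"role": "system", "content": safety_system_message},
        {"role": "user", "content": "Hello!"},
    ],
)
print(response.choices[0].message.content)
```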
2 changes: 1 addition & 1 deletion articles/ai-services/openai/gpt-v-quickstart.md
@@ -24,7 +24,7 @@ Get started using GPT-4 Turbo with images with the Azure OpenAI Service.
::: zone pivot="ai-foundry-portal"

-[!INCLUDE [AI Foundry portal quickstart](includes/gpt-v-studio.md)]
+[!INCLUDE [Azure AI Foundry portal quickstart](includes/gpt-v-studio.md)]

::: zone-end

6 changes: 3 additions & 3 deletions articles/ai-services/openai/how-to/batch.md
@@ -82,7 +82,7 @@ The following aren't currently supported:
### Global batch deployment

-In the AI Foundry portal the deployment type will appear as `Global-Batch`.
+In the Azure AI Foundry portal the deployment type will appear as `Global-Batch`.

:::image type="content" source="../media/how-to/global-batch/global-batch.png" alt-text="Screenshot that shows the model deployment dialog in Azure AI Foundry portal with Global-Batch deployment type highlighted." lightbox="../media/how-to/global-batch/global-batch.png":::

@@ -91,7 +91,7 @@ In the AI Foundry portal the deployment type will appear as `Global-Batch`.
::: zone pivot="ai-foundry-portal"

-[!INCLUDE [AI Foundry portal](../includes/batch/batch-studio.md)]
+[!INCLUDE [Azure AI Foundry portal](../includes/batch/batch-studio.md)]

::: zone-end
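The include above pulls in the portal flow for uploading the input file and creating the batch job. A hedged programmatic counterpart with the `openai` Python package against Azure OpenAI follows; the API version is an assumption, and `test.jsonl` is assumed to target your global batch deployment.

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-07-01-preview",  # assumed preview API version for batch
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

# Upload the JSONL input file prepared for the batch job.
batch_file = client.files.create(file=open("test.jsonl", "rb"), purpose="batch")

# Kick off the batch job against the chat completions endpoint.
batch_job = client.batches.create(
    input_file_id=batch_file.id,
    endpoint="/chat/completions",
    completion_window="24h",
)
print(batch_job.id, batch_job.status)
```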

@@ -154,7 +154,7 @@ Yes. Similar to other deployment types, you can create content filters and assoc

### Can I request additional quota?

-Yes, from the quota page in the AI Foundry portal. Default quota allocation can be found in the [quota and limits article](../quotas-limits.md#global-batch-quota).
+Yes, from the quota page in the Azure AI Foundry portal. Default quota allocation can be found in the [quota and limits article](../quotas-limits.md#global-batch-quota).

### What happens if the API doesn't complete my request within the 24 hour time frame?

@@ -271,7 +271,7 @@ So far you have already setup each resource work independently. Next you need to
| `Storage Blob Data Contributor` | Azure OpenAI | Storage Account | Reads from the input container, and writes the preprocessed result to the output container. |
| `Cognitive Services OpenAI Contributor` | Azure AI Search | Azure OpenAI | Custom skill. |
| `Storage Blob Data Reader` | Azure AI Search | Storage Account | Reads document blobs and chunk blobs. |
-| `Reader` | Azure AI Foundry Project | Azure Storage Private Endpoints (Blob & File) | Read search indexes created in blob storage within an AI Foundry Project. |
+| `Reader` | Azure AI Foundry Project | Azure Storage Private Endpoints (Blob & File) | Read search indexes created in blob storage within an Azure AI Foundry Project. |
| `Cognitive Services OpenAI User` | Web app | Azure OpenAI | Inference. |


2 changes: 1 addition & 1 deletion articles/ai-services/openai/how-to/realtime-audio.md
@@ -42,7 +42,7 @@ Before you can use GPT-4o real-time audio, you need:

- An Azure subscription - <a href="https://azure.microsoft.com/free/cognitive-services" target="_blank">Create one for free</a>.
- An Azure OpenAI resource created in a [supported region](#supported-models). For more information, see [Create a resource and deploy a model with Azure OpenAI](create-resource.md).
-- You need a deployment of the `gpt-4o-realtime-preview` model in a supported region as described in the [supported models](#supported-models) section. You can deploy the model from the [Azure AI Foundry portal model catalog](../../../ai-studio/how-to/model-catalog-overview.md) or from your project in AI Foundry portal.
+- You need a deployment of the `gpt-4o-realtime-preview` model in a supported region as described in the [supported models](#supported-models) section. You can deploy the model from the [Azure AI Foundry portal model catalog](../../../ai-studio/how-to/model-catalog-overview.md) or from your project in Azure AI Foundry portal.

Here are some of the ways you can get started with the GPT-4o Realtime API for speech and audio:
- For steps to deploy and use the `gpt-4o-realtime-preview` model, see [the real-time audio quickstart](../realtime-audio-quickstart.md).
2 changes: 1 addition & 1 deletion articles/ai-services/openai/how-to/use-web-app.md
@@ -166,7 +166,7 @@ This can be accomplished using the Advanced edit or simple Edit options as previ

### Using Azure AI Foundry

-Follow [this tutorial on integrating Azure AI Search with AI Foundry](/azure/ai-studio/tutorials/deploy-chat-web-app#add-your-data-and-try-the-chat-model-again) and redeploy your application.
+Follow [this tutorial on integrating Azure AI Search with Azure AI Foundry](/azure/ai-studio/tutorials/deploy-chat-web-app#add-your-data-and-try-the-chat-model-again) and redeploy your application.

### Using Azure OpenAI Studio

@@ -87,7 +87,7 @@ Give your Azure OpenAI resource the **Key Vault Secrets Officer** role.

## Link Weights & Biases with Azure OpenAI

-1. Navigate to the [AI Foundry portal](https://ai.azure.com) and select your Azure OpenAI fine-tuning resource.
+1. Navigate to the [Azure AI Foundry portal](https://ai.azure.com) and select your Azure OpenAI fine-tuning resource.

:::image type="content" source="../media/how-to/weights-and-biases/manage-integrations.png" alt-text="Screenshot of the manage integrations button." lightbox="../media/how-to/weights-and-biases/manage-integrations.png":::

4 changes: 2 additions & 2 deletions articles/ai-services/openai/includes/assistants-ai-studio.md
@@ -1,7 +1,7 @@
---
-title: Quickstart - getting started with Azure OpenAI assistants (preview) in AI Foundry portal
+title: Quickstart - getting started with Azure OpenAI assistants (preview) in Azure AI Foundry portal
titleSuffix: Azure OpenAI
-description: Walkthrough on how to get started with Azure OpenAI assistants with new features like code interpreter in AI Foundry portal (Preview).
+description: Walkthrough on how to get started with Azure OpenAI assistants with new features like code interpreter in Azure AI Foundry portal (Preview).
manager: nitinme
ms.service: azure-ai-studio
ms.custom:
2 changes: 1 addition & 1 deletion articles/ai-services/openai/includes/assistants-studio.md
@@ -41,7 +41,7 @@ Use the **Assistant setup** pane to create a new AI assistant or to select an ex
| **Deployment** | This is where you set which model deployment to use with your assistant. |
| **Functions**| Create custom function definitions for the models to formulate API calls and structure data outputs based on your specifications |
| **Code interpreter** | Code interpreter provides access to a sandboxed Python environment that can be used to allow the model to test and execute code. |
-| **Files** | You can upload up to 20 files, with a max file size of 512 MB to use with tools. You can upload up to 10,000 files using [AI Foundry portal](../assistants-quickstart.md?pivots=ai-foundry-portal). |
+| **Files** | You can upload up to 20 files, with a max file size of 512 MB to use with tools. You can upload up to 10,000 files using [Azure AI Foundry portal](../assistants-quickstart.md?pivots=ai-foundry-portal). |
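The table above describes the Assistant setup fields. A hedged sketch of creating an assistant with the code interpreter tool through the `openai` Python package against Azure OpenAI follows; the deployment name and API version are assumptions.

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-05-01-preview",  # assumed preview API version for Assistants
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

assistant = client.beta.assistants.create(
    name="Data analysis assistant",
    instructions="You analyze uploaded data files and explain your reasoning.",
    model="gpt-4o",                        # your deployment name (assumption)
    tools=[{"type": "code_interpreter"}],  # matches the Code interpreter row above
)
print(assistant.id)
```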

### Tools

2 changes: 1 addition & 1 deletion articles/ai-services/openai/includes/batch/batch-studio.md
@@ -70,7 +70,7 @@ For this article, we'll create a file named `test.jsonl` and will copy the conte

Once your input file is prepared, you first need to upload it before you can kick off a batch job. File upload can be done either programmatically or via the Studio.

-1. Sign in to [AI Foundry portal](https://ai.azure.com).
+1. Sign in to [Azure AI Foundry portal](https://ai.azure.com).
2. Select the Azure OpenAI resource where you have a global batch model deployment available.
3. Select **Batch jobs** > **+Create batch jobs**.
