# Highlights
This change updates the Azure OpenAI documentation, primarily revising links to the Azure AI Foundry portal from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs so that readers can reach the most current information and resources. A few new features and terminology improvements are included as well.
New features
- Updated rate limits for the GPT-4o realtime preview.
- Added support for new audio models.
Breaking changes
None.
Other updates
- Consistent updates to Azure AI Foundry portal links.
- Clearer sampling parameter names.
Insights
The changes in this documentation set are primarily link corrections, aimed at giving Azure users fast, consistent access to the information they need. A link change may look like a small fix, but accurate access to information strongly shapes the user experience, so it matters.
Because the link fixes are applied consistently, users get a unified browsing experience across resources and can retrieve information more efficiently. The links also reflect updates to the Azure AI Foundry portal itself, so readers can always work from the most current documentation.
In addition, the updated rate limits for the GPT-4o realtime preview and the added support for new audio models reflect the ongoing evolution of the Azure OpenAI service and assure users continued access to the latest technology. The clarified terminology is particularly helpful for developers.
Together, these changes are an important step toward improving user experience and clarity in the Azure OpenAI documentation. They should reduce the confusion and information gaps users might otherwise face and help them use Azure services more efficiently and effectively.
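Mechanically, the whole change set appends a `cid=learnDocs` query parameter to every Azure AI Foundry portal link. A bulk edit like this is typically scripted; a minimal sketch (a hypothetical helper, not part of this change set) might look like:

```python
import re
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def tag_foundry_links(markdown: str, cid: str = "learnDocs") -> str:
    """Append ?cid=<value> to Markdown link targets pointing at ai.azure.com."""
    def add_cid(match: re.Match) -> str:
        parts = urlsplit(match.group(1))
        query = dict(parse_qsl(parts.query))
        if "cid" in query:  # already tagged; leave the link untouched
            return match.group(0)
        query["cid"] = cid
        return "(" + urlunsplit(parts._replace(query=urlencode(query))) + ")"

    # Only rewrite Markdown link targets that point at the Foundry portal.
    return re.sub(r"\((https://ai\.azure\.com/?[^)\s]*)\)", add_cid, markdown)
```

Running the function a second time is a no-op, so it would be safe to re-apply across a whole docs tree.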
Summary Table
Modified Contents
articles/ai-services/openai/concepts/assistants.md
Diff
@@ -35,7 +35,7 @@ Assistants API supports persistent automatically managed threads. This means tha
> [!TIP]
> There is no additional [pricing](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/) or [quota](../quotas-limits.md) for using Assistants unless you use the [code interpreter](../how-to/code-interpreter.md) or [file search](../how-to/file-search.md) tools.
-Assistants API is built on the same capabilities that power OpenAI’s GPT product. Some possible use cases range from AI-powered product recommender, sales analyst app, coding assistant, employee Q&A chatbot, and more. Start building on the no-code Assistants playground on the [Azure AI Foundry portal](https://ai.azure.com/) or start building with the API.
+Assistants API is built on the same capabilities that power OpenAI’s GPT product. Some possible use cases range from AI-powered product recommender, sales analyst app, coding assistant, employee Q&A chatbot, and more. Start building on the no-code Assistants playground on the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) or start building with the API.
> [!IMPORTANT]
> Retrieving untrusted data using Function calling, Code Interpreter or File Search with file input, and Assistant Threads functionalities could compromise the security of your Assistant, or the application that uses the Assistant. Learn about mitigation approaches [here](https://aka.ms/oai/assistant-rai).
Summary
{
"modification_type": "minor update",
"modification_title": "Updated Assistants API link"
}
Explanation
This is a minor revision to the Assistants API documentation. Specifically, the link to the Azure AI Foundry portal was updated to https://ai.azure.com/?cid=learnDocs, so users land on the most current information. The edit touches only the sentence describing how to start building in the playground or with the API; the article's overall content is unaffected.
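The old and new URLs differ only in a campaign-tracking query parameter, which can be confirmed with the standard library (a quick illustrative check, not part of the change):

```python
from urllib.parse import parse_qs, urlsplit

old = urlsplit("https://ai.azure.com/")
new = urlsplit("https://ai.azure.com/?cid=learnDocs")

# Scheme, host, and path are identical; only the query string differs.
assert (old.scheme, old.netloc, old.path) == (new.scheme, new.netloc, new.path)
assert parse_qs(new.query) == {"cid": ["learnDocs"]}
```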
articles/ai-services/openai/concepts/content-streaming.md
Diff
@@ -30,15 +30,15 @@ Customers must understand that while the feature improves latency, it's a trade-
**Customer Copyright Commitment**: Content that is retroactively flagged as protected material might not be eligible for Customer Copyright Commitment coverage.
-To enable Asynchronous Filter in [Azure AI Foundry portal](https://ai.azure.com/), follow the [Content filter how-to guide](/azure/ai-services/openai/how-to/content-filters) to create a new content filtering configuration, and select **Asynchronous Filter** in the Streaming section.
+To enable Asynchronous Filter in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), follow the [Content filter how-to guide](/azure/ai-services/openai/how-to/content-filters) to create a new content filtering configuration, and select **Asynchronous Filter** in the Streaming section.
## Comparison of content filtering modes
| Compare | Streaming - Default | Streaming - Asynchronous Filter |
|---|---|---|
|Status |GA |Public Preview |
| Eligibility |All customers |Customers approved for modified content filtering |
-| How to enable | Enabled by default, no action needed |Customers approved for modified content filtering can configure it directly in [Azure AI Foundry portal](https://ai.azure.com/) (as part of a content filtering configuration, applied at the deployment level) |
+| How to enable | Enabled by default, no action needed |Customers approved for modified content filtering can configure it directly in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) (as part of a content filtering configuration, applied at the deployment level) |
|Modality and availability |Text; all GPT models |Text; all GPT models |
|Streaming experience |Content is buffered and returned in chunks |Zero latency (no buffering, filters run asynchronously) |
|Content filtering signal |Immediate filtering signal |Delayed filtering signal (in up to ~1,000-character increments) |
Summary
{
"modification_type": "minor update",
"modification_title": "Updated content streaming links"
}
Explanation
This change brings the content streaming documentation up to date. Specifically, the Azure AI Foundry portal links were changed from https://ai.azure.com/ to https://ai.azure.com/?cid=learnDocs, helping readers find the information they need more efficiently. The instructions for enabling the Asynchronous Filter were updated accordingly, improving the clarity and consistency of the document as a whole. The feature itself is unaffected; the goal is a better user experience.
articles/ai-services/openai/concepts/gpt-4-v-prompt-engineering.md
Diff
@@ -27,7 +27,7 @@ To unlock the full potential of vision-enabled chat models, it's essential to ta
- **Define output format:** Clearly mention the desired format for the output, such as markdown, JSON, HTML, etc. You can also suggest a specific structure, length, or specific attributes about the response.
## Example prompt inputs and outputs
-There are many ways to craft system prompts to tailor the output specifically to your needs. The following sample inputs and outputs showcase how adjusting your prompts can give you different results. Try out the model for yourself using these images and adjusting the system prompt in the [Azure AI Foundry playground](https://ai.azure.com/).
+There are many ways to craft system prompts to tailor the output specifically to your needs. The following sample inputs and outputs showcase how adjusting your prompts can give you different results. Try out the model for yourself using these images and adjusting the system prompt in the [Azure AI Foundry playground](https://ai.azure.com/?cid=learnDocs).
### Contextual specificity
Context can help improve feedback from the model. For example, if you're working on image descriptions for a product catalog, ensure your prompt reflects that in a clear and concise way. A prompt like “Describe images for an outdoor hiking product catalog, focusing on enthusiasm and professionalism” guides the model to generate responses that are both accurate and contextually rich.
Summary
{
"modification_type": "minor update",
"modification_title": "Updated prompt engineering link"
}
Explanation
This change updates the GPT-4 prompt engineering documentation. Specifically, the Azure AI Foundry playground link was changed from https://ai.azure.com/ to https://ai.azure.com/?cid=learnDocs, making the relevant information easier to find. The guidance on defining output formats is retained as before. The update improves the overall consistency of the documentation and the reader's experience.
articles/ai-services/openai/concepts/gpt-with-vision.md
Diff
@@ -29,7 +29,7 @@ This section describes the limitations of vision-enabled chat models.
- **Maximum input image size**: The maximum size for input images is restricted to 20 MB.
- **Low resolution accuracy**: When images are analyzed using the "low resolution" setting, it allows for faster responses and uses fewer input tokens for certain use cases. However, this could impact the accuracy of object and text recognition within the image.
-- **Image chat restriction**: When you upload images in [Azure AI Foundry portal](https://ai.azure.com/) or the API, there is a limit of 10 images per chat call.
+- **Image chat restriction**: When you upload images in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) or the API, there is a limit of 10 images per chat call.
## Special pricing information
@@ -84,8 +84,8 @@ Additionally, there's a one-time indexing cost of $0.15 to generate the Video Re
### Video support
- **Low resolution**: Video frames are analyzed using GPT-4 Turbo with Vision's "low resolution" setting, which may affect the accuracy of small object and text recognition in the video.
-- **Video file limits**: Both MP4 and MOV file types are supported. In [Azure AI Foundry portal](https://ai.azure.com/), videos must be less than 3 minutes long. When you use the API there is no such limitation.
-- **Prompt limits**: Video prompts only contain one video and no images. In [Azure AI Foundry portal](https://ai.azure.com/), you can clear the session to try another video or images.
+- **Video file limits**: Both MP4 and MOV file types are supported. In [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), videos must be less than 3 minutes long. When you use the API there is no such limitation.
+- **Prompt limits**: Video prompts only contain one video and no images. In [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), you can clear the session to try another video or images.
- **Limited frame selection**: The service selects 20 frames from the entire video, which might not capture all the critical moments or details. Frame selection can be approximately evenly spread through the video or focused by a specific video retrieval query, depending on the prompt.
- **Language support**: The service primarily supports English for grounding with transcripts. Transcripts don't provide accurate information on lyrics in songs.
-->
Summary
{
"modification_type": "minor update",
"modification_title": "Updated vision-enabled chat model links"
}
Explanation
This change updates the links in the vision-enabled chat models documentation. Specifically, the Azure AI Foundry portal links were changed from https://ai.azure.com/ to https://ai.azure.com/?cid=learnDocs, making the information easier to find. The limits on input image size, low-resolution accuracy, and video files are preserved, so no substantive information is lost. Overall, the change maintains the document's integrity while improving the user experience.
articles/ai-services/openai/concepts/provisioned-migration.md
Diff
@@ -260,7 +260,7 @@ Azure Reservations for Azure OpenAI provisioned offers are specific to the provi
## Managing Provisioned Throughput Commitments
-Provisioned throughput commitments are created and managed by selecting **Management center** in the [Azure AI Foundry portal](https://ai.azure.com/)'s navigation menu > **Quota** > **Manage Commitments**.
+Provisioned throughput commitments are created and managed by selecting **Management center** in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs)'s navigation menu > **Quota** > **Manage Commitments**.
:::image type="content" source="../media/how-to/provisioned-onboarding/notifications.png" alt-text="Screenshot of commitment purchase UI with notifications." lightbox="../media/how-to/provisioned-onboarding/notifications.png":::
Summary
{
"modification_type": "minor update",
"modification_title": "Updated links to the Azure AI Foundry portal"
}
Explanation
This change corrects the Azure AI Foundry portal link in this document. Specifically, the portal link was changed from https://ai.azure.com/ to https://ai.azure.com/?cid=learnDocs. This small update makes accurate, current information easier to reach. The content on managing provisioned throughput commitments is unchanged, so the document's essential information is preserved. The change is aimed at improving the user experience.
articles/ai-services/openai/concepts/safety-system-message-templates.md
Diff
@@ -31,7 +31,7 @@ Below are examples of recommended system message components you can include to p
## Add safety system messages in Azure AI Foundry portal
-The following steps show how to leverage safety system messages in [Azure AI Foundry portal](https://ai.azure.com/).
+The following steps show how to leverage safety system messages in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
1. Go to Azure AI Foundry and navigate to Azure OpenAI and the Chat playground.
:::image type="content" source="../media/navigate-chat-playground.PNG" alt-text="Screenshot of the Azure AI Foundry portal selection.":::
Summary
{
"modification_type": "minor update",
"modification_title": "Updated links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal link in this document. Specifically, the portal link was changed from https://ai.azure.com/ to https://ai.azure.com/?cid=learnDocs, making accurate and convenient information easier to reach. The steps for applying safety system messages are unchanged, so the essential content is preserved. This small fix is part of an ongoing effort to improve the user experience.
articles/ai-services/openai/concepts/use-your-data.md
Diff
@@ -18,14 +18,14 @@ Use this article to learn about Azure OpenAI On Your Data, which makes it easier
## What is Azure OpenAI On Your Data
-Azure OpenAI On Your Data enables you to run advanced AI models such as GPT-35-Turbo and GPT-4 on your own enterprise data without needing to train or fine-tune models. You can chat on top of and analyze your data with greater accuracy. You can specify sources to support the responses based on the latest information available in your designated data sources. You can access Azure OpenAI On Your Data using a REST API, via the SDK or the web-based interface in the [Azure AI Foundry portal](https://ai.azure.com/). You can also create a web app that connects to your data to enable an enhanced chat solution or deploy it directly as a copilot in the Copilot Studio (preview).
+Azure OpenAI On Your Data enables you to run advanced AI models such as GPT-35-Turbo and GPT-4 on your own enterprise data without needing to train or fine-tune models. You can chat on top of and analyze your data with greater accuracy. You can specify sources to support the responses based on the latest information available in your designated data sources. You can access Azure OpenAI On Your Data using a REST API, via the SDK or the web-based interface in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). You can also create a web app that connects to your data to enable an enhanced chat solution or deploy it directly as a copilot in the Copilot Studio (preview).
## Developing with Azure OpenAI On Your Data
:::image type="content" source="../media/use-your-data/workflow-diagram.png" alt-text="A diagram showing an example workflow.":::
Typically, the development process you'd use with Azure OpenAI On Your Data is:
-1. **Ingest**: Upload files using either [Azure AI Foundry portal](https://ai.azure.com/) or the ingestion API. This enables your data to be cracked, chunked and embedded into an Azure AI Search instance that can be used by Azure OpenAI models. If you have an existing [supported data source](#supported-data-sources), you can also connect it directly.
+1. **Ingest**: Upload files using either [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) or the ingestion API. This enables your data to be cracked, chunked and embedded into an Azure AI Search instance that can be used by Azure OpenAI models. If you have an existing [supported data source](#supported-data-sources), you can also connect it directly.
1. **Develop**: After trying Azure OpenAI On Your Data, begin developing your application using the available REST API and SDKs, which are available in several languages. It will create prompts and search intents to pass to the Azure OpenAI service.
@@ -38,7 +38,7 @@ Typically, the development process you'd use with Azure OpenAI On Your Data is:
1. **Response generation**: The resulting data is submitted along with other information like the system message to the Large Language Model (LLM) and the response is sent back to the application.
-To get started, [connect your data source](../use-your-data-quickstart.md) using [Azure AI Foundry portal](https://ai.azure.com/) and start asking questions and chatting on your data.
+To get started, [connect your data source](../use-your-data-quickstart.md) using [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and start asking questions and chatting on your data.
## Azure Role-based access controls (Azure RBAC) for adding data sources
@@ -139,7 +139,7 @@ Azure OpenAI On Your Data lets you restrict the documents that can be used in re
### Index field mapping
-If you're using your own index, you'll be prompted in the [Azure AI Foundry portal](https://ai.azure.com/) to define which fields you want to map for answering questions when you add your data source. You can provide multiple fields for *Content data*, and should include all fields that have text pertaining to your use case.
+If you're using your own index, you'll be prompted in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) to define which fields you want to map for answering questions when you add your data source. You can provide multiple fields for *Content data*, and should include all fields that have text pertaining to your use case.
:::image type="content" source="../media/use-your-data/index-data-mapping.png" alt-text="A screenshot showing the index field mapping options in Azure AI Foundry portal." lightbox="../media/use-your-data/index-data-mapping.png":::
@@ -192,7 +192,7 @@ You might want to use Azure Blob Storage as a data source if you want to connect
To keep your Azure AI Search index up-to-date with your latest data, you can schedule an automatic index refresh rather than manually updating it every time your data is updated. Automatic index refresh is only available when you choose **Azure Blob Storage** as the data source. To enable an automatic index refresh:
-1. [Add a data source](../quickstart.md) using [Azure AI Foundry portal](https://ai.azure.com/).
+1. [Add a data source](../quickstart.md) using [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
1. Under **Select or add data source** select **Indexer schedule** and choose the refresh cadence you would like to apply.
:::image type="content" source="../media/use-your-data/indexer-schedule.png" alt-text="A screenshot of the indexer schedule in Azure AI Foundry portal." lightbox="../media/use-your-data/indexer-schedule.png":::
@@ -224,7 +224,7 @@ To modify the schedule, you can use the [Azure portal](https://portal.azure.com/
# [Upload files (preview)](#tab/file-upload)
-Using [Azure AI Foundry portal](https://ai.azure.com/), you can upload files from your machine to try Azure OpenAI On Your Data. You also have the option to create a new Azure Blob Storage account and Azure AI Search resource. The service then stores the files to an Azure storage container and performs ingestion from the container. You can use the [quickstart](../use-your-data-quickstart.md) article to learn how to use this data source option.
+Using [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), you can upload files from your machine to try Azure OpenAI On Your Data. You also have the option to create a new Azure Blob Storage account and Azure AI Search resource. The service then stores the files to an Azure storage container and performs ingestion from the container. You can use the [quickstart](../use-your-data-quickstart.md) article to learn how to use this data source option.
:::image type="content" source="../media/quickstarts/add-your-data-source.png" alt-text="A screenshot showing options for selecting a data source in Azure AI Foundry portal." lightbox="../media/quickstarts/add-your-data-source.png":::
@@ -308,11 +308,11 @@ Mapping these fields correctly helps ensure the model has better response and ci
### Use Elasticsearch as a data source via API
-Along with using Elasticsearch databases in [Azure AI Foundry portal](https://ai.azure.com/), you can also use your Elasticsearch database using the [API](../references/elasticsearch.md).
+Along with using Elasticsearch databases in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), you can also use your Elasticsearch database using the [API](../references/elasticsearch.md).
# [MongoDB Atlas (preview)](#tab/mongo-db-atlas)
-You can connect your MongoDB Atlas vector index with Azure OpenAI On Your Data for inferencing. You can use it through the [Azure AI Foundry portal](https://ai.azure.com/), API and SDK.
+You can connect your MongoDB Atlas vector index with Azure OpenAI On Your Data for inferencing. You can use it through the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), API and SDK.
### Prerequisites
@@ -363,15 +363,15 @@ When you add your MongoDB Atlas data source, you can specify data fields to prop
## Deploy to a copilot (preview), Teams app (preview), or web app
-After you connect Azure OpenAI to your data, you can deploy it using the **Deploy to** button in [Azure AI Foundry portal](https://ai.azure.com/).
+After you connect Azure OpenAI to your data, you can deploy it using the **Deploy to** button in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
:::image type="content" source="../media/use-your-data/deploy-model.png" alt-text="A screenshot showing the model deployment button in Azure AI Foundry portal." lightbox="../media/use-your-data/deploy-model.png":::
This gives you multiple options for deploying your solution.
#### [Copilot (preview)](#tab/copilot)
-You can deploy to a copilot in [Copilot Studio](/microsoft-copilot-studio/fundamentals-what-is-copilot-studio) (preview) directly from [Azure AI Foundry portal](https://ai.azure.com/), enabling you to bring conversational experiences to various channels such as: Microsoft Teams, websites, Dynamics 365, and other [Azure Bot Service channels](/microsoft-copilot-studio/publication-connect-bot-to-azure-bot-service-channels). The tenant used in the Azure OpenAI and Copilot Studio (preview) should be the same. For more information, see [Use a connection to Azure OpenAI On Your Data](/microsoft-copilot-studio/nlu-generative-answers-azure-openai).
+You can deploy to a copilot in [Copilot Studio](/microsoft-copilot-studio/fundamentals-what-is-copilot-studio) (preview) directly from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), enabling you to bring conversational experiences to various channels such as: Microsoft Teams, websites, Dynamics 365, and other [Azure Bot Service channels](/microsoft-copilot-studio/publication-connect-bot-to-azure-bot-service-channels). The tenant used in the Azure OpenAI and Copilot Studio (preview) should be the same. For more information, see [Use a connection to Azure OpenAI On Your Data](/microsoft-copilot-studio/nlu-generative-answers-azure-openai).
> [!NOTE]
> Deploying to a copilot in Copilot Studio (preview) is only available in US regions.
@@ -393,15 +393,15 @@ A Teams app lets you bring conversational experience to your users in Teams to i
- Your Azure account has been assigned **Cognitive Services OpenAI user** or **Cognitive Services OpenAI Contributor** role of the Azure OpenAI resource you're using, allowing your account to make Azure OpenAI API calls. For more information, see [Azure OpenAI On Your data configuration](../how-to/on-your-data-configuration.md#using-the-api) and [Add role assignment to an Azure OpenAI resource](/azure/ai-services/openai/how-to/role-based-access-control#add-role-assignment-to-an-azure-openai-resource) for instructions on setting this role in the Azure portal.
-You can deploy to a standalone Teams app directly from [Azure AI Foundry portal](https://ai.azure.com/). Follow the steps below:
+You can deploy to a standalone Teams app directly from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). Follow the steps below:
1. After you've added your data to the chat model, select **Deploy** and then **a new Teams app (preview)**.
1. Enter the name of your Teams app and download the resulting .zip file.
1. Extract the .zip file and open the folder in Visual Studio Code.
-1. If you chose **API key** in the data connection step, manually copy and paste your Azure AI Search key into the `src\prompts\chat\config.json` file. Your Azure AI Search Key can be found in [Azure AI Foundry portal](https://ai.azure.com/) Playground by selecting the **View code** button with the key located under Azure Search Resource Key. If you chose **System assigned managed identity**, you can skip this step. Learn more about different data connection options in the [Data connection](/azure/ai-services/openai/concepts/use-your-data?tabs=ai-search#data-connection) section.
+1. If you chose **API key** in the data connection step, manually copy and paste your Azure AI Search key into the `src\prompts\chat\config.json` file. Your Azure AI Search Key can be found in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) Playground by selecting the **View code** button with the key located under Azure Search Resource Key. If you chose **System assigned managed identity**, you can skip this step. Learn more about different data connection options in the [Data connection](/azure/ai-services/openai/concepts/use-your-data?tabs=ai-search#data-connection) section.
1. Open the Visual Studio Code terminal and log into Azure CLI, selecting the account that you assigned **Cognitive Service OpenAI User** role to. Use the `az login` command in the terminal to log in.
@@ -467,7 +467,7 @@ A small chunk size like 256 produces more granular chunks. This size also means
### Runtime parameters
-You can modify the following additional settings in the **Data parameters** section in [Azure AI Foundry portal](https://ai.azure.com/) and [the API](../references/on-your-data.md). You don't need to reingest your data when you update these parameters.
+You can modify the following additional settings in the **Data parameters** section in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and [the API](../references/on-your-data.md). You don't need to reingest your data when you update these parameters.
|Parameter name | Description |
@@ -484,7 +484,7 @@ It's possible for the model to return `"TYPE":"UNCITED_REFERENCE"` instead of `"
You can define a system message to steer the model's reply when using Azure OpenAI On Your Data. This message allows you to customize your replies on top of the retrieval augmented generation (RAG) pattern that Azure OpenAI On Your Data uses. The system message is used in addition to an internal base prompt to provide the experience. To support this, we truncate the system message after a specific [number of tokens](#token-usage-estimation-for-azure-openai-on-your-data) to ensure the model can answer questions using your data. If you are defining extra behavior on top of the default experience, ensure that your system prompt is detailed and explains the exact expected customization.
-Once you select add your dataset, you can use the **System message** section in the [Azure AI Foundry portal](https://ai.azure.com/), or the `role_information` [parameter in the API](../references/on-your-data.md).
+Once you select add your dataset, you can use the **System message** section in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), or the `role_information` [parameter in the API](../references/on-your-data.md).
:::image type="content" source="../media/use-your-data/system-message.png" alt-text="A screenshot showing the system message option in Azure AI Foundry portal." lightbox="../media/use-your-data/system-message.png":::
@@ -679,7 +679,7 @@ token_output = TokenEstimator.estimate_tokens(input_text)
## Troubleshooting
-To troubleshoot failed operations, always look out for errors or warnings specified either in the API response or [Azure AI Foundry portal](https://ai.azure.com/). Here are some of the common errors and warnings:
+To troubleshoot failed operations, always look out for errors or warnings specified either in the API response or [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). Here are some of the common errors and warnings:
### Failed ingestion jobs
Summary
{
"modification_type": "minor update",
"modification_title": "Updated links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal links throughout the Azure OpenAI On Your Data documentation. Specifically, the portal links were changed from https://ai.azure.com/ to https://ai.azure.com/?cid=learnDocs. This small change makes accurate, convenient information easier to reach. The key information and procedures about Azure OpenAI On Your Data are preserved; the goal is better readability and a better user experience. The link fixes in particular keep the many references to the portal consistent across the article.
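Because this article references the portal in so many places, a consistency check is useful after a sweep like this. A sketch of a scanner that reports any remaining untagged portal links (hypothetical tooling, not part of the PR):

```python
import pathlib
import re

# Matches Markdown link targets to the portal that lack a query string.
UNTAGGED = re.compile(r"\(https://ai\.azure\.com/?\)")

def find_untagged(root: str) -> list[tuple[str, int]]:
    """Return (file, line number) pairs that still use the bare portal URL."""
    hits = []
    for md in sorted(pathlib.Path(root).rglob("*.md")):
        for lineno, line in enumerate(md.read_text(encoding="utf-8").splitlines(), start=1):
            if UNTAGGED.search(line):
                hits.append((str(md), lineno))
    return hits
```

Wired into CI, an empty result from `find_untagged` would confirm the link update stays complete as new pages are added.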
articles/ai-services/openai/faq.yml
Diff
@@ -125,19 +125,19 @@ sections:
answer: |
A Limited Access registration form is not required to access most Azure OpenAI models. Learn more on the [Azure OpenAI Limited Access page](/legal/cognitive-services/openai/limited-access?context=/azure/ai-services/openai/context/context).
- question: |
- My guest account has been given access to an Azure OpenAI resource, but I'm unable to access that resource in the [Azure AI Foundry portal](https://ai.azure.com/). How do I enable access?
+ My guest account has been given access to an Azure OpenAI resource, but I'm unable to access that resource in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). How do I enable access?
answer: |
- This is expected behavior when using the default sign-in experience for the [Azure AI Foundry](https://ai.azure.com).
+ This is expected behavior when using the default sign-in experience for the [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs).
To access Azure AI Foundry from a guest account that has been granted access to an Azure OpenAI resource:
- 1. Open a private browser session and then navigate to [https://ai.azure.com](https://ai.azure.com).
+ 1. Open a private browser session and then navigate to [https://ai.azure.com](https://ai.azure.com/?cid=learnDocs).
2. Rather than immediately entering your guest account credentials instead select `Sign-in options`
3. Now select **Sign in to an organization**
4. Enter the domain name of the organization that granted your guest account access to the Azure OpenAI resource.
5. Now sign-in with your guest account credentials.
- You should now be able to access the resource via the [Azure AI Foundry portal](https://ai.azure.com/).
+ You should now be able to access the resource via the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
Alternatively if you're signed into the [Azure portal](https://portal.azure.com) from the Azure OpenAI resource's Overview pane you can select **Go to Azure AI Foundry** to automatically sign in with the appropriate organizational context.
@@ -175,7 +175,7 @@ sections:
answer:
We do offer an Availability SLA for all resources and a Latency SLA for Provisioned-Managed Deployments. For more information about the SLA for Azure OpenAI Service, see the [Service Level Agreements (SLA) for Online Services page](https://azure.microsoft.com/support/legal/sla/cognitive-services/v1_1/).
- question: |
- How do I enable fine-tuning? Create a custom model is greyed out in [Azure AI Foundry portal](https://ai.azure.com/).
+ How do I enable fine-tuning? Create a custom model is greyed out in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
answer: |
In order to successfully access fine-tuning, you need Cognitive Services OpenAI Contributor assigned. Even someone with high-level Service Administrator permissions would still need this account explicitly set in order to access fine-tuning. For more information, please review the [role-based access control guidance](/azure/ai-services/openai/how-to/role-based-access-control#cognitive-services-openai-contributor).
- question: |
@@ -304,9 +304,9 @@ sections:
answer:
You can customize your published web app in the Azure portal. The source code for the published web app is [available on GitHub](https://go.microsoft.com/fwlink/?linkid=2244395), where you can find information on changing the app frontend, as well as instructions for building and deploying the app.
- question: |
- Will my web app be overwritten when I deploy the app again from the [Azure AI Foundry portal](https://ai.azure.com/)?
+ Will my web app be overwritten when I deploy the app again from the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs)?
answer:
- Your app code won't be overwritten when you update your app. The app will be updated to use the Azure OpenAI resource, Azure AI Search index (if you're using Azure OpenAI on your data), and model settings selected in the [Azure AI Foundry portal](https://ai.azure.com/) without any change to the appearance or functionality.
+ Your app code won't be overwritten when you update your app. The app will be updated to use the Azure OpenAI resource, Azure AI Search index (if you're using Azure OpenAI on your data), and model settings selected in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) without any change to the appearance or functionality.
- name: Using your data
questions:
- question: |
@@ -316,15 +316,15 @@ sections:
- question: |
How can I access Azure OpenAI on your data?
answer:
- All Azure OpenAI customers can use Azure OpenAI on your data via the [Azure AI Foundry portal](https://ai.azure.com/) and Rest API.
+ All Azure OpenAI customers can use Azure OpenAI on your data via the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and Rest API.
- question: |
What data sources does Azure OpenAI on your data support?
answer:
Azure OpenAI on your data supports ingestion from Azure AI Search, Azure Blob Storage, and uploading local files. You can learn more about Azure OpenAI on your data from the [conceptual article](./concepts/use-your-data.md) and [quickstart](./use-your-data-quickstart.md).
- question: |
How much does it cost to use Azure OpenAI on your data?
answer:
- When using Azure OpenAI on your data, you incur costs when you use Azure AI Search, Azure Blob Storage, Azure Web App Service, semantic search and OpenAI models. There's no additional cost for using the "your data" feature in the [Azure AI Foundry portal](https://ai.azure.com/).
+ When using Azure OpenAI on your data, you incur costs when you use Azure AI Search, Azure Blob Storage, Azure Web App Service, semantic search and OpenAI models. There's no additional cost for using the "your data" feature in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
- question: |
How can I customize or automate the index creation process?
answer:
@@ -354,7 +354,7 @@ sections:
answer:
You must send queries in the same language of your data. Your data can be in any of the languages supported by [Azure AI Search](/azure/search/search-language-support).
- question: |
- If Semantic Search is enabled for my Azure AI Search resource, will it be automatically applied to Azure OpenAI on your data in the [Azure AI Foundry portal](https://ai.azure.com/)?
+ If Semantic Search is enabled for my Azure AI Search resource, will it be automatically applied to Azure OpenAI on your data in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs)?
answer:
When you select "Azure AI Search" as the data source, you can choose to apply semantic search.
If you select "Azure Blob Container" or "Upload files" as the data source, you can create the index as usual. Afterwards you would reingest the data using the "Azure AI Search" option to select the same index and apply Semantic Search. You will then be ready to chat on your data with semantic search applied.
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates the Azure AI Foundry portal links in the Azure OpenAI FAQ YAML file. In several questions and answers, the link is changed from "https://ai.azure.com/" to "https://ai.azure.com/?cid=learnDocs", improving link consistency and making sure readers land on the intended resource. The affected questions cover accessing Azure OpenAI resources, enabling fine-tuning, and updating the web app; their content is otherwise unchanged.
articles/ai-services/openai/how-to/batch.md
Diff
@@ -91,7 +91,7 @@ The following aren't currently supported:
### Batch deployment
> [!NOTE]
-> In the [Azure AI Foundry portal](https://ai.azure.com/) the batch deployment types will appear as `Global-Batch` and `Data Zone Batch`. To learn more about Azure OpenAI deployment types, see our [deployment types guide](../how-to/deployment-types.md).
+> In the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) the batch deployment types will appear as `Global-Batch` and `Data Zone Batch`. To learn more about Azure OpenAI deployment types, see our [deployment types guide](../how-to/deployment-types.md).
:::image type="content" source="../media/how-to/global-batch/global-batch.png" alt-text="Screenshot that shows the model deployment dialog in Azure AI Foundry portal with Global-Batch deployment type highlighted." lightbox="../media/how-to/global-batch/global-batch.png":::
@@ -163,7 +163,7 @@ Yes. Similar to other deployment types, you can create content filters and assoc
### Can I request additional quota?
-Yes, from the quota page in the [Azure AI Foundry portal](https://ai.azure.com/). Default quota allocation can be found in the [quota and limits article](../quotas-limits.md#batch-quota).
+Yes, from the quota page in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). Default quota allocation can be found in the [quota and limits article](../quotas-limits.md#batch-quota).
### What happens if the API doesn't complete my request within the 24 hour time frame?
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates the links in the batch deployment documentation: the Azure AI Foundry portal URL is changed from "https://ai.azure.com/" to "https://ai.azure.com/?cid=learnDocs". The flow and content of the document are preserved, including the description of batch deployment types and the quota FAQ; this minor fix improves link consistency for readers.
articles/ai-services/openai/how-to/completions.md
Diff
@@ -19,7 +19,7 @@ Azure OpenAI in Azure AI Foundry Models provides a **completion endpoint** that
> [!IMPORTANT]
> Unless you have a specific use case that requires the completions endpoint, we recommend instead using the [responses API](./responses.md) of [chat completions endpoint](./chatgpt.md) which allows you to take advantage of the latest models like GPT-4o, GPT-4o mini, and GPT-4 Turbo.
-The best way to start exploring completions is through the playground in [Azure AI Foundry](https://ai.azure.com). It's a simple text box where you enter a prompt to generate a completion. You can start with a simple prompt like this one:
+The best way to start exploring completions is through the playground in [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs). It's a simple text box where you enter a prompt to generate a completion. You can start with a simple prompt like this one:
```console
write a tagline for an ice cream shop
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryへのリンクの更新"
}
Explanation
This change updates the Azure AI Foundry link in the completions endpoint documentation from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs". The rest of the content, including the guidance on when to use the completions endpoint, is unchanged.
articles/ai-services/openai/how-to/dall-e.md
Diff
@@ -237,7 +237,7 @@ The format in which DALL-E 3 generated images are returned. Must be one of `url`
## Call the Image Edit API
-The Image Edit API allows you to modify existing images based on text prompts you provide. The API call is similar to the image generation API call, but you also need to provide an image URL or base 64-encoded image data.
+The Image Edit API allows you to modify existing images based on text prompts you provide. The API call is similar to the image generation API call, but you also need to provide an input image (base64-encoded image data).
Summary
{
"modification_type": "minor update",
"modification_title": "イメージ編集APIの説明更新"
}
Explanation
This change updates the Image Edit API description in the DALL-E documentation. The wording "you also need to provide an image URL or base 64-encoded image data" is replaced with "you also need to provide an input image (base64-encoded image data)", clarifying that the input image is supplied as base64-encoded data rather than a URL. The rest of the API description, including the API's features and use cases, is unchanged.
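The revised wording can be illustrated with a short sketch of preparing such a request body. This is a minimal illustration under stated assumptions, not the official SDK call: the `prompt` and `image` field names are placeholders chosen for demonstration, so consult the Image Edit API reference for the exact parameter names.

```python
import base64
import json


def build_image_edit_payload(image_bytes: bytes, prompt: str) -> str:
    """Build a JSON body that carries the input image inline as
    base64-encoded data (no image URL), as the revised doc describes."""
    return json.dumps({
        # "prompt" and "image" are illustrative field names, not
        # confirmed against the API reference.
        "prompt": prompt,
        "image": base64.b64encode(image_bytes).decode("ascii"),
    })


body = build_image_edit_payload(b"\x89PNG\r\n...", "add a red hat")
```

Because base64 encoding is lossless, the service can reconstruct the exact image bytes from the JSON string on the other end.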
articles/ai-services/openai/how-to/evaluations.md
Diff
@@ -135,7 +135,7 @@ When you click into each testing criteria, you will see different types of grade
## Getting started
-1. Select the **Azure OpenAI Evaluation (PREVIEW)** within [Azure AI Foundry portal](https://ai.azure.com/). To see this view as an option may need to first select an existing Azure OpenAI resource in a supported region.
+1. Select the **Azure OpenAI Evaluation (PREVIEW)** within [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). To see this view as an option may need to first select an existing Azure OpenAI resource in a supported region.
2. Select **+ New evaluation**
:::image type="content" source="../media/how-to/evaluations/new-evaluation.png" alt-text="Screenshot of the Azure OpenAI evaluation UX with new evaluation selected." lightbox="../media/how-to/evaluations/new-evaluation.png":::
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates step 1 of the getting-started instructions in the Azure OpenAI evaluations documentation: the Azure AI Foundry portal link is changed from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs", pointing users at the more relevant entry point. The remaining steps are unchanged.
articles/ai-services/openai/how-to/fine-tune-test.md
Diff
@@ -146,7 +146,7 @@ az cognitiveservices account deployment create
## [Portal](#tab/portal)
-After your custom model deploys, you can use it like any other deployed model. You can use the **Playgrounds** in the [Azure AI Foundry portal](https://ai.azure.com) to experiment with your new deployment. You can continue to use the same parameters with your custom model, such as `temperature` and `max_tokens`, as you can with other deployed models.
+After your custom model deploys, you can use it like any other deployed model. You can use the **Playgrounds** in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) to experiment with your new deployment. You can continue to use the same parameters with your custom model, such as `temperature` and `max_tokens`, as you can with other deployed models.
:::image type="content" source="../media/fine-tuning/chat-playground.png" alt-text="Screenshot of the Playground pane in Azure AI Foundry portal, with sections highlighted." lightbox="../media/fine-tuning/chat-playground.png":::
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates the description of using a custom model after deployment: the Azure AI Foundry portal link is changed from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs". The steps themselves are unchanged; you can still experiment with the new deployment in the **Playgrounds** and use parameters such as `temperature` and `max_tokens` just as with other deployed models.
articles/ai-services/openai/how-to/fine-tuning-deploy.md
Diff
@@ -305,7 +305,7 @@ az cognitiveservices account deployment create
## [Portal](#tab/portal)
-After your custom model deploys, you can use it like any other deployed model. You can use the **Playgrounds** in the [Azure AI Foundry portal](https://ai.azure.com) to experiment with your new deployment. You can continue to use the same parameters with your custom model, such as `temperature` and `max_tokens`, as you can with other deployed models.
+After your custom model deploys, you can use it like any other deployed model. You can use the **Playgrounds** in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) to experiment with your new deployment. You can continue to use the same parameters with your custom model, such as `temperature` and `max_tokens`, as you can with other deployed models.
:::image type="content" source="../media/quickstarts/playground-load-new.png" alt-text="Screenshot of the Playground pane in Azure AI Foundry portal, with sections highlighted." lightbox="../media/quickstarts/playground-load-new.png":::
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates the post-deployment instructions for custom models: the Azure AI Foundry portal link is changed from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs". The rest of the wording is untouched; you can still try the deployed model in the **Playgrounds**, and parameters such as `temperature` and `max_tokens` remain usable exactly as with other deployed models.
articles/ai-services/openai/how-to/model-router.md
Diff
@@ -32,7 +32,7 @@ Model router is packaged as a single Azure AI Foundry model that you deploy. Fol
You can use model router through the [chat completions API](/azure/ai-services/openai/chatgpt-quickstart) in the same way you'd use other OpenAI chat models. Set the `model` parameter to the name of our model router deployment, and set the `messages` parameter to the messages you want to send to the model.
-In the [Azure AI Foundry portal](https://ai.azure.com/), you can navigate to your model router deployment on the **Models + endpoints** page and select it to enter the model playground. In the playground experience, you can enter messages and see the model's responses. Each response message shows which underlying model was selected to respond.
+In the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), you can navigate to your model router deployment on the **Models + endpoints** page and select it to enter the model playground. In the playground experience, you can enter messages and see the model's responses. Each response message shows which underlying model was selected to respond.
> [!IMPORTANT]
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates the model router documentation: the Azure AI Foundry portal link is changed from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs". Everything else is unchanged: you still select the model router deployment from the **Models + endpoints** page, enter messages in the model playground, and each response message still shows which underlying model was selected to respond.
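As a sketch of the call pattern this file describes (set `model` to the router deployment name and `messages` to the conversation), here is a minimal request-body builder; the deployment name `my-model-router` is hypothetical, and this is a plain JSON body rather than an SDK call:

```python
import json


def build_chat_request(deployment_name: str, messages: list) -> str:
    """Chat completions request body: `model` names the model router
    deployment, `messages` holds the conversation to be routed."""
    return json.dumps({"model": deployment_name, "messages": messages})


body = build_chat_request(
    "my-model-router",  # hypothetical deployment name
    [{"role": "user", "content": "Explain model routing in one line."}],
)
```

The router then picks an underlying model per request; the chosen model is reported back in each response message, as the doc notes.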
articles/ai-services/openai/how-to/monitor-openai.md
Diff
@@ -74,7 +74,7 @@ After you configure the diagnostic settings, you can work with metrics and log d
[!INCLUDE [horz-monitor-kusto-queries](~/reusable-content/ce-skilling/azure/includes/azure-monitor/horizontals/horz-monitor-kusto-queries.md)]
-After you deploy an Azure OpenAI model, you can send some completions calls by using the **playground** environment in [Azure AI Foundry](https://ai.azure.com/).
+After you deploy an Azure OpenAI model, you can send some completions calls by using the **playground** environment in [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs).
Any text that you enter in the **Completions playground** or the **Chat completions playground** generates metrics and log data for your Azure OpenAI resource. In the Log Analytics workspace for your resource, you can query the monitoring data by using the [Kusto](/azure/data-explorer/kusto/query/) query language.
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates the monitoring documentation: the link to the Azure AI Foundry **playground** environment is changed from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs". The surrounding guidance is unchanged: text entered in the **Completions playground** or **Chat completions playground** still generates metrics and log data for your Azure OpenAI resource, which you can query in the Log Analytics workspace using the Kusto query language.
articles/ai-services/openai/how-to/on-your-data-configuration.md
Diff
@@ -164,7 +164,7 @@ This step can be skipped only if you have a [shared private link](#create-shared
You can disable public network access of your Azure OpenAI resource in the Azure portal.
-To allow access to your Azure OpenAI from your client machines, like using [Azure AI Foundry portal](https://ai.azure.com/), you need to create [private endpoint connections](/azure/ai-services/cognitive-services-virtual-networks?tabs=portal#use-private-endpoints) that connect to your Azure OpenAI resource.
+To allow access to your Azure OpenAI from your client machines, like using [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), you need to create [private endpoint connections](/azure/ai-services/cognitive-services-virtual-networks?tabs=portal#use-private-endpoints) that connect to your Azure OpenAI resource.
## Configure Azure AI Search
@@ -188,7 +188,7 @@ For more information, see the [Azure AI Search RBAC article](/azure/search/searc
You can disable public network access of your Azure AI Search resource in the Azure portal.
-To allow access to your Azure AI Search resource from your client machines, like using [Azure AI Foundry portal](https://ai.azure.com/), you need to create [private endpoint connections](/azure/search/service-create-private-endpoint) that connect to your Azure AI Search resource.
+To allow access to your Azure AI Search resource from your client machines, like using [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), you need to create [private endpoint connections](/azure/search/service-create-private-endpoint) that connect to your Azure AI Search resource.
### Enable trusted service
@@ -242,7 +242,7 @@ In the Azure portal, navigate to your storage account networking tab, choose "Se
You can disable public network access of your Storage Account in the Azure portal.
-To allow access to your Storage Account from your client machines, like using [Azure AI Foundry portal](https://ai.azure.com/), you need to create [private endpoint connections](/azure/storage/common/storage-private-endpoints) that connect to your blob storage.
+To allow access to your Storage Account from your client machines, like using [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), you need to create [private endpoint connections](/azure/storage/common/storage-private-endpoints) that connect to your blob storage.
@@ -271,9 +271,9 @@ To enable the developers to use these resources to build applications, the admin
|Role| Resource | Description |
|--|--|--|
-| `Cognitive Services OpenAI Contributor` | Azure OpenAI | Call public ingestion API from [Azure AI Foundry portal](https://ai.azure.com/). The `Contributor` role is not enough, because if you only have `Contributor` role, you cannot call data plane API via Microsoft Entra ID authentication, and Microsoft Entra ID authentication is required in the secure setup described in this article. |
-| `Contributor` | Azure AI Search | List API-Keys to list indexes from [Azure AI Foundry portal](https://ai.azure.com/).|
-| `Contributor` | Storage Account | List Account SAS to upload files from [Azure AI Foundry portal](https://ai.azure.com/).|
+| `Cognitive Services OpenAI Contributor` | Azure OpenAI | Call public ingestion API from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). The `Contributor` role is not enough, because if you only have `Contributor` role, you cannot call data plane API via Microsoft Entra ID authentication, and Microsoft Entra ID authentication is required in the secure setup described in this article. |
+| `Contributor` | Azure AI Search | List API-Keys to list indexes from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).|
+| `Contributor` | Storage Account | List Account SAS to upload files from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).|
| `Contributor` | The resource group or Azure subscription where the developer need to deploy the web app to | Deploy web app to the developer's Azure subscription.|
| `Role Based Access Control Administrator` | Azure OpenAI | Permission to configure the necessary role assignment on the Azure OpenAI resource. Enables the web app to call Azure OpenAI. |
@@ -297,7 +297,7 @@ Configure your local machine `hosts` file to point your resources host names to
## Azure AI Foundry portal
-You should be able to use all [Azure AI Foundry portal](https://ai.azure.com/) features, including both ingestion and inference, from your on-premises client machines.
+You should be able to use all [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) features, including both ingestion and inference, from your on-premises client machines.
## Web app
The web app communicates with your Azure OpenAI resource. Since your Azure OpenAI resource has public network disabled, the web app needs to be set up to use the private endpoint in your virtual network to access your Azure OpenAI resource.
@@ -308,7 +308,7 @@ The web app needs to resolve your Azure OpenAI host name to the private IP of th
1. [Add a DNS record](/azure/dns/private-dns-getstarted-portal#create-an-additional-dns-record). The IP is the private IP of the private endpoint for your Azure OpenAI resource, and you can get the IP address from the network interface associated with the private endpoint for your Azure OpenAI.
1. [Link the private DNS zone to your virtual network](/azure/dns/private-dns-getstarted-portal#link-the-virtual-network) so the web app integrated in this virtual network can use this private DNS zone.
-When deploying the web app from [Azure AI Foundry portal](https://ai.azure.com/), select the same location with the virtual network, and select a proper SKU, so it can support the [virtual network integration feature](/azure/app-service/overview-vnet-integration).
+When deploying the web app from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), select the same location with the virtual network, and select a proper SKU, so it can support the [virtual network integration feature](/azure/app-service/overview-vnet-integration).
After the web app is deployed, from the Azure portal networking tab, configure the web app outbound traffic virtual network integration, choose the third subnet that you reserved for web app.
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates several sections of the on-your-data configuration documentation, replacing the Azure AI Foundry portal link "https://ai.azure.com" with "https://ai.azure.com/?cid=learnDocs". The affected passages cover private endpoint connections for Azure OpenAI, Azure AI Search, and Storage Accounts, the role assignments developers need, and deploying the web app that accesses the Azure OpenAI resource. The procedures and other important information are preserved unchanged.
articles/ai-services/openai/how-to/provisioned-get-started.md
Diff
@@ -36,7 +36,7 @@ Creating a new deployment requires available (unused) quota to cover the desired
Then 200 PTUs of quota are considered used, and there are 300 PTUs available for use to create new deployments.
-A default amount of global, data zone, and regional provisioned quota is assigned to eligible subscriptions in several regions. You can view the quota available to you in a region by visiting the Quotas pane in [Azure AI Foundry portal](https://ai.azure.com/) and selecting the desired subscription and region. For example, the screenshot below shows a quota limit of 500 PTUs in West US for the selected subscription. Note that you might see lower values of available default quotas.
+A default amount of global, data zone, and regional provisioned quota is assigned to eligible subscriptions in several regions. You can view the quota available to you in a region by visiting the Quotas pane in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and selecting the desired subscription and region. For example, the screenshot below shows a quota limit of 500 PTUs in West US for the selected subscription. Note that you might see lower values of available default quotas.
:::image type="content" source="../media/provisioned/available-quota.png" alt-text="A screenshot of the available quota in Azure AI Foundry portal." lightbox="../media/provisioned/available-quota.png":::
@@ -57,7 +57,7 @@ Once you have verified your quota, you can create a deployment. To create a prov
-1. Sign into the [Azure AI Foundry portal](https://ai.azure.com).
+1. Sign into the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
1. Choose the subscription that was enabled for provisioned deployments & select the desired resource in a region where you have the quota.
1. Under **Management** in the left-nav select **Deployments**.
1. Select Create new deployment and configure the following fields. Expand the **advanced options** drop-down menu.
@@ -106,7 +106,7 @@ REST, ARM template, Bicep, and Terraform can also be used to create deployments.
Due to the dynamic nature of capacity availability, it is possible that the region of your selected resource might not have the service capacity to create the deployment of the specified model, version, and number of PTUs.
-In this event, the wizard in [Azure AI Foundry portal](https://ai.azure.com/) will direct you to other regions with available quota and capacity to create a deployment of the desired model. If this happens, the deployment dialog will look like this:
+In this event, the wizard in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) will direct you to other regions with available quota and capacity to create a deployment of the desired model. If this happens, the deployment dialog will look like this:
:::image type="content" source="../media/provisioned/deployment-screen-2.png" alt-text="Screenshot of the Azure AI Foundry portal deployment page for a provisioned deployment with no capacity available." lightbox="../media/provisioned/deployment-screen-2.png":::
@@ -164,7 +164,7 @@ The inferencing code for provisioned deployments is the same a standard deployme
## Understanding expected throughput
The amount of throughput that you can achieve on the endpoint is a factor of the number of PTUs deployed, input size, output size, and call rate. The number of concurrent calls and total tokens processed can vary based on these values. Our recommended way for determining the throughput for your deployment is as follows:
-1. Use the Capacity calculator for a sizing estimate. You can find the capacity calculator in [Azure AI Foundry portal](https://ai.azure.com/) under the quotas page and Provisioned tab.
+1. Use the Capacity calculator for a sizing estimate. You can find the capacity calculator in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) under the quotas page and Provisioned tab.
1. Benchmark the load using real traffic workload. For more information about benchmarking, see the [benchmarking](#run-a-benchmark) section.
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates links in several sections of the provisioned deployment getting-started documentation, changing "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs". The affected passages cover viewing available quota, creating a deployment, the wizard's behavior when a region lacks capacity, and using the capacity calculator. The procedures and other information remain intact, so the main content is unaffected.
articles/ai-services/openai/how-to/provisioned-throughput-onboarding.md
Diff
@@ -119,7 +119,7 @@ Choose a model, and click **Confirm**. Select a provision-managed deployment typ
:::image type="content" source="../media/provisioned/deployment-ptu-capacity-calculator.png" alt-text="A screenshot of deployment workflow PTU capacity calculator." lightbox="../media/provisioned/deployment-ptu-capacity-calculator.png":::
-To estimate provisioned capacity using request level data, open the capacity planner in the [Azure AI Foundry](https://ai.azure.com). The capacity calculator is under **Shared resources** > **Model Quota** > **Azure OpenAI Provisioned**.
+To estimate provisioned capacity using request level data, open the capacity planner in the [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs). The capacity calculator is under **Shared resources** > **Model Quota** > **Azure OpenAI Provisioned**.
The **Provisioned** option and the capacity planner are only available in certain regions within the Quota pane, if you don't see this option setting the quota region to *Sweden Central* will make this option available. Enter the following parameters based on your workload.
@@ -144,7 +144,7 @@ The values in the output column are the estimated value of PTU units required fo
Discounts on top of the hourly usage price can be obtained by purchasing an Azure Reservation for Azure OpenAI Provisioned, Data Zone Provisioned, and Global Provisioned. An Azure Reservation is a term-discounting mechanism shared by many Azure products. For example, Compute and Cosmos DB. For Azure OpenAI Provisioned, Data Zone Provisioned, and Global Provisioned, the reservation provides a discount in exchange for committing to payment for fixed number of PTUs for a one-month or one-year period.
-* Azure Reservations are purchased via the Azure portal, not the [Azure AI Foundry portal](https://ai.azure.com/) Link to Azure reservation portal.
+* Azure Reservations are purchased via the Azure portal, not the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) Link to Azure reservation portal.
* Reservations are purchased regionally and can be flexibly scoped to cover usage from a group of deployments. Reservation scopes include:
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates links in the provisioned throughput onboarding documentation from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs". The affected passages cover estimating provisioned capacity with the capacity planner (including the note that it's only available in certain regions) and the statement that Azure Reservations are purchased via the Azure portal, not the Azure AI Foundry portal. Other information and procedures are preserved.
articles/ai-services/openai/how-to/quota.md
Diff
@@ -42,11 +42,11 @@ The flexibility to distribute TPM globally within a subscription and region has
When you create a model deployment, you have the option to assign Tokens-Per-Minute (TPM) to that deployment. TPM can be modified in increments of 1,000, and will map to the TPM and RPM rate limits enforced on your deployment, as discussed above.
-To create a new deployment from within the [Azure AI Foundry portal](https://ai.azure.com/) select **Deployments** > **Deploy model** > **Deploy base model** > **Select Model** > **Confirm**.
+To create a new deployment from within the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) select **Deployments** > **Deploy model** > **Deploy base model** > **Select Model** > **Confirm**.
:::image type="content" source="../media/quota/deployment-new.png" alt-text="Screenshot of the deployment UI of Azure AI Foundry" lightbox="../media/quota/deployment-new.png":::
-Post deployment you can adjust your TPM allocation by selecting and editing your model from the **Deployments** page in [Azure AI Foundry portal](https://ai.azure.com/). You can also modify this setting from the **Management** > **Model quota** page.
+Post deployment you can adjust your TPM allocation by selecting and editing your model from the **Deployments** page in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). You can also modify this setting from the **Management** > **Model quota** page.
> [!IMPORTANT]
> Quotas and limits are subject to change, for the most up-date-information consult our [quotas and limits article](../quotas-limits.md).
@@ -66,7 +66,7 @@ All other model classes have a common max TPM value.
## View and request quota
-For an all up view of your quota allocations across deployments in a given region, select **Management** > **Quota** in [Azure AI Foundry portal](https://ai.azure.com/):
+For an all up view of your quota allocations across deployments in a given region, select **Management** > **Quota** in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs):
:::image type="content" source="../media/quota/quota-new.png" alt-text="Screenshot of the quota UI of Azure AI Foundry" lightbox="../media/quota/quota-new.png":::
Summary
{
"modification_type": "minor update",
"modification_title": "Azure AI Foundryポータルへのリンクの更新"
}
Explanation
This change updates links in the quota documentation from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs". The affected passages cover creating a model deployment, adjusting a deployment's TPM (tokens-per-minute) allocation afterward, and viewing quota allocations across deployments; the steps themselves are unchanged.
articles/ai-services/openai/how-to/reinforcement-fine-tuning.md
Diff
@@ -176,11 +176,11 @@ Models which we're supporting as grader models are:
"model": string,
"pass_threshold": number,
"range": number[],
- "sampling_parameters": {
+ "sampling_params": {
"seed": number,
"top_p": number,
"temperature": number,
- "max_completion_tokens": number,
+ "max_completions_tokens": number,
"reasoning_effort": "low" | "medium" | "high"
}
}
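Under the renamed keys shown in the diff, a grader configuration payload might be assembled as in the following Python sketch. The field names (`model`, `pass_threshold`, `range`, `sampling_params`, and its members) come from the schema above; the specific values are illustrative only:

```python
import json

# Illustrative grader configuration using the renamed keys
# ("sampling_params", "max_completions_tokens"); all values are examples.
grader_config = {
    "model": "gpt-4o",
    "pass_threshold": 0.8,
    "range": [0, 1],
    "sampling_params": {
        "seed": 42,
        "top_p": 1.0,
        "temperature": 0.0,
        "max_completions_tokens": 512,
        "reasoning_effort": "medium",
    },
}

print(json.dumps(grader_config, indent=2))
```

The sketch only shows the shape of the JSON; consult the surrounding article for which grader models and value ranges are actually supported.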
Summary
{
"modification_type": "minor update",
"modification_title": "Renamed sampling parameter keys"
}
Explanation
This change renames the sampling-parameter keys in the reinforcement fine-tuning documentation: `sampling_parameters` becomes `sampling_params`, and `max_completion_tokens` becomes `max_completions_tokens`. The renaming makes the terminology more consistent and the parameter settings easier to follow. This minor update is intended to give users working with the code and configuration greater clarity; all other key information is preserved.
articles/ai-services/openai/how-to/risks-safety-monitor.md
Diff
@@ -14,13 +14,13 @@ manager: nitinme
When you use an Azure OpenAI model deployment with a content filter, you might want to check the results of the filtering activity. You can use that information to further adjust your [filter configuration](/azure/ai-services/openai/how-to/content-filters) to serve your specific business needs and Responsible AI principles.
-[Azure AI Foundry](https://ai.azure.com/) provides a Risks & Safety monitoring dashboard for each of your deployments that uses a content filter configuration.
+[Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) provides a Risks & Safety monitoring dashboard for each of your deployments that uses a content filter configuration.
## Access Risks & Safety monitoring
To access Risks & Safety monitoring, you need an Azure OpenAI resource in one of the supported Azure regions: East US, Switzerland North, France Central, Sweden Central, Canada East. You also need a model deployment that uses a content filter configuration.
-Go to [Azure AI Foundry](https://ai.azure.com/) and sign in with the credentials associated with your Azure OpenAI resource. Select a project. Then select the **Models + endpoints** tab on the left and then select your model deployment from the list. On the deployment's page, select the **Monitoring** tab at the top. Then select **Open in Azure Monitor** to view the full report in the Azure portal.
+Go to [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) and sign in with the credentials associated with your Azure OpenAI resource. Select a project. Then select the **Models + endpoints** tab on the left and then select your model deployment from the list. On the deployment's page, select the **Monitoring** tab at the top. Then select **Open in Azure Monitor** to view the full report in the Azure portal.
## Configure metrics
@@ -56,7 +56,7 @@ To use Potentially abusive user detection, you need:
### Set up your Azure Data Explorer database
In order to protect the data privacy of user information and manage the permission of the data, we support the option for our customers to bring their own storage to get the detailed potentially abusive user detection insights (including user GUID and statistics on harmful request by category) stored in a compliant way and with full control. Follow these steps to enable it:
-1. In [Azure AI Foundry](https://ai.azure.com/), navigate to the model deployment that you'd like to set up user abuse analysis with, and select **Add a data store**.
+1. In [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs), navigate to the model deployment that you'd like to set up user abuse analysis with, and select **Add a data store**.
1. Fill in the required information and select **Save**. We recommend you create a new database to store the analysis results.
1. After you connect the data store, take the following steps to grant permission to write analysis results to the connected database:
1. Go to your Azure OpenAI resource's page in the Azure portal, and choose the **Identity** tab.
Summary
{
"modification_type": "minor update",
"modification_title": "Updated links to the Azure AI Foundry portal"
}
Explanation
This change updates the links to the Azure AI Foundry portal: the original link "https://ai.azure.com" becomes "https://ai.azure.com/?cid=learnDocs", making it easier for users to reach the specific documentation and resources. The update affects the Risks & Safety monitoring section, which covers the monitoring dashboard for model deployments that use a content filter, as well as the instructions on data privacy and configuring potentially abusive user detection. The change is minor; the sources and procedures are unchanged.
articles/ai-services/openai/how-to/role-based-access-control.md
Diff
@@ -48,8 +48,8 @@ If a user were granted role-based access to only this role for an Azure OpenAI r
✅ View the resource in [Azure portal](https://portal.azure.com) <br>
✅ View the resource endpoint under **Keys and Endpoint** <br>
-✅ Ability to view the resource and associated model deployments in [Azure AI Foundry portal](https://ai.azure.com/). <br>
-✅ Ability to view what models are available for deployment in [Azure AI Foundry portal](https://ai.azure.com/). <br>
+✅ Ability to view the resource and associated model deployments in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). <br>
+✅ Ability to view what models are available for deployment in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). <br>
✅ Use the Chat, Completions, and DALL-E (preview) playground experiences to generate text and images with any models that have already been deployed to this Azure OpenAI resource. <br>
✅ Make inference API calls with Microsoft Entra ID.
@@ -91,7 +91,7 @@ This role is typically granted access at the resource group level for a user in
✅ View resources in the assigned resource group in the [Azure portal](https://portal.azure.com). <br>
✅ View the resource endpoint under **Keys and Endpoint** <br>
✅ View/Copy/Regenerate keys under **Keys and Endpoint** <br>
-✅ Ability to view what models are available for deployment in [Azure AI Foundry portal](https://ai.azure.com/) <br>
+✅ Ability to view what models are available for deployment in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) <br>
✅ Use the Chat, Completions, and DALL-E (preview) playground experiences to generate text and images with any models that have already been deployed to this Azure OpenAI resource <br>
✅ Create customized content filters <br>
✅ Add data sources to Azure OpenAI On Your Data. **You must also have the [Cognitive Services OpenAI Contributor](#cognitive-services-openai-contributor) role as well**.
@@ -112,27 +112,27 @@ Viewing quota requires the **Cognitive Services Usages Reader** role. This role
This role can be found in the Azure portal under **Subscriptions** > **Access control (IAM)** > **Add role assignment** > search for **Cognitive Services Usages Reader**. The role must be applied at the subscription level, it does not exist at the resource level.
-If you don't wish to use this role, the subscription **Reader** role provides equivalent access, but it also grants read access beyond the scope of what is needed for viewing quota. Model deployment via the [Azure AI Foundry portal](https://ai.azure.com/) is also partially dependent on the presence of this role.
+If you don't wish to use this role, the subscription **Reader** role provides equivalent access, but it also grants read access beyond the scope of what is needed for viewing quota. Model deployment via the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) is also partially dependent on the presence of this role.
This role provides little value by itself and is instead typically assigned in combination with one or more of the previously described roles.
#### Cognitive Services Usages Reader + Cognitive Services OpenAI User
All the capabilities of Cognitive Services OpenAI User plus the ability to:
-✅ View quota allocations in [Azure AI Foundry portal](https://ai.azure.com/)
+✅ View quota allocations in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs)
#### Cognitive Services Usages Reader + Cognitive Services OpenAI Contributor
All the capabilities of Cognitive Services OpenAI Contributor plus the ability to:
-✅ View quota allocations in [Azure AI Foundry portal](https://ai.azure.com/)
+✅ View quota allocations in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs)
#### Cognitive Services Usages Reader + Cognitive Services Contributor
All the capabilities of Cognitive Services Contributor plus the ability to:
-✅ View & edit quota allocations in [Azure AI Foundry portal](https://ai.azure.com/) <br>
+✅ View & edit quota allocations in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) <br>
✅ Create new model deployments or edit existing model deployments (via Azure AI Foundry) <br>
## Summary
@@ -141,8 +141,8 @@ All the capabilities of Cognitive Services Contributor plus the ability to:
|-------------|--------------------|------------------------|------------------|-------------------------|
|View the resource in Azure portal |✅|✅|✅| ➖ |
|View the resource endpoint under “Keys and Endpoint” |✅|✅|✅| ➖ |
-|View the resource and associated model deployments in [Azure AI Foundry portal](https://ai.azure.com/) |✅|✅|✅| ➖ |
-|View what models are available for deployment in [Azure AI Foundry portal](https://ai.azure.com/)|✅|✅|✅| ➖ |
+|View the resource and associated model deployments in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) |✅|✅|✅| ➖ |
+|View what models are available for deployment in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs)|✅|✅|✅| ➖ |
|Use the Chat, Completions, and DALL-E (preview) playground experiences with any models that have already been deployed to this Azure OpenAI resource.|✅|✅|✅| ➖ |
|Create or edit model deployments|❌|✅|✅| ➖ |
|Create or deploy custom fine-tuned models|❌|✅|✅| ➖ |
@@ -160,7 +160,7 @@ All the capabilities of Cognitive Services Contributor plus the ability to:
**Issue:**
-When selecting an existing Azure Cognitive Search resource the search indices don't load, and the loading wheel spins continuously. In [Azure AI Foundry portal](https://ai.azure.com/), go to **Playground Chat** > **Add your data (preview)** under Assistant setup. Selecting **Add a data source** opens a modal that allows you to add a data source through either Azure Cognitive Search or Blob Storage. Selecting the Azure Cognitive Search option and an existing Azure Cognitive Search resource should load the available Azure Cognitive Search indices to select from.
+When selecting an existing Azure Cognitive Search resource the search indices don't load, and the loading wheel spins continuously. In [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), go to **Playground Chat** > **Add your data (preview)** under Assistant setup. Selecting **Add a data source** opens a modal that allows you to add a data source through either Azure Cognitive Search or Blob Storage. Selecting the Azure Cognitive Search option and an existing Azure Cognitive Search resource should load the available Azure Cognitive Search indices to select from.
**Root cause**
@@ -186,7 +186,7 @@ For this API call, you need a **subscription-level scope** role. You can use the
**Root cause:**
-Insufficient subscription-level access for the user attempting to access the blob storage in [Azure AI Foundry portal](https://ai.azure.com/). The user may **not** have the necessary permissions to call the Azure Management API endpoint: ```https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{accountName}/listAccountSas?api-version=2022-09-01```
+Insufficient subscription-level access for the user attempting to access the blob storage in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). The user may **not** have the necessary permissions to call the Azure Management API endpoint: ```https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{accountName}/listAccountSas?api-version=2022-09-01```
Public access to the blob storage is disabled by the owner of the Azure subscription for security reasons.
Summary
{
"modification_type": "minor update",
"modification_title": "Fixed links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal links in the role-based access control documentation. Specifically, the original URL "https://ai.azure.com" is replaced with "https://ai.azure.com/?cid=learnDocs" so that users can reach the specific documentation resources more easily. The change mainly affects the sections describing resource visibility and model deployment capabilities. Eleven lines were added and removed; the document's overall flow and consistency are preserved while improving usability.
articles/ai-services/openai/how-to/stored-completions.md
Diff
@@ -245,7 +245,7 @@ curl $AZURE_OPENAI_ENDPOINT/openai/deployments/gpt-4o/chat/completions?api-versi
---
-Once stored completions are enabled for an Azure OpenAI deployment, they'll begin to show up in the [Azure AI Foundry portal](https://ai.azure.com) in the **Stored Completions** pane.
+Once stored completions are enabled for an Azure OpenAI deployment, they'll begin to show up in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) in the **Stored Completions** pane.
:::image type="content" source="../media/stored-completions/stored-completions.png" alt-text="Screenshot of the stored completions User Experience." lightbox="../media/stored-completions/stored-completions.png":::
@@ -255,7 +255,7 @@ Distillation allows you to turn your stored completions into a fine-tuning datas
Distillation requires a minimum of 10 stored completions, though it's recommended to provide hundreds to thousands of stored completions for the best results.
-1. From the **Stored Completions** pane in the [Azure AI Foundry portal](https://ai.azure.com) use the **Filter** options to select the completions you want to train your model with.
+1. From the **Stored Completions** pane in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) use the **Filter** options to select the completions you want to train your model with.
2. To begin distillation, select **Distill**
@@ -284,7 +284,7 @@ The [evaluation](./evaluations.md) of large language models is a critical step i
Stored completions can be used as a dataset for running evaluations.
-1. From the **Stored Completions** pane in the [Azure AI Foundry portal](https://ai.azure.com) use the **Filter** options to select the completions you want to be part of your evaluation dataset.
+1. From the **Stored Completions** pane in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) use the **Filter** options to select the completions you want to be part of your evaluation dataset.
2. To configure the evaluation, select **Evaluate**
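The curl call referenced at the top of this diff targets the chat completions endpoint. As a rough Python sketch, a request body that opts a completion into storage might look like the following; the `store` and `metadata` fields are assumptions based on the stored completions feature, and the model and message contents are placeholders:

```python
import json

# Illustrative chat completions request body that opts into stored
# completions. "store" and "metadata" are assumptions based on the
# stored completions feature; the messages are placeholders.
request_body = {
    "model": "gpt-4o",
    "store": True,
    "metadata": {"user": "example-user", "scenario": "docs-sample"},
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}

print(json.dumps(request_body))
```

Completions stored this way would then surface in the **Stored Completions** pane described above, where they can be filtered for distillation or evaluation.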
Summary
{
"modification_type": "minor update",
"modification_title": "Fixed links to the Azure AI Foundry portal"
}
Explanation
This change updates the links in the stored completions documentation from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs". It clarifies that once stored completions are enabled for an Azure OpenAI deployment, the related information appears in the **Stored Completions** pane of the Azure AI Foundry portal, and it applies the new link to the steps for building an evaluation dataset from stored completions. The change improves the clarity and usability of the documentation.
articles/ai-services/openai/how-to/use-blocklists.md
Diff
@@ -62,7 +62,7 @@ The response code should be `201` (created a new list) or `200` (updated an exis
### Apply a blocklist to a content filter
-If you haven't yet created a content filter, you can do so in [Azure AI Foundry](https://ai.azure.com/). See [Content filtering](/azure/ai-services/openai/how-to/content-filters#create-a-content-filter-in-azure-ai-foundry).
+If you haven't yet created a content filter, you can do so in [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs). See [Content filtering](/azure/ai-services/openai/how-to/content-filters#create-a-content-filter-in-azure-ai-foundry).
To apply a **completion** blocklist to a content filter, use the following cURL command:
Summary
{
"modification_type": "minor update",
"modification_title": "Fixed links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry link in the section on creating a content filter from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs", giving users more targeted information when creating a content filter. The update improves clarity and keeps the documentation consistent; only a few lines were added and removed.
articles/ai-services/openai/how-to/use-web-app.md
Diff
@@ -17,7 +17,7 @@ recommendations: false
> [!NOTE]
> The web app and its [source code](https://github.com/microsoft/sample-app-aoai-chatGPT) are provided "as is" and as a sample only. Customers are responsible for all customization and implementation of their web apps. See the support section for the web app on [GitHub](https://github.com/microsoft/sample-app-aoai-chatGPT/blob/main/SUPPORT.md) for more information.
-Along with [Azure AI Foundry portal](https://ai.azure.com/), APIs, and SDKs, you can use the customizable standalone web app to interact with Azure OpenAI models by using a graphical user interface. Key features include:
+Along with [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), APIs, and SDKs, you can use the customizable standalone web app to interact with Azure OpenAI models by using a graphical user interface. Key features include:
* Connectivity with multiple data sources to support rich querying and retrieval-augmented generation, including Azure AI Search, Prompt Flow, and more.
* Conversation history and user feedback collection through Cosmos DB.
* Authentication with role-based access control via Microsoft Entra ID.
@@ -119,7 +119,7 @@ To modify the application user interface, follow the instructions in the previou
You can turn on chat history for your users of the web app. When you turn on the feature, users have access to their individual previous queries and responses.
-To turn on chat history, deploy or redeploy your model as a web app by using [Azure AI Foundry portal](https://ai.azure.com/) and select **Enable chat history and user feedback in the web app**.
+To turn on chat history, deploy or redeploy your model as a web app by using [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and select **Enable chat history and user feedback in the web app**.
:::image type="content" source="../media/use-your-data/enable-chat-history.png" alt-text="Screenshot of the checkbox for enabling chat history in Azure OpenAI or Azure AI Foundry." lightbox="../media/use-your-data/enable-chat-history.png":::
@@ -191,15 +191,15 @@ To connect to Azure AI Search without redeploying your app, you can modify the f
- `AZURE_SEARCH_ENABLE_IN_DOMAIN`: Limits responses to queries related only to your data.
- Data type: boolean, should be set to `True`.
- `AZURE_SEARCH_CONTENT_COLUMNS`: Specifies the list of fields in your Azure AI Search index that contain the text content of your documents, used when formulating a bot response.
- - Data type: text, defaults to `content` if deployed from [Azure AI Foundry portal](https://ai.azure.com/),
+ - Data type: text, defaults to `content` if deployed from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs),
- `AZURE_SEARCH_FILENAME_COLUMN`: Specifies the field from your Azure AI Search index that provides a unique identifier of the source data to display in the UI.
- - Data type: text, defaults to `filepath` if deployed from [Azure AI Foundry portal](https://ai.azure.com/),
+ - Data type: text, defaults to `filepath` if deployed from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs),
- `AZURE_SEARCH_TITLE_COLUMN`: Specifies the field from your Azure AI Search index that provides a relevant title or header for your data content to display in the UI.
- - Data type: text, defaults to `title` if deployed from [Azure AI Foundry portal](https://ai.azure.com/),
+ - Data type: text, defaults to `title` if deployed from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs),
- `AZURE_SEARCH_URL_COLUMN`: Specifies the field from your Azure AI Search index that contains a URL for the document.
- - Data type: text, defaults to `url` if deployed from [Azure AI Foundry portal](https://ai.azure.com/),
+ - Data type: text, defaults to `url` if deployed from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs),
- `AZURE_SEARCH_VECTOR_COLUMNS`: Specifies the list of fields in your Azure AI Search index that contain vector embeddings of your documents, used when formulating a bot response.
- - Data type: text, defaults to `contentVector` if deployed from [Azure AI Foundry portal](https://ai.azure.com/),
+ - Data type: text, defaults to `contentVector` if deployed from [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs),
- `AZURE_SEARCH_QUERY_TYPE`: Specifies the query type to use: `simple`, `semantic`, `vector`, `vectorSimpleHybrid`, or `vectorSemanticHybrid`. This setting takes precedence over `AZURE_SEARCH_USE_SEMANTIC_SEARCH`.
- Data type: text, we recommend testing with `vectorSemanticHybrid`.
- `AZURE_SEARCH_PERMITTED_GROUPS_COLUMN`: Specifies the field from your Azure AI Search index that contains Microsoft Entra group IDs, determining document-level access control.
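Reading these app settings in code could look like the following sketch. The environment variable names and the `content`/`filepath`/`title`/`url`/`contentVector` portal-deployment defaults come from the list above; the helper function itself is a hypothetical illustration, not part of the sample app:

```python
import os

# Read the Azure AI Search app settings described above, falling back to
# the defaults used when the app is deployed from the Azure AI Foundry
# portal. The helper is illustrative, not part of the sample web app.
def read_search_settings() -> dict:
    return {
        "enable_in_domain": os.getenv("AZURE_SEARCH_ENABLE_IN_DOMAIN", "True") == "True",
        "content_columns": os.getenv("AZURE_SEARCH_CONTENT_COLUMNS", "content"),
        "filename_column": os.getenv("AZURE_SEARCH_FILENAME_COLUMN", "filepath"),
        "title_column": os.getenv("AZURE_SEARCH_TITLE_COLUMN", "title"),
        "url_column": os.getenv("AZURE_SEARCH_URL_COLUMN", "url"),
        "vector_columns": os.getenv("AZURE_SEARCH_VECTOR_COLUMNS", "contentVector"),
        "query_type": os.getenv("AZURE_SEARCH_QUERY_TYPE", "vectorSemanticHybrid"),
    }

settings = read_search_settings()
print(settings["query_type"])
```

Because each setting has a sensible default, only the variables you want to override need to be set on the app service.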
@@ -294,7 +294,7 @@ The JSON to paste in the Advanced edit JSON editor is:
### Creating and deploying your prompt flow in Azure AI Foundry portal
-Follow [this tutorial](../../../ai-foundry/how-to/flow-deploy.md) to create, test, and deploy an inferencing endpoint for your prompt flow in [Azure AI Foundry portal](https://ai.azure.com/).
+Follow [this tutorial](../../../ai-foundry/how-to/flow-deploy.md) to create, test, and deploy an inferencing endpoint for your prompt flow in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
### Enable underlying citations from your prompt flow
@@ -404,7 +404,7 @@ If you customized or changed the app's source code, you need to update your app'
## Deleting your Cosmos DB instance
-Deleting your web app doesn't delete your Cosmos DB instance automatically. To delete your Cosmos DB instance along with all stored chats, you need to go to the associated resource in the [Azure portal](https://portal.azure.com) and delete it. If you delete the Cosmos DB resource but keep the chat history option selected on subsequent updates from the [Azure AI Foundry portal](https://ai.azure.com/), the application notifies the user of a connection error. However, the user can continue to use the web app without access to the chat history.
+Deleting your web app doesn't delete your Cosmos DB instance automatically. To delete your Cosmos DB instance along with all stored chats, you need to go to the associated resource in the [Azure portal](https://portal.azure.com) and delete it. If you delete the Cosmos DB resource but keep the chat history option selected on subsequent updates from the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), the application notifies the user of a connection error. However, the user can continue to use the web app without access to the chat history.
## Enabling Microsoft Entra ID authentication between services
Summary
{
"modification_type": "minor update",
"modification_title": "Fixed links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal links in the documentation for the web app used with Azure OpenAI from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs". The affected passages cover enabling chat history and the features of the customizable web app. Additions and deletions are equal in number; the update improves the document's consistency and clarity.
articles/ai-services/openai/how-to/weights-and-biases-integration.md
Diff
@@ -87,7 +87,7 @@ Give your Azure OpenAI resource the **Key Vault Secrets Officer** role.
## Link Weights & Biases with Azure OpenAI
-1. Navigate to the [Azure AI Foundry portal](https://ai.azure.com) and select your Azure OpenAI fine-tuning resource.
+1. Navigate to the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and select your Azure OpenAI fine-tuning resource.
:::image type="content" source="../media/how-to/weights-and-biases/manage-integrations.png" alt-text="Screenshot of the manage integrations button." lightbox="../media/how-to/weights-and-biases/manage-integrations.png":::
Summary
{
"modification_type": "minor update",
"modification_title": "Fixed links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal link in the Weights & Biases integration documentation from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs", improving navigation and keeping the documentation consistent. The change is a minor addition and deletion that improves clarity.
articles/ai-services/openai/how-to/work-with-code.md
Diff
@@ -27,7 +27,7 @@ You can use Codex for a variety of tasks including:
## How to use completions models with code
-Here are a few examples of using completion models that can be tested in the [Azure AI Foundry](https://ai.azure.com) playground with a deployment of `gpt-35-turbo-instruct`.
+Here are a few examples of using completion models that can be tested in the [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) playground with a deployment of `gpt-35-turbo-instruct`.
### Saying "Hello" (Python)
Summary
{
"modification_type": "minor update",
"modification_title": "Fixed links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry playground link in the work-with-code documentation from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs", so users reach more targeted information when following the link. It is a minor addition and deletion that improves consistency and convenience.
articles/ai-services/openai/how-to/working-with-models.md
Diff
@@ -20,7 +20,7 @@ You can get a list of models that are available for both inference and fine-tuni
## Model updates
-Azure OpenAI now supports automatic updates for select model deployments. On models where automatic update support is available, a model version drop-down is visible in [Azure AI Foundry portal](https://ai.azure.com/) under **Deployments** and **Edit**:
+Azure OpenAI now supports automatic updates for select model deployments. On models where automatic update support is available, a model version drop-down is visible in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) under **Deployments** and **Edit**:
:::image type="content" source="../media/models/auto-update-new.png" alt-text="Screenshot of the deploy model UI in the Azure AI Foundry portal." lightbox="../media/models/auto-update-new.png":::
@@ -43,13 +43,13 @@ When you select a specific model version for a deployment, this version remains
## Viewing retirement dates
-For currently deployed models, in the [Azure AI Foundry portal](https://ai.azure.com/) select **Deployments**:
+For currently deployed models, in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) select **Deployments**:
:::image type="content" source="../media/models/deployments-new.png" alt-text="Screenshot of the deployment UI of the Azure AI Foundry portal." lightbox="../media/models/deployments-new.png":::
## Model deployment upgrade configuration
-You can check what model upgrade options are set for previously deployed models in the [Azure AI Foundry portal](https://ai.azure.com). Select **Deployments** > Under the deployment name column select one of the deployment names that are highlighted in blue.
+You can check what model upgrade options are set for previously deployed models in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs). Select **Deployments** > Under the deployment name column select one of the deployment names that are highlighted in blue.
Selecting a deployment name opens the **Properties** for the model deployment. You can view what upgrade options are set for your deployment under **Version update policy**:
Summary
{
"modification_type": "minor update",
"modification_title": "Fixed links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal links in the models documentation in several places from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs", making information about model deployments easier to reach and keeping the document consistent. The update also touches the surrounding text, strengthening the clarity of the information.
articles/ai-services/openai/includes/audio-completions-ai-foundry.md
Diff
@@ -15,10 +15,10 @@ ms.date: 1/7/2025
## Use GPT-4o audio generation
-To chat with your deployed `gpt-4o-mini-audio-preview` model in the **Chat** playground of [Azure AI Foundry portal](https://ai.azure.com), follow these steps:
+To chat with your deployed `gpt-4o-mini-audio-preview` model in the **Chat** playground of [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), follow these steps:
-1. Go to the [Azure AI Foundry portal](https://ai.azure.com) and select your project that has your deployed `gpt-4o-mini-audio-preview` model.
-1. Go to your project in [Azure AI Foundry](https://ai.azure.com).
+1. Go to the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and select your project that has your deployed `gpt-4o-mini-audio-preview` model.
+1. Go to your project in [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs).
1. Select **Playgrounds** from the left pane.
1. Select **Audio playground** > **Try the Chat playground**.
Summary
{
"modification_type": "minor update",
"modification_title": "Fixed links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal links in the audio generation documentation from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs", making it easier for users to reach their project. The links inside the numbered steps are updated as well, so the information is easier to verify while following the instructions to chat with the model. Overall, the change improves the document's consistency and convenience.
articles/ai-services/openai/includes/audio-completions-deploy-model.md
Diff
@@ -8,7 +8,7 @@ ms.date: 5/23/2025
---
To deploy the `gpt-4o-mini-audio-preview` model in the Azure AI Foundry portal:
-1. Go to the [Azure AI Foundry portal](https://ai.azure.com) and create or select your project.
+1. Go to the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and create or select your project.
1. Select **Models + endpoints** from under **My assets** in the left pane.
1. Select **+ Deploy model** > **Deploy base model** to open the deployment window.
1. Search for and select the `gpt-4o-mini-audio-preview` model and then select **Confirm**.
Summary
{
"modification_type": "minor update",
"modification_title": "Fixed links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal link in the audio model deployment steps from "https://ai.azure.com" to "https://ai.azure.com/?cid=learnDocs", improving convenience when accessing a project. The overall flow of the steps is preserved, keeping the deployment experience smooth and the documentation consistent.
articles/ai-services/openai/includes/batch/batch-studio.md
Diff
@@ -74,7 +74,7 @@ For this article, we'll create a file named `test.jsonl` and will copy the conte
Once your input file is prepared, you first need to upload the file to then be able to initiate a batch job. File upload can be done both programmatically or via the Azure AI Foundry portal. This example demonstrates uploading a file directly to your Azure OpenAI resource. Alternatively, you can [configure Azure Blob Storage for Azure OpenAI Batch](../../how-to/batch-blob-storage.md).
-1. Sign in to [Azure AI Foundry portal](https://ai.azure.com).
+1. Sign in to [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
2. Select the Azure OpenAI resource where you have a global batch model deployment available.
3. Select **Batch jobs** > **+Create batch jobs**.
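Before the upload step above, the `test.jsonl` input file must contain one request per line. A minimal sketch of writing such a file follows; the `custom_id`/`method`/`url`/`body` line shape is an assumption about the global batch input format, and `gpt-4o-batch` is a placeholder deployment name:

```python
import json

# Write a minimal batch input file: one JSON request object per line.
# The line shape (custom_id/method/url/body) is an assumption about the
# global batch input format; "gpt-4o-batch" is a placeholder deployment.
requests = [
    {
        "custom_id": f"task-{i}",
        "method": "POST",
        "url": "/chat/completions",
        "body": {
            "model": "gpt-4o-batch",
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    for i, prompt in enumerate(["Hello", "Summarize batch processing"])
]

with open("test.jsonl", "w", encoding="utf-8") as f:
    for request in requests:
        f.write(json.dumps(request) + "\n")
```

Each `custom_id` must be unique within the file so results can be matched back to their originating requests.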
Summary
{
"modification_type": "minor update",
"modification_title": "Fixed links to the Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal link in the batch job creation steps, replacing the old link "https://ai.azure.com" with "https://ai.azure.com/?cid=learnDocs" so that users can reach their project more efficiently. The steps themselves remain consistent and easy to follow, contributing to an improved overall user experience.
articles/ai-services/openai/includes/chatgpt-studio.md
Diff
@@ -15,7 +15,7 @@ ms.date: 09/19/2024
## Go to Azure AI Foundry
-Navigate to the [Azure AI Foundry portal](https://ai.azure.com/) and sign-in with credentials that have access to your Azure OpenAI resource. During or after the sign-in workflow, select the appropriate directory, Azure subscription, and Azure OpenAI resource.
+Navigate to the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and sign-in with credentials that have access to your Azure OpenAI resource. During or after the sign-in workflow, select the appropriate directory, Azure subscription, and Azure OpenAI resource.
From Azure AI Foundry, select **Chat playground**.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal link in the chat studio quickstart from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs, easing access to the Azure OpenAI resource at sign-in. The rest of the procedure is unchanged.
articles/ai-services/openai/includes/connect-your-data-studio.md
Diff
@@ -16,7 +16,7 @@ recommendations: false
> [!TIP]
> You can [use the Azure Developer CLI](../how-to/azure-developer-cli.md) to programmatically create the resources needed for Azure OpenAI On Your Data
-Navigate to [Azure AI Foundry portal](https://ai.azure.com/) and sign-in with credentials that have access to your Azure OpenAI resource.
+Navigate to [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and sign-in with credentials that have access to your Azure OpenAI resource.
1. You can either [create an Azure AI Foundry project](../../../ai-foundry/how-to/create-projects.md) by clicking **Create project**, or continue directly by clicking the button on the **Focused on Azure OpenAI in Azure AI Foundry Models** tile.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal link in the "connect your data" instructions from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The steps remain clear and still offer both paths: creating an Azure AI Foundry project or continuing directly from the Azure OpenAI tile.
articles/ai-services/openai/includes/create-resource-portal.md
Diff
@@ -96,7 +96,7 @@ Before you can generate text or inference, you need to deploy a model. You can s
To deploy a model, follow these steps:
-1. Sign in to [Azure AI Foundry portal](https://ai.azure.com).
+1. Sign in to [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
2. Choose the subscription and the Azure OpenAI resource to work with, and select **Use resource**.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates the sign-in link in the resource-creation instructions from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The model deployment steps are otherwise unchanged and remain concise.
articles/ai-services/openai/includes/dall-e-studio.md
Diff
@@ -20,7 +20,7 @@ Use this guide to get started generating images with Azure OpenAI in your browse
## Go to Azure AI Foundry
-Browse to [Azure AI Foundry](https://ai.azure.com/) and sign in with the credentials associated with your Azure OpenAI resource. During or after the sign-in workflow, select the appropriate directory, Azure subscription, and Azure OpenAI resource.
+Browse to [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) and sign in with the credentials associated with your Azure OpenAI resource. During or after the sign-in workflow, select the appropriate directory, Azure subscription, and Azure OpenAI resource.
From the Azure AI Foundry landing page, create or select a new project. Navigate to the **Models + endpoints** page on the left nav. Select **Deploy model** and then choose one of the DALL-E models from the list. Complete the deployment process.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry link in the DALL-E studio quickstart from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The follow-on steps, choosing the appropriate directory, Azure subscription, and Azure OpenAI resource after sign-in and then deploying a DALL-E model, are unchanged.
articles/ai-services/openai/includes/fine-tuning-openai-in-ai-studio.md
Diff
@@ -98,7 +98,7 @@ In general, doubling the dataset size can lead to a linear increase in model qua
To fine-tune an Azure OpenAI model in an existing Azure AI Foundry project, follow these steps:
-1. Sign in to [Azure AI Foundry](https://ai.azure.com) and select your project. If you don't have a project already, first [create a project](../../../ai-foundry/how-to/create-projects.md).
+1. Sign in to [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) and select your project. If you don't have a project already, first [create a project](../../../ai-foundry/how-to/create-projects.md).
1. From the collapsible left menu, select **Fine-tuning** > **+ Fine-tune model**.
@@ -212,7 +212,7 @@ You can monitor the progress of your deployment on the **Deployments** page in A
### Use a deployed fine-tuned model
-After your fine-tuned model deploys, you can use it like any other deployed model. You can use the **Playground** in [Azure AI Foundry](https://ai.azure.com) to experiment with your new deployment. You can also use the REST API to call your fine-tuned model from your own application. You can even begin to use this new fine-tuned model in your prompt flow to build your generative AI application.
+After your fine-tuned model deploys, you can use it like any other deployed model. You can use the **Playground** in [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) to experiment with your new deployment. You can also use the REST API to call your fine-tuned model from your own application. You can even begin to use this new fine-tuned model in your prompt flow to build your generative AI application.
> [!NOTE]
> For chat models, the [system message that you use to guide your fine-tuned model](../concepts/system-message.md) (whether it's deployed or available for testing in the playground) must be the same as the system message you used for training. If you use a different system message, the model might not perform as expected.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates two Azure AI Foundry links in the fine-tuning instructions from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs: the sign-in step and the pointer to the Playground for experimenting with a deployed fine-tuned model. The fine-tuning workflow itself is unchanged.
articles/ai-services/openai/includes/fine-tuning-studio.md
Diff
@@ -245,7 +245,7 @@ If you're ready to deploy for production or have particular data residency needs
### Use a deployed fine-tuned model
-After your fine-tuned model deploys, you can use it like any other deployed model. You can use the **Playground** in [Azure AI Foundry](https://ai.azure.com) to experiment with your new deployment. You can also use the REST API to call your fine-tuned model from your own application. You can even begin to use this new fine-tuned model in your prompt flow to build your generative AI application.
+After your fine-tuned model deploys, you can use it like any other deployed model. You can use the **Playground** in [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) to experiment with your new deployment. You can also use the REST API to call your fine-tuned model from your own application. You can even begin to use this new fine-tuned model in your prompt flow to build your generative AI application.
> [!NOTE]
> For chat models, the [system message that you use to guide your fine-tuned model](../concepts/system-message.md) (whether it's deployed or available for testing in the playground) must be the same as the system message you used for training. If you use a different system message, the model might not perform as expected.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry link in the "use a deployed fine-tuned model" section from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The section still notes that a fine-tuned model can be used like any other deployed model, including via the REST API from your own application or in prompt flow.
articles/ai-services/openai/includes/fine-tuning-unified.md
Diff
@@ -14,7 +14,7 @@ ms.custom:
There are two unique fine-tuning experiences in the Azure AI Foundry portal:
-* [Hub/Project view](https://ai.azure.com) - supports fine-tuning models from multiple providers including Azure OpenAI, Meta Llama, Microsoft Phi, etc.
+* [Hub/Project view](https://ai.azure.com/?cid=learnDocs) - supports fine-tuning models from multiple providers including Azure OpenAI, Meta Llama, Microsoft Phi, etc.
* [Azure OpenAI centric view](https://ai.azure.com/resource/overview) - only supports fine-tuning Azure OpenAI models, but has support for additional features like the [Weights & Biases (W&B) preview integration](../how-to/weights-and-biases-integration.md).
If you are only fine-tuning Azure OpenAI models, we recommend the Azure OpenAI centric fine-tuning experience which is available by navigating to [https://ai.azure.com/resource/overview](https://ai.azure.com/resource/overview).
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates the "Hub/Project view" link in the unified fine-tuning overview from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The document still describes the two fine-tuning experiences in the Azure AI Foundry portal: the Hub/Project view, which supports models from multiple providers, and the Azure OpenAI centric view, which supports only Azure OpenAI models but adds features such as the Weights & Biases preview integration.
articles/ai-services/openai/includes/gpt-v-studio.md
Diff
@@ -10,7 +10,7 @@ ms.custom: references_regions, ignite-2024
ms.date: 01/29/2025
---
-Use this article to get started using [Azure AI Foundry](https://ai.azure.com) to deploy and test a chat completion model with image understanding.
+Use this article to get started using [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) to deploy and test a chat completion model with image understanding.
## Prerequisites
@@ -28,7 +28,7 @@ You need an image to complete this quickstart. You can use this sample image or
## Go to Azure AI Foundry
-1. Browse to [Azure AI Foundry](https://ai.azure.com/) and sign in with the credentials associated with your Azure OpenAI resource. During or after the sign-in workflow, select the appropriate directory, Azure subscription, and Azure OpenAI resource.
+1. Browse to [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) and sign in with the credentials associated with your Azure OpenAI resource. During or after the sign-in workflow, select the appropriate directory, Azure subscription, and Azure OpenAI resource.
1. Select the project you'd like to work in.
1. On the left nav menu, select **Models + endpoints** and select **+ Deploy model**.
1. Choose an image-capable deployment by selecting model name: **gpt-4o** or **gpt-4o-mini**. In the window that appears, select a name and deployment type. Make sure your Azure OpenAI resource is connected. For more information about model deployment, see the [resource deployment guide](/azure/ai-services/openai/how-to/create-resource).
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates two Azure AI Foundry links in the image-understanding quickstart, the introductory link and the sign-in step, from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The instructions for deploying an image-capable model such as `gpt-4o` or `gpt-4o-mini` are unchanged.
articles/ai-services/openai/includes/realtime-deploy-model.md
Diff
@@ -8,7 +8,7 @@ ms.date: 1/21/2025
---
To deploy the `gpt-4o-mini-realtime-preview` model in the Azure AI Foundry portal:
-1. Go to the [Azure AI Foundry portal](https://ai.azure.com) and create or select your project.
+1. Go to the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and create or select your project.
1. Select **Models + endpoints** from under **My assets** in the left pane.
1. Select **+ Deploy model** > **Deploy base model** to open the deployment window.
1. Search for and select the `gpt-4o-mini-realtime-preview` model and then select **Confirm**.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal link in the `gpt-4o-mini-realtime-preview` deployment steps from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The deployment flow, creating or selecting a project, opening **Models + endpoints**, and deploying the base model, is unchanged.
articles/ai-services/openai/includes/realtime-portal.md
Diff
@@ -13,9 +13,9 @@ ms.date: 3/20/2025
## Use the GPT-4o real-time audio
-To chat with your deployed `gpt-4o-mini-realtime-preview` model in the [Azure AI Foundry](https://ai.azure.com) **Real-time audio** playground, follow these steps:
+To chat with your deployed `gpt-4o-mini-realtime-preview` model in the [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) **Real-time audio** playground, follow these steps:
-1. Go to the [Azure AI Foundry portal](https://ai.azure.com) and select your project that has your deployed `gpt-4o-mini-realtime-preview` model.
+1. Go to the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and select your project that has your deployed `gpt-4o-mini-realtime-preview` model.
1. Select **Playgrounds** from the left pane.
1. Select **Audio playground** > **Try the Audio playground**.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates two Azure AI Foundry links in the real-time audio playground instructions from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The steps for chatting with a deployed `gpt-4o-mini-realtime-preview` model, selecting **Playgrounds** from the left pane and opening the Audio playground, are unchanged.
articles/ai-services/openai/includes/studio.md
Diff
@@ -39,7 +39,7 @@ In the Completions playground you can also view Python and curl code samples pre
To use the Azure OpenAI for text summarization in the Completions playground, follow these steps:
-1. Sign in to the [Azure AI Foundry portal](https://ai.azure.com).
+1. Sign in to the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
1. Select the subscription and OpenAI resource to work with.
1. Select **Completions playground** on the landing page.
1. Select your deployment from the **Deployments** dropdown. If your resource doesn't have a deployment, select **Create a deployment** and then revisit this step.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates the sign-in link in the Completions playground instructions from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The text-summarization steps, choosing a subscription and OpenAI resource, opening the Completions playground, and selecting a deployment, are unchanged.
articles/ai-services/openai/overview.md
Diff
@@ -24,7 +24,7 @@ Azure OpenAI provides REST API access to OpenAI's powerful language models inclu
| Price | [Available here](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/) <br> For details on vision-enabled chat models, see the [special pricing information](../openai/concepts/gpt-with-vision.md#special-pricing-information).|
| Virtual network support & private link support | Yes. |
| Managed Identity| Yes, via Microsoft Entra ID |
-| UI experience | [Azure portal](https://portal.azure.com) for account & resource management, <br> [Azure AI Foundry](https://ai.azure.com) for model exploration and fine-tuning |
+| UI experience | [Azure portal](https://portal.azure.com) for account & resource management, <br> [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) for model exploration and fine-tuning |
| Model regional availability | [Model availability](./concepts/models.md) |
| Content filtering | Prompts and completions are evaluated against our content policy with automated systems. High severity content is filtered. |
@@ -41,7 +41,7 @@ Start with the [Create and deploy an Azure OpenAI resource](./how-to/create-reso
1. When you have an Azure OpenAI resource, you can deploy a model such as GPT-4o.
1. When you have a deployed model, you can:
- - Try out the [Azure AI Foundry portal](https://ai.azure.com/) playgrounds to explore the capabilities of the models.
+ - Try out the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) playgrounds to explore the capabilities of the models.
- You can also just start making API calls to the service using the REST API or SDKs.
For example, you can try [real-time audio](./realtime-audio-quickstart.md) and [assistants](./assistants-quickstart.md) in the playgrounds or via code.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates two Azure AI Foundry links in the Azure OpenAI overview, the UI experience row of the feature table and the pointer to the portal playgrounds, from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The rest of the overview, covering pricing, virtual network and private link support, managed identity, and content filtering, is unchanged.
articles/ai-services/openai/quotas-limits.md
Diff
@@ -45,8 +45,8 @@ The following sections provide you with a quick guide to the default quotas and
| Max number of `/chat/completions` functions | 128 |
| Max number of `/chat completions` tools | 128 |
| Maximum number of Provisioned throughput units per deployment | 100,000 |
-| Max files per Assistant/thread | 10,000 when using the API or [Azure AI Foundry portal](https://ai.azure.com/).|
-| Max file size for Assistants & fine-tuning | 512 MB<br/><br/>200 MB via [Azure AI Foundry portal](https://ai.azure.com/) |
+| Max files per Assistant/thread | 10,000 when using the API or [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).|
+| Max file size for Assistants & fine-tuning | 512 MB<br/><br/>200 MB via [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) |
| Max size for all uploaded files for Assistants |200 GB |
| Assistants token limit | 2,000,000 token limit |
| GPT-4o max images per request (# of images in the messages array/conversation history) | 50 |
@@ -197,7 +197,7 @@ M = million | K = thousand
### gpt-4o audio
-The rate limits for each `gpt-4o` audio model deployment are 100 K TPM and 1 K RPM. During the preview, [Azure AI Foundry portal](https://ai.azure.com/) and APIs might inaccurately show different rate limits. Even if you try to set a different rate limit, the actual rate limit is 100 K TPM and 1 K RPM.
+The rate limits for each `gpt-4o` audio model deployment are 100 K TPM and 1 K RPM. During the preview, [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and APIs might inaccurately show different rate limits. Even if you try to set a different rate limit, the actual rate limit is 100 K TPM and 1 K RPM.
| Model|Tier| Quota Limit in tokens per minute (TPM) | Requests per minute |
|---|---|:---:|:---:|
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal links in the quotas and limits document from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs. The limits themselves are unchanged: 10,000 files per Assistant/thread, a 512 MB maximum file size for Assistants and fine-tuning (200 MB via the portal), and 100 K TPM with 1 K RPM for each `gpt-4o` audio model deployment, which the portal and APIs might display inaccurately during the preview.
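The 100 K TPM / 1 K RPM figures above can be reasoned about with a simple sliding-window model. This is only an illustration of how per-minute request and token budgets interact, not Azure's actual server-side enforcement:

```python
from collections import deque

class MinuteWindowLimiter:
    """Track requests and tokens over a rolling 60-second window."""

    def __init__(self, rpm_limit: int, tpm_limit: int):
        self.rpm_limit = rpm_limit
        self.tpm_limit = tpm_limit
        self.events = deque()  # (timestamp, tokens) for accepted requests

    def allow(self, now: float, tokens: int) -> bool:
        # Drop events that have aged out of the 60-second window.
        while self.events and now - self.events[0][0] >= 60:
            self.events.popleft()
        if len(self.events) + 1 > self.rpm_limit:
            return False  # would exceed requests per minute
        if sum(t for _, t in self.events) + tokens > self.tpm_limit:
            return False  # would exceed tokens per minute
        self.events.append((now, tokens))
        return True

# The quoted limits for each gpt-4o audio deployment.
limiter = MinuteWindowLimiter(rpm_limit=1_000, tpm_limit=100_000)
```

A client-side guard like this can avoid 429 responses, but the quoted limits are enforced by the service regardless of what the portal displays during the preview.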
articles/ai-services/openai/tutorials/fine-tune.md
Diff
@@ -34,7 +34,7 @@ In this tutorial you learn how to:
- [Jupyter Notebooks](https://jupyter.org/)
- An Azure OpenAI resource in a [region where `gpt-4o-mini-2024-07-18` fine-tuning is available](../concepts/models.md). If you don't have a resource the process of creating one is documented in our resource [deployment guide](../how-to/create-resource.md).
- Fine-tuning access requires **Cognitive Services OpenAI Contributor**.
-- If you don't already have access to view quota and deploy models in [Azure AI Foundry portal](https://ai.azure.com/), then you need [more permissions](../how-to/role-based-access-control.md).
+- If you don't already have access to view quota and deploy models in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), then you need [more permissions](../how-to/role-based-access-control.md).
> [!IMPORTANT]
> We recommend reviewing the [pricing information](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/#pricing) for fine-tuning to familiarize yourself with the associated costs. Testing of this tutorial resulted in 48,000 tokens being billed (4,800 training tokens * 10 epochs of training). Training costs are in addition to the costs that are associated with fine-tuning inference, and the hourly hosting costs of having a fine-tuned model deployed. Once you have completed the tutorial, you should delete your fine-tuned model deployment otherwise you continue to incur the hourly hosting cost.
@@ -694,7 +694,7 @@ fine_tuned_model = response.fine_tuned_model
Unlike the previous Python SDK commands in this tutorial, since the introduction of the quota feature, model deployment must be done using the [REST API](/rest/api/aiservices/accountmanagement/deployments/create-or-update?tabs=HTTP), which requires separate authorization, a different API path, and a different API version.
-Alternatively, you can deploy your fine-tuned model using any of the other common deployment methods like [Azure AI Foundry portal](https://ai.azure.com/), or [Azure CLI](/cli/azure/cognitiveservices/account/deployment#az-cognitiveservices-account-deployment-create()).
+Alternatively, you can deploy your fine-tuned model using any of the other common deployment methods like [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), or [Azure CLI](/cli/azure/cognitiveservices/account/deployment#az-cognitiveservices-account-deployment-create()).
|variable | Definition|
|--------------|-----------|
@@ -745,13 +745,13 @@ print(r.reason)
print(r.json())
```
-You can check on your deployment progress in the [Azure AI Foundry portal](https://ai.azure.com/).
+You can check on your deployment progress in the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs).
It isn't uncommon for this process to take some time to complete when dealing with deploying fine-tuned models.
## Use a deployed customized model
-After your fine-tuned model is deployed, you can use it like any other deployed model in either the [Chat Playground of Azure AI Foundry portal](https://ai.azure.com), or via the chat completion API. For example, you can send a chat completion call to your deployed model, as shown in the following Python example. You can continue to use the same parameters with your customized model, such as temperature and max_tokens, as you can with other deployed models.
+After your fine-tuned model is deployed, you can use it like any other deployed model in either the [Chat Playground of Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), or via the chat completion API. For example, you can send a chat completion call to your deployed model, as shown in the following Python example. You can continue to use the same parameters with your customized model, such as temperature and max_tokens, as you can with other deployed models.
```python
# Use the deployed customized model
@@ -784,7 +784,7 @@ Unlike other types of Azure OpenAI models, fine-tuned/customized models have [an
Deleting the deployment won't affect the model itself, so you can re-deploy the fine-tuned model that you trained for this tutorial at any time.
-You can delete the deployment in [Azure AI Foundry portal](https://ai.azure.com/), via [REST API](/rest/api/aiservices/accountmanagement/deployments/delete?tabs=HTTP), [Azure CLI](/cli/azure/cognitiveservices/account/deployment#az-cognitiveservices-account-deployment-delete()), or other supported deployment methods.
+You can delete the deployment in [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), via [REST API](/rest/api/aiservices/accountmanagement/deployments/delete?tabs=HTTP), [Azure CLI](/cli/azure/cognitiveservices/account/deployment#az-cognitiveservices-account-deployment-delete()), or other supported deployment methods.
## Troubleshooting
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected link to Azure AI Foundry portal"
}
Explanation
This change updates the Azure AI Foundry portal links throughout the fine-tuning tutorial from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs, covering the prerequisites, deployment, playground, and cleanup sections.
The surrounding guidance is unchanged: fine-tuning access requires the Cognitive Services OpenAI Contributor role, the pricing note warns about training, inference, and hourly hosting costs, and, since the introduction of the quota feature, model deployment must be done through the REST API, with the Azure AI Foundry portal and Azure CLI available as alternative deployment methods.
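The tutorial's REST-based deployment step can be sketched as follows. This only constructs the request URL and payload without sending them; the subscription, resource group, resource, deployment, and model names are hypothetical placeholders, and the payload shape and `2023-05-01` API version should be verified against the deployments REST reference linked in the tutorial:

```python
import json

# Hypothetical placeholders - substitute your own values.
subscription = "00000000-0000-0000-0000-000000000000"
resource_group = "my-resource-group"
resource_name = "my-aoai-resource"
deployment_name = "gpt-4o-mini-ft"
fine_tuned_model = "gpt-4o-mini-2024-07-18.ft-abc123"  # returned by the fine-tuning job

# The ARM deployments endpoint uses a different host and auth than the data-plane API.
request_url = (
    "https://management.azure.com"
    f"/subscriptions/{subscription}"
    f"/resourceGroups/{resource_group}"
    "/providers/Microsoft.CognitiveServices"
    f"/accounts/{resource_name}"
    f"/deployments/{deployment_name}"
)
deploy_params = {"api-version": "2023-05-01"}
deploy_data = {
    "sku": {"name": "standard", "capacity": 1},
    "properties": {
        "model": {"format": "OpenAI", "name": fine_tuned_model, "version": "1"}
    },
}

print(request_url)
print(json.dumps(deploy_data, indent=2))
```

Sending this body as a PUT with an ARM bearer token (or using the portal or Azure CLI alternatives the tutorial mentions) creates the deployment, whose progress you can then watch on the Deployments page.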
articles/ai-services/openai/whats-new.md
Diff
@@ -141,7 +141,7 @@ The `gpt-4o-realtime-preview` model version 2024-12-17 is available for global d
- Added support for [prompt caching](./how-to/prompt-caching.md) with the `gpt-4o-realtime-preview` model.
- Added support for new voices. The `gpt-4o-realtime-preview` models now support the following voices: "alloy", "ash", "ballad", "coral", "echo", "sage", "shimmer", "verse".
-- Rate limits are no longer based on connections per minute. Rate limiting is now based on RPM (requests per minute) and TPM (tokens per minute) for the `gpt-4o-realtime-preview` model. The rate limits for each `gpt-4o-realtime-preview` model deployment are 100K TPM and 1K RPM. During the preview, [Azure AI Foundry portal](https://ai.azure.com/) and APIs might inaccurately show different rate limits. Even if you try to set a different rate limit, the actual rate limit will be 100K TPM and 1K RPM.
+- Rate limits are no longer based on connections per minute. Rate limiting is now based on RPM (requests per minute) and TPM (tokens per minute) for the `gpt-4o-realtime-preview` model. The rate limits for each `gpt-4o-realtime-preview` model deployment are 100K TPM and 1K RPM. During the preview, [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and APIs might inaccurately show different rate limits. Even if you try to set a different rate limit, the actual rate limit will be 100K TPM and 1K RPM.
For more information, see the [GPT-4o real-time audio quickstart](realtime-audio-quickstart.md) and the [how-to guide](./how-to/realtime-audio.md).
@@ -271,7 +271,7 @@ Global batch now supports GPT-4o (2024-08-06). See the [global batch getting sta
### Azure OpenAI Studio UX updates
-As of September 19, 2024, when you go to the [Azure OpenAI Studio](https://oai.azure.com/) you no longer see the legacy Azure OpenAI Studio by default. If needed you'll still be able to go back to the previous experience by using the **Switch to the old look** toggle in the top bar of the UI for the next couple of weeks. If you switch back to legacy [Azure AI Foundry portal](https://ai.azure.com/), it helps if you fill out the feedback form to let us know why. We're actively monitoring this feedback to improve the new experience.
+As of September 19, 2024, when you go to the [Azure OpenAI Studio](https://oai.azure.com/) you no longer see the legacy Azure OpenAI Studio by default. If needed you'll still be able to go back to the previous experience by using the **Switch to the old look** toggle in the top bar of the UI for the next couple of weeks. If you switch back to legacy [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs), it helps if you fill out the feedback form to let us know why. We're actively monitoring this feedback to improve the new experience.
### GPT-4o 2024-08-06 provisioned deployments
@@ -314,7 +314,7 @@ OpenAI has incorporated additional safety measures into the `o1` models, includi
### Availability
-The `o1-preview` and `o1-mini` are available in the East US2 region for limited access through the [Azure AI Foundry portal](https://ai.azure.com) early access playground. Data processing for the `o1` models might occur in a different region than where they're available for use.
+The `o1-preview` and `o1-mini` are available in the East US2 region for limited access through the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) early access playground. Data processing for the `o1` models might occur in a different region than where they're available for use.
To try the `o1-preview` and `o1-mini` models in the early access playground **registration is required, and access will be granted based on Microsoft’s eligibility criteria.**
@@ -372,7 +372,7 @@ On August 6, 2024, OpenAI [announced](https://openai.com/index/introducing-struc
Azure customers can test out GPT-4o `2024-08-06` today in the new Azure AI Foundry early access playground (preview).
-Unlike the previous early access playground, the [Azure AI Foundry portal](https://ai.azure.com/) early access playground (preview) doesn't require you to have a resource in a specific region.
+Unlike the previous early access playground, the [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) early access playground (preview) doesn't require you to have a resource in a specific region.
> [!NOTE]
> Prompts and completions made through the early access playground (preview) might be processed in any Azure OpenAI region, and are currently subject to a 10 request per minute per Azure subscription limit. This limit might change in the future.
Summary
{
"modification_type": "minor update",
"modification_title": "Corrected links to Azure AI Foundry portal and added new features"
}
Explanation
This change updates the "What's new" article. The main points are link corrections plus descriptions of recent features: rate limiting for the `gpt-4o-realtime-preview` model is no longer based on connections per minute and is now expressed as RPM (requests per minute) and TPM (tokens per minute), with each deployment limited to 100K TPM and 1K RPM.
The `gpt-4o-realtime-preview` models also gained new voices and prompt caching support, and the Azure AI Foundry portal links are updated from https://ai.azure.com to https://ai.azure.com/?cid=learnDocs.
The article also covers the Azure OpenAI Studio UX change, including the temporary **Switch to the old look** toggle, and the limited-access availability of the `o1-preview` and `o1-mini` models in the early access playground.