Documentation Center

Setting up Azure OpenAI for LLM usage in Trados applications

Learn how to configure an Azure OpenAI deployment and obtain the LLM key, API endpoint, API version, and deployment name. By default, the Smart Review feature includes an embedded AWS Bedrock LLM model that you can use straight away. However, you can also configure an Azure OpenAI LLM instance for the Smart Review feature, at additional cost.

Before you begin

Make sure you have an Azure subscription and that this subscription allows you to create Azure OpenAI resources.

Procedure

  1. Access Azure at https://portal.azure.com.
  2. Create an Azure OpenAI resource. If you do not have the appropriate rights to create Azure OpenAI resources, a form is displayed where you can request access. Submit the form and wait for Microsoft to evaluate your request and provide approval.
  3. While creating the Azure OpenAI resource, select the region where you want the resource created. The region matters, as not all models are available in every region; the list of regions and their available models changes over time and is tracked in the Azure OpenAI Service models documentation.
  4. Note the endpoint URL and API key of your newly created Azure OpenAI resource.
    Deployment details
  5. In Azure OpenAI Studio, deploy an OpenAI model as part of the new resource. Go to Model Deployments > Manage Deployments.
    Azure OpenAI deployment
  6. Configure the following settings as per your requirements:
    1. Model type - Take note of the model type you deployed, namely chat or completion, as you will need this information later.
    2. Model - Choose the model to be used (e.g. gpt-35-turbo).
    3. Quota - Choose the allowed throughput. The limits are specified in the Microsoft contractual agreement.
    Azure OpenAI deployment model
  7. In Azure OpenAI Studio > Deployments, select the deployment, and then select Open in Playground.
    Azure OpenAI deployment details in code view
  8. Note the following information: API key, API endpoint, API version, and deployment name (version). To view the API endpoint (openai.api_base) and the API version (openai.api_version), select View code in the Playground.
  9. Try a question or query in the Playground to validate the deployment.
  10. For Smart Review capabilities, log in to Trados Enterprise and go to Integrations > Smart Review.
    1. Select Add Smart Review Provider, and enter the required information.
    2. If you want to set this provider as your default provider, select the Set as active check box. You can add several providers, but only one of them can be set as the default, active provider.
    3. Select Save.
    4. Select the provider from the list, and then select Test Connection. To change the current default provider, select the check box of the current provider, select Edit, and then clear the Set as active check box. Then edit the new provider, select the Set as active check box, and test the connection.
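As a sanity check outside the Playground, the values gathered in step 8 combine into the REST URL that Smart Review (or any client) calls, and the request payload shape depends on the model type noted in step 6. The sketch below is illustrative only; the endpoint, deployment name, API version, and key shown are placeholder assumptions, not values from your resource.

```python
import json

def chat_completions_url(endpoint: str, deployment: str, api_version: str) -> str:
    # endpoint is openai.api_base and api_version is openai.api_version
    # from the View code pane (step 8).
    return (
        f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )

def request_body(model_type: str, text: str) -> dict:
    # The payload shape follows the model type deployed in step 6:
    # chat deployments take a messages list, completion deployments a prompt.
    if model_type == "chat":
        return {"messages": [{"role": "user", "content": text}]}
    if model_type == "completion":
        return {"prompt": text}
    raise ValueError(f"unknown model type: {model_type}")

# Placeholder values -- substitute your own resource details.
url = chat_completions_url(
    "https://my-resource.openai.azure.com",  # API endpoint (openai.api_base)
    "my-gpt35-deployment",                   # deployment name
    "2024-02-01",                            # API version (openai.api_version)
)
headers = {"api-key": "<your-API-key>", "Content-Type": "application/json"}
body = json.dumps(request_body("chat", "Say hello"))
print(url)
```

Sending `body` with those headers to `url` should return a JSON response, which confirms the same connectivity that the Test Connection button checks from the Trados side.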