Azure AI Foundry
Azure AI Foundry provides a unified platform for enterprise AI operations, model building, and application development. With Lamatic, you can seamlessly integrate with various models available on Azure AI Foundry and take advantage of features like observability, prompt management, fallbacks, and more.
Learn how to integrate Azure AI Foundry with Lamatic to access a wide range of AI models with enhanced observability and reliability features.
Azure AI Foundry offers three different ways to deploy models, each with unique endpoints and configurations:
- AI Services: Azure-managed models accessed through Azure AI Services endpoints
- Managed: User-managed deployments running on dedicated Azure compute resources
- Serverless: Pay-as-you-go serverless API endpoints, with no infrastructure to manage
You can learn more about Azure AI Foundry deployment options here.
1. Access Azure AI Foundry Portal
Go to ai.azure.com (the Azure AI Foundry portal) and sign in with your Azure account.
2. Choose Project Type
Azure AI Foundry supports two project types: hub-based projects and Foundry projects. In most cases, you'll want to use a Foundry project.
3. Create a Foundry Project
- Click "Create new" in the top right
- Select "Azure AI Foundry resource"
- Enter a project name
- Select your subscription and resource group
- Choose a location/region
- Click "Create"
4. Deploy a Model
- Navigate to "Model catalog" or "Models + endpoints"
- Browse available models (GPT-4, GPT-3.5-turbo, etc.)
- Select a model and click "Deploy"
- Provide a Deployment Name (e.g., "gpt-4-deployment")
- Configure deployment settings
- Click "Deploy"
5. Get API Credentials
Navigate to your project's "Settings" or "Keys and endpoints" and copy the API key.
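To keep the key out of code and configuration files while you set things up, you can hold it in an environment variable until you paste it into Lamatic. A minimal Python sketch, assuming an environment variable named AZURE_API_KEY (the name is illustrative, not required by Azure or Lamatic):

```python
import os

# Read the API key from the environment instead of hardcoding it.
# "AZURE_API_KEY" is an illustrative variable name, not one Azure or Lamatic requires.
azure_api_key = os.environ.get("AZURE_API_KEY")
if not azure_api_key:
    raise RuntimeError("Set the AZURE_API_KEY environment variable before running this script.")

print("API key loaded (length:", len(azure_api_key), "characters)")
```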
Required Information
Resource Name
Found in the project overview section.
Deployment Name
Listed in the "Models + endpoints" section.
Azure API Version
Check the latest versions in the Azure AI Foundry documentation. Common versions:
- 2024-06-01
- 2024-02-15-preview
Endpoint URL Format
https://[ResourceName].openai.azure.com/
Example Configuration
Azure API Key: your-api-key-here
Azure API Version: 2024-06-01
Resource Name: my-ai-foundry-project
Deployment Name: gpt-4-deployment
Endpoint: https://my-ai-foundry-project.openai.azure.com/
Once you provide these details, your selected foundation model will be automatically populated.
For more information, refer to the official Azure OpenAI guide.
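Before adding these values to Lamatic, you can optionally sanity-check them with a direct call to your deployment. Below is a minimal sketch using the openai Python package (v1+); the placeholder values mirror the example configuration above and must be replaced with your own:

```python
from openai import AzureOpenAI

# Placeholder values mirroring the example configuration above -- replace with your own.
client = AzureOpenAI(
    api_key="your-api-key-here",
    api_version="2024-06-01",
    azure_endpoint="https://my-ai-foundry-project.openai.azure.com/",
)

# "model" takes the deployment name, not the underlying model name.
response = client.chat.completions.create(
    model="gpt-4-deployment",
    messages=[{"role": "user", "content": "Reply with one word to confirm the deployment works."}],
)
print(response.choices[0].message.content)
```

If this call returns a response, the same four values (API key, API version, resource name, and deployment name) can be entered into Lamatic as described below.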
Follow these general steps in Lamatic.ai:
- Open your Lamatic.ai Studio
- Navigate to the Models section
- Select the azure-openai provider
- Provide the following credentials:
- Azure API Key
- Azure API Version
- Resource Name
- Deployment Name
- Save your changes
Important Notes
- Keep your API keys secure and never share them
- Some providers may require additional setup steps
- Check the provider's pricing before generating API keys
- Regularly rotate your API keys for security
- Test your integration after adding each key