## Overview
### Integration details
| Class | Package | Local | Serializable | JS support |
| :--- | :--- | :---: | :---: | :---: |
| `AzureAIChatCompletionsModel` | `langchain-azure-ai` | ❌ | ✅ | ✅ |
### Model features
| Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs |
| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |
| ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ✅ | ✅ | ✅ | ✅ |
## Setup
To access AzureAIChatCompletionsModel models, you'll need to create an Azure account, get an API key, and install the `langchain-azure-ai` integration package.
### Credentials
Head to the Azure docs to see how to create your deployment and generate an API key. Once your model is deployed, click the 'Get endpoint' button in AI Foundry to view your endpoint and API key. Once you've done this, set the `AZURE_AI_CREDENTIAL` and `AZURE_AI_ENDPOINT` environment variables:
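For example, a minimal sketch that prompts for the credential and endpoint at runtime if they are not already set (the prompt wording is illustrative):

```python
import getpass
import os

# Prompt for the API key and endpoint only if they are not already in the environment.
if "AZURE_AI_CREDENTIAL" not in os.environ:
    os.environ["AZURE_AI_CREDENTIAL"] = getpass.getpass("Enter your Azure AI API key: ")

if "AZURE_AI_ENDPOINT" not in os.environ:
    os.environ["AZURE_AI_ENDPOINT"] = input("Enter your Azure AI endpoint: ")
```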
### Installation

The LangChain AzureAIChatCompletionsModel integration lives in the `langchain-azure-ai` package:
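For example, with pip:

```bash
pip install -U langchain-azure-ai
```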