
Adding Models

This section explains the steps to add Azure AI Foundry models and configure the required access controls.
1. Navigate to Azure AI Foundry Models in AI Gateway

From the TrueFoundry dashboard, navigate to AI Gateway > Models and select Azure AI Foundry.
Navigating to Azure AI Foundry Provider Account in AI Gateway
2. Add Azure AI Foundry Account Details

Click Add Azure AI Foundry Account. Give your Azure AI Foundry account a unique name and complete the form with your Azure authentication details (API key). For enhanced security, you can also use certificate-based authentication. Add collaborators to your account; you can read more about access control here.
Azure AI Foundry account configuration form with API key and collaborator fields
3. Add Models from Azure AI Foundry

Click + Add Model to open the form for adding a new model. For Azure AI Foundry, you don't select a model from a list; instead, you add models based on your deployments in Azure AI Foundry. The Model ID in TrueFoundry corresponds to your Deployment Name in Azure. First, ensure you have deployed a model in your Azure AI Foundry service; you can follow Microsoft's instructions here. Once deployed, you can find the endpoint URL and model name in the Deployments section of your Azure AI Foundry resource.
Azure portal showing deployed model endpoint URL and model name in the Deployments section
Use these details when adding the model in TrueFoundry.
Model addition form for Azure AI Foundry with fields for model ID and endpoint URL
Azure AI Foundry integration supports various AI models you have deployed in your Azure account.

Inference

After adding the models, you can perform inference using an OpenAI-compatible API via the Playground or by integrating with your own application.
Code Snippet and Try in Playground buttons for each model
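As a rough sketch of what an OpenAI-compatible request to the gateway looks like, the helper below builds a chat completion request against an assumed base URL. The base URL path, model ID (`azure-foundry/gpt-4o`), and helper name are illustrative; copy the exact endpoint and model ID from the Code Snippet button next to your model.

```python
import json
import urllib.request

def chat_completion_request(base_url, api_key, model, prompt):
    """Build an OpenAI-compatible chat completion request (illustrative helper).

    base_url is an assumption for this sketch -- use the endpoint shown in the
    Code Snippet button for your model in the TrueFoundry dashboard.
    """
    payload = {
        "model": model,  # your TrueFoundry model ID, e.g. "azure-foundry/gpt-4o" (hypothetical)
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = chat_completion_request(
    "https://{controlPlaneUrl}/api/llm",  # assumed base URL -- replace with yours
    "your-truefoundry-api-key",
    "azure-foundry/gpt-4o",
    "Say hello",
)
# with urllib.request.urlopen(req) as resp:  # uncomment to send the request
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request shape is OpenAI-compatible, you can also point the official OpenAI SDK at the same base URL instead of building requests by hand.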

Mistral OCR - Document Processing

Extract text from documents while preserving structure and formatting (headers, paragraphs, lists, tables) using the Mistral OCR model via Azure AI Foundry. The endpoint returns Markdown and supports multiple input formats, including PDFs, images (png, jpeg, avif), and office documents (pptx, docx).
This endpoint cannot be used via the Mistral SDK. The Mistral SDK automatically appends /v1 to the base URL, which causes a URL mismatch (e.g. the request is sent to <base_url>/v1/ocr instead of <base_url>/ocr). Use direct HTTP requests (e.g. requests in Python) as shown below. See the open GitHub issue for details.
import base64
import requests
import json

def encode_pdf(pdf_path):
    with open(pdf_path, "rb") as pdf_file:
        return base64.b64encode(pdf_file.read()).decode("utf-8")

pdf_path = "path-to-your-pdf"
base64_pdf = encode_pdf(pdf_path)

url = "https://{controlPlaneUrl}/api/llm/proxy/ocr"

headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer your-truefoundry-api-key"
}

# Use your TrueFoundry Azure Foundry model name (e.g. azure-foundry/mistral-ocr)
payload = {
    "model": "azure-foundry/mistral-ocr",
    "document": {
        "type": "document_url",
        "document_url": f"data:application/pdf;base64,{base64_pdf}"
    },
    "include_image_base64": True
}

response = requests.post(url, headers=headers, json=payload)

if response.status_code == 200:
    # Save output to file
    with open("ocr_output.json", "w") as f:
        json.dump(response.json(), f, indent=2)
    print("OCR output saved to ocr_output.json")
else:
    print(f"Error: {response.status_code}")
    print(response.text)
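To turn the saved JSON into a single Markdown document, you can join the per-page text. This sketch assumes the response follows Mistral's OCR schema, where each entry in `pages` carries a `markdown` field with that page's extracted text; check your actual `ocr_output.json` before relying on it.

```python
import json

def pages_to_markdown(result):
    # Assumes Mistral's OCR response schema: each entry in "pages"
    # has a "markdown" field with that page's extracted text.
    return "\n\n".join(page["markdown"] for page in result.get("pages", []))

# Example usage against the file saved by the snippet above:
# with open("ocr_output.json") as f:
#     markdown = pages_to_markdown(json.load(f))
# with open("ocr_output.md", "w") as f:
#     f.write(markdown)
```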