OpenAI Features New o3-mini Model on Microsoft Azure OpenAI Service
Feb 11, 2025 2 min read
OpenAI has launched the o3-mini model, which is now accessible through the Microsoft Azure OpenAI Service. According to the company, this model represents an advancement in AI technology, featuring improved cost efficiency and enhanced reasoning capabilities compared to the previous o1-mini model released last September.
The o3-mini model is expected to benefit developers and enterprises looking to enhance their AI applications. It offers faster performance and lower latency while effectively handling more complex reasoning tasks. Yina Arenas, Vice President of Product for Core AI at Microsoft, writes in an AI and machine learning blog post:
With faster performance and lower latency, o3-mini is designed to handle complex reasoning workloads while maintaining efficiency.
A notable new aspect of the o3-mini model is the reasoning effort parameter. This feature allows users to adjust the model's cognitive load, enabling low, medium, or high levels of reasoning. For instance, a low level of reasoning might be suitable for simple data processing tasks, while a high level could be used for complex decision-making processes.
An example is given in the Vercel AI SDK documentation:
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Reduce reasoning effort for faster responses
const { text } = await generateText({
  model: openai('o3-mini'),
  prompt: 'Explain quantum entanglement briefly.',
  providerOptions: {
    openai: { reasoningEffort: 'low' },
  },
});
Additionally, the o3-mini model supports structured outputs by incorporating JSON Schema constraints. This feature ensures that the model's outputs are in a format that is easily understandable and usable by other systems; it facilitates automated workflows, making it more straightforward for organizations to integrate AI with their existing systems. A REST call with structured output could look like this:
curl -X POST "https://YOUR_RESOURCE_NAME.openai.azure.com/openai/deployments/YOUR_MODEL_DEPLOYMENT_NAME/chat/completions?api-version=2025-01-31" \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "system", "content": "Extract the event information."},
      {"role": "user", "content": "Alice and Bob are going to a science fair on Friday."}
    ],
    "response_format": {
      "type": "json_schema",
      "json_schema": {
        "name": "CalendarEventResponse",
        "strict": true,
        "schema": {
          "type": "object",
          "properties": {
            "name": {"type": "string"},
            "date": {"type": "string"},
            "participants": {
              "type": "array",
              "items": {"type": "string"}
            }
          },
          "required": ["name", "date", "participants"],
          "additionalProperties": false
        }
      }
    }
  }'
Note: o3-mini version 2025-01-31
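Because the schema above is strict, a client can rely on the shape of the response. As a sketch of how consuming code might verify that shape, the following Python snippet parses an illustrative response body (the JSON values here are hypothetical, not actual model output) and checks the required fields and types from the CalendarEventResponse schema:

```python
import json

def validate_calendar_event(payload: str) -> dict:
    """Check a response string against the CalendarEventResponse schema's
    required fields and types, returning the parsed event."""
    event = json.loads(payload)
    assert isinstance(event["name"], str)
    assert isinstance(event["date"], str)
    assert isinstance(event["participants"], list)
    assert all(isinstance(p, str) for p in event["participants"])
    return event

# Illustrative content a model could return under the schema
raw = '{"name": "science fair", "date": "Friday", "participants": ["Alice", "Bob"]}'
event = validate_calendar_event(raw)
print(event["participants"])  # ['Alice', 'Bob']
```

In a real integration the `raw` string would come from the `content` field of the chat completions response; with `"strict": true` the model is constrained to produce output matching the schema, so validation acts as a safety net rather than a parser.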
The model also supports functions and external tools akin to its predecessor, making it suitable for various AI-powered automation tasks. These tasks could include automating customer support responses, managing inventory levels, or even controlling manufacturing processes.
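A minimal sketch of how a tool could be declared for one of these automation tasks follows; the function name, parameters, and inventory scenario are hypothetical, chosen only to illustrate the shape of a chat completions request body with tools (no API call is made):

```python
import json

# Hypothetical tool definition for an inventory-management assistant
tools = [{
    "type": "function",
    "function": {
        "name": "get_stock_level",  # illustrative helper, not a real API
        "description": "Look up the current stock level for a product SKU.",
        "parameters": {
            "type": "object",
            "properties": {"sku": {"type": "string"}},
            "required": ["sku"],
        },
    },
}]

# Request body the client would POST to the chat completions endpoint
request_body = {
    "messages": [{"role": "user", "content": "How many units of SKU-42 are left?"}],
    "tools": tools,
}
print(json.dumps(request_body, indent=2))
```

When the model decides a tool is needed, the response contains a tool call with the chosen function name and JSON-encoded arguments, which the calling application executes before returning the result to the model.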
Another significant change is the introduction of developer messages, which replace the previous system messages – a new approach that provides a more structured framework for instruction handling, enabling developers to create more responsive AI applications. Moreover, the Azure OpenAI Service has ensured backward compatibility by aligning old system messages with the new format to assist with transitions.
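A rough illustration of that transition is shown below: the snippet builds message payloads only (no API call) and maps a legacy system message onto the new developer role, mirroring the backward-compatible aliasing described above:

```python
# Old-style request using a system message
legacy_messages = [
    {"role": "system", "content": "Answer concisely."},
    {"role": "user", "content": "What is o3-mini?"},
]

def migrate_messages(messages):
    """Map legacy system messages onto the new developer role,
    leaving all other messages unchanged."""
    return [
        {**m, "role": "developer"} if m["role"] == "system" else m
        for m in messages
    ]

print(migrate_messages(legacy_messages)[0]["role"])  # developer
```

Because the service aliases old system messages to the new format, existing applications continue to work, but new code can adopt the developer role directly.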
Continuing its predecessor's capabilities, the o3-mini model improves on key areas such as coding, mathematics, and scientific reasoning. These enhancements are essential for organizations requiring high-performance AI solutions.
ShinChven Zhang concludes in a blog post:
While the o3-mini currently lacks support for image processing, its text-only processing capability, advanced reasoning abilities, and cost-effectiveness make it a compelling choice for various applications. The availability of o3-mini to free users in ChatGPT is a significant step towards democratizing access to powerful AI models, potentially driving innovation in multiple fields, such as coding, STEM research, and AI-powered automation.
Lastly, developers can learn more about OpenAI o3-mini in GitHub Copilot and GitHub Models and sign up in Azure AI Foundry to access o3-mini and other advanced AI models.