OpenAI Adds Full MCP Support to ChatGPT Developer Mode
Oct 13, 2025 2 min read
OpenAI has rolled out full Model Context Protocol (MCP) support in ChatGPT, bringing developers a long-requested feature: the ability to use custom connectors for both read and write actions directly inside chats. The feature, now in beta under Developer Mode, effectively turns ChatGPT into a programmable automation hub capable of interacting with external systems or internal APIs.
Until now, ChatGPT’s built-in tools were limited mainly to reading or fetching data — such as browsing the web or retrieving documents. With Developer Mode enabled, developers can now register MCP servers and expose any compatible tool that supports structured actions. This allows ChatGPT to, for example, update Jira tickets, trigger workflows, or write back to databases, all within a conversational interface.
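To make the idea of a write-capable structured tool concrete, here is a minimal sketch in plain Python, without any particular MCP SDK. The `update_ticket` tool name, its JSON Schema, and the handler are all hypothetical, chosen to mirror the Jira example above; a real connector would be hosted behind an MCP server endpoint.

```python
# Minimal sketch of an MCP-style tool registry: each tool pairs a
# JSON Schema describing its input with a handler that performs the action.
# All names and schemas here are illustrative assumptions.

TOOLS = {}

def register_tool(name, description, input_schema):
    """Register a handler under a tool name alongside its input schema."""
    def decorator(handler):
        TOOLS[name] = {
            "name": name,
            "description": description,
            "inputSchema": input_schema,
            "handler": handler,
        }
        return handler
    return decorator

@register_tool(
    name="update_ticket",
    description="Update the status of a Jira ticket (a write action).",
    input_schema={
        "type": "object",
        "properties": {
            "ticket_id": {"type": "string"},
            "status": {"type": "string"},
        },
        "required": ["ticket_id", "status"],
    },
)
def update_ticket(ticket_id: str, status: str) -> dict:
    # A real connector would call the Jira REST API here; this sketch
    # just echoes the requested change back as a structured result.
    return {"ticket_id": ticket_id, "status": status, "updated": True}

def call_tool(name: str, arguments: dict) -> dict:
    """Dispatch a structured tool call, as an MCP server would."""
    return TOOLS[name]["handler"](**arguments)
```

Calling `call_tool("update_ticket", {"ticket_id": "PROJ-42", "status": "Done"})` then returns a structured result the model can reason over, which is the pattern that distinguishes write actions from the read-only browsing and retrieval tools ChatGPT had before.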
According to OpenAI, the new mode is “powerful but dangerous,” as it gives the model the ability to perform real write operations. The company emphasizes the need for developers to test connectors carefully, remain alert to prompt injection attacks, and confirm all write actions before execution. Each tool call includes an expandable JSON payload for inspection, and ChatGPT will prompt users to review and approve any action that modifies data.
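MCP tool calls are carried as JSON-RPC 2.0 `tools/call` requests, so the expandable payload a reviewer inspects before approving a write looks roughly like the output of this sketch (the tool name and arguments are hypothetical, matching the example above):

```python
import json

def build_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(payload, indent=2)

# The kind of payload a user would expand and review before approving:
print(build_tool_call(1, "update_ticket",
                      {"ticket_id": "PROJ-42", "status": "Done"}))
```

Reviewing the `params.arguments` object before confirming is exactly where prompt-injection defenses matter: the arguments the model chose may not be the arguments the user intended.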
MCP connectors can be added by navigating to ChatGPT's Settings menu, then selecting Connectors, followed by Advanced, and finally Developer Mode. These connectors support Server-Sent Events (SSE) and streaming HTTP protocols, with optional OAuth authentication. Once connected, the tools will be accessible in the Developer Mode menu, allowing users to explicitly call them using structured prompts.
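The SSE transport mentioned above delivers each server-to-client message as a text frame in the standard `event:`/`data:` format. A small helper sketching that wire framing (illustrative only; a real MCP server would stream full JSON-RPC messages this way):

```python
import json

def sse_frame(message: dict, event: str = "message") -> str:
    """Format one Server-Sent Events frame carrying a JSON payload."""
    return f"event: {event}\ndata: {json.dumps(message)}\n\n"

# One streamed response frame as it would appear on the wire:
frame = sse_frame({"jsonrpc": "2.0", "id": 1, "result": {"ok": True}})
```

Each frame ends with a blank line, which is how SSE clients detect message boundaries on a long-lived HTTP connection.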
Developers can also define tool preferences, sequencing, and safety restrictions to minimize ambiguity. The design encourages clear prompting habits — for example, specifying that ChatGPT should use one connector for data retrieval and another for scheduling, rather than relying on built-in functions.
Following the release, developers discussed on Reddit how the integration works in practice. User AlternativeBorder813 asked:
Is this remote only, or possible to also run ‘localhost’ servers? Any list of MCPs that actually work with it?
Another user replied:
It would have to be remote. ChatGPT cannot connect to your localhost servers. It’s fairly straightforward to create a tunnel using a service like ngrok. In theory, any MCP that works with other LLMs will work with ChatGPT.
The new support expands what developers can build on ChatGPT and makes it interoperable with agent frameworks such as LangChain and LlamaIndex. It allows ChatGPT to function both as a user interface for managing agents and as a platform for automation, with connectors linking language models to real-world applications.
OpenAI says the feature is available now to Pro, Plus, Business, Enterprise, and Education accounts on the web. Developers can explore documentation and examples through the Connectors tab in ChatGPT settings and start experimenting with MCP-compatible tools immediately.