If I understand correctly, it requires your MCP server to have exactly two tools - search and fetch.
So this is not really support for MCP in general, i.e., for arbitrary existing MCP servers. It’s support for their own custom higher-level protocol built on top of MCP.
Because, from TFA: “To work with ChatGPT Connectors or deep research (in ChatGPT or via API), your MCP server must implement two tools - search and fetch.”
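For concreteness, that requirement looks roughly like this in code. This is only a minimal sketch using the official MCP Python SDK (FastMCP): the tool names `search` and `fetch` come from the quoted docs, but the result field names, the example data, and the transport choice are assumptions you'd want to verify against OpenAI's connector documentation.

```python
# Minimal sketch of an MCP server exposing only the two tools ChatGPT
# Connectors / deep research require. Uses the official MCP Python SDK
# (FastMCP). The result field names below (id, title, text, url) are an
# assumption -- check OpenAI's docs for the exact schema they expect.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("my-docs")  # hypothetical server name

# Hypothetical in-memory corpus standing in for a real search index.
DOCS = {
    "doc-1": {
        "title": "Getting started",
        "text": "Full document text goes here.",
        "url": "https://example.com/doc-1",
    },
}

@mcp.tool()
def search(query: str) -> dict:
    """Return lightweight result stubs matching the query."""
    hits = [
        {"id": doc_id, "title": d["title"], "url": d["url"]}
        for doc_id, d in DOCS.items()
        if query.lower() in d["title"].lower()
    ]
    return {"results": hits}

@mcp.tool()
def fetch(id: str) -> dict:
    """Return the full document for a previously returned result id."""
    d = DOCS[id]
    return {"id": id, "title": d["title"], "text": d["text"], "url": d["url"]}

if __name__ == "__main__":
    # ChatGPT connectors need a remotely reachable server, so run an
    # HTTP-based transport rather than stdio (SSE used here as an example).
    mcp.run(transport="sse")
```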
Also, this page is actually the only docs site about MCP they have, and their help articles link to it too.
Quote we're talking about:
> To work with ChatGPT Connectors or deep research (in ChatGPT or via API), your MCP server must implement two tools - search and fetch.
Reference links:
- Using remote MCP servers with the API: https://platform.openai.com/docs/guides/tools-remote-mcp
- Which account types can set up custom connectors in ChatGPT: https://help.openai.com/en/articles/11487775-connectors-in-c...
It’s disappointing that they are gating this and the browser agent behind the $100 tier, and even at $100, it’s only two tool methods.
E.g. as I described here a while ago: https://x.com/wunderwuzzi23/status/1930899939737166075?s=46&...
Ironically, I have a blog post drafted that also explains this in detail; I should probably still publish it.
Given how disastrous the AI 'industry' has been (misappropriating customer data, performing actions on customers' behalf that lead to data and/or financial loss, and then seeking protection from the law in one or more of these cases), doesn't providing an MCP service essentially commit you to notifying customers of a GDPR-or-similar data compromise event at some point in the future, when it suddenly but inevitably betrays you?
Like, isn't OpenAI just leading people to a footgun and then kindly asking them to use it, for the betterment of OpenAI's bottom line, which was significantly in the red for FY24?
I know 0. And I work in tech, lol.
I don't hear anything from people I actually know who are exposed to this stuff being like, "Oh my God, this has massively improved my life and I can't live without it!" No, it's stuff like, "How do I uninstall Siri off my iPhone? It doesn't work right anymore and can't understand me, and keeps turning itself back on when there is an iOS update."
I don't think any amount of marketing is going to change the fact that the service doesn't work and isn't up to user expectations.
It's undercooked, and now it's too late to put it back in the oven, so to speak.
And Siri has been junk for years too. Not sure what LLMs have to do with that. Siri activates via a machine learning model that runs on device, which listens for a “wake word”, and that tech has been around for 10 years at this point. It’s possible Apple messed up their machine learning model in a recent update? Idk :) But that’s unrelated to LLMs.
I agree hype has gone out of control! The tech companies are over-promising. Expectations are too high, like you said. That doesn’t mean LLMs are useless like crypto, though. It’s early days still. Crypto is not useful to _any_ normal person unless you live in a handful of countries with crazy inflation, or you’re a crypto nerd, or you’re trying to move dirty money around.
But all of this is useful to almost anyone:
- Quickly summarizing a recipe online, stripping out all the junk ads from the website.
- Brainstorming a quick travel itinerary (as a starting point) in 5 seconds.
- Getting a quick layman’s understanding of a topic you’re unfamiliar with.
- Teens brainstorming career paths (tell an LLM what they’re good at, what they enjoy, and what they dislike).
- Finding a product online (tool, part, etc.) when you don’t know what it’s called: describe it to an LLM, then start your Google search journey.
All of that is genuinely useful, I think. IMO you’re conflating the _tech_ with how it’s being pushed onto the masses by the tech giants (Siri, Google Search, etc.).
I'm glad I'm on Android. I can just uninstall Google Assistant, which did the same transformation into a Gemini frontend. I wasn't using Assistant at all, so it isn't a meaningful loss for me.
If they truly are processing some Siri requests in the cloud, I'm shocked at how incompetent their implementation is.
I haven’t kept up with Siri, to be honest. All I know is Siri is still working for me, but it’s a degraded experience.
- As an end-user, you can connect only MCP servers that expose `search` and `fetch`, and those only work in deep research mode.
- As a developer, you can use MCP with the API, and that supports the full set of MCP tools - all tools become available; this shows up in the dev playground (a sketch follows below).
- Custom GPTs support any action, but not MCPs, so if you had a layer translating MCP to their API spec, it would work. But Custom GPTs with actions only support the 4o and 4.1 models, so you don't get the benefit of the o-series models.
Figuring out what works when is harder than it needs to be.
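To illustrate the developer/API path from the list above, here's a rough sketch of attaching a remote MCP server to a Responses API call, along the lines of the remote-MCP guide linked earlier. The model name, server label, and URL are placeholders, and the `require_approval` setting is something to double-check in the docs rather than copy blindly.

```python
# Sketch of the developer/API path: attach a remote MCP server to a
# Responses API call, where (unlike the ChatGPT connector path) all of
# the server's tools are exposed, not just search/fetch.
from openai import OpenAI

client = OpenAI()

resp = client.responses.create(
    model="o3",  # placeholder model name
    tools=[{
        "type": "mcp",
        "server_label": "my_docs",                # hypothetical label
        "server_url": "https://example.com/mcp",  # hypothetical MCP endpoint
        "require_approval": "never",              # or keep approvals on
    }],
    input="Find the section on rate limits and summarize it.",
)
print(resp.output_text)
```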
ChatGPT desktop client with only search/fetch MCPs is far, far inferior to CC from a utility/value perspective.
MCP servers can expose tools that are agents, but don't have to, and usually don't.
That being said, I can't say I've come across an actual implementation of A2A outside of press releases...