Show HN: The current sky at your approximate location, as a CSS gradient

https://sky.dlazaro.ca
373•dlazaro•6h ago•79 comments

Long-term exposure to outdoor air pollution linked to increased risk of dementia

https://www.cam.ac.uk/research/news/long-term-exposure-to-outdoor-air-pollution-linked-to-increased-risk-of-dementia
184•hhs•7h ago•55 comments

MCP's Disregard for 40 Years of RPC Best Practices

https://julsimon.medium.com/why-mcps-disregard-for-40-years-of-rpc-best-practices-will-burn-enterprises-8ef85ce5bc9b
154•yodon•5h ago•76 comments

Simon Willison's Lethal Trifecta Talk at the Bay Area AI Security Meetup

https://simonwillison.net/2025/Aug/9/bay-area-ai/
157•vismit2000•5h ago•53 comments

Debian 13 "Trixie"

https://www.debian.org/News/2025/20250809
198•ducktective•2h ago•67 comments

A CT scanner reveals surprises inside the 386 processor's ceramic package

https://www.righto.com/2025/08/intel-386-package-ct-scan.html
74•robin_reala•3h ago•14 comments

OpenFreeMap survived 100k requests per second

https://blog.hyperknot.com/p/openfreemap-survived-100000-requests
265•hyperknot•6h ago•63 comments

Quickshell – building blocks for your desktop

https://quickshell.org/
181•abhinavk•4d ago•28 comments

The current state of LLM-driven development

http://blog.tolki.dev/posts/2025/08-07-llms/
41•Signez•4h ago•15 comments

ESP32 Bus Pirate 0.5 – A Hardware Hacking Tool That Speaks Every Protocol

https://github.com/geo-tp/ESP32-Bus-Pirate
65•geo-tp•5h ago•12 comments

Stanford to continue legacy admissions and withdraw from Cal Grants

https://www.forbes.com/sites/michaeltnietzel/2025/08/08/stanford-to-continue-legacy-admissions-and-withdraw-from-cal-grants/
97•hhs•7h ago•159 comments

ChatGPT Agent – EU Launch

https://help.openai.com/en/articles/11752874-chatgpt-agent
52•Topfi•4h ago•35 comments

Accessibility and the Agentic Web

https://tetralogical.com/blog/2025/08/08/accessibility-and-the-agentic-web/
14•edent•3h ago•5 comments

Testing Bitchat at the music festival

https://primal.net/saunter/testing-bitchat-at-the-music-festival
37•alexcos•3d ago•22 comments

The mystery of Alice in Wonderland syndrome

https://www.bbc.com/future/article/20230313-the-mystery-of-alice-in-wonderland-syndrome
11•amichail•3d ago•5 comments

Isle FPGA Computer: creating a simple, open, modern computer

https://projectf.io/isle/fpga-computer.html
9•pabs3•3d ago•0 comments

Jan – Ollama alternative with local UI

https://github.com/menloresearch/jan
148•maxloh•10h ago•65 comments

Ratfactor's Illustrated Guide to Folding Fitted Sheets

https://ratfactor.com/cards/fitted-sheets
93•zdw•8h ago•13 comments

Knuth on ChatGPT (2023)

https://cs.stanford.edu/~knuth/chatGPT20.txt
90•b-man•3h ago•29 comments

End-User Programmable AI

https://queue.acm.org/detail.cfm?id=3746223
25•tosh•4h ago•1 comment

Cordoomceps – replacing an Amiga's brain with Doom

https://mjg59.dreamwidth.org/73001.html
25•naves•4d ago•4 comments

Ch.at – a lightweight LLM chat service accessible through HTTP, SSH, DNS and API

https://ch.at/
5•ownlife•1h ago•1 comment

I want everything local – Building my offline AI workspace

https://instavm.io/blog/building-my-offline-ai-workspace
976•mkagenius•1d ago•261 comments

The dead need right to delete their data so they can't be AI-ified, lawyer says

https://www.theregister.com/2025/08/09/dead_need_ai_data_delete_right/
137•rntn•6h ago•94 comments

Car has more than 1.2M km on it – and it's still going strong

https://www.cbc.ca/news/canada/nova-scotia/1985-toyota-tercel-high-mileage-1.7597168
164•Sgt_Apone•3d ago•213 comments

Sandstorm – self-hostable web productivity suite

https://sandstorm.org/
142•nalinidash•14h ago•28 comments

Mexico to US livestock trade halted due to screwworm spread

https://www.usda.gov/about-usda/news/press-releases/2025/07/09/secretary-rollins-takes-decisive-action-and-shuts-down-us-southern-border-ports-livestock-trade-due
214•burnt-resistor•5h ago•169 comments

Physical Media Is Cool Again. Streaming Services Have Themselves to Blame

https://www.rollingstone.com/culture/culture-features/physical-media-collectors-trend-viral-streamers-1235387314/
18•coloneltcb•1h ago•8 comments

How AI is upending the software development industry

https://www.reuters.com/lifestyle/bootcamp-bust-how-ai-is-upending-software-development-industry-2025-08-09/
5•wglb•2h ago•0 comments

Residents cheer as Tucson rejects data center campus

https://www.datacenterdynamics.com/en/news/residents-cheer-as-tucson-rejects-amazons-massive-project-blue-data-center-campus-in-arizona/
36•01-_-•2h ago•18 comments

MCP's Disregard for 40 Years of RPC Best Practices

https://julsimon.medium.com/why-mcps-disregard-for-40-years-of-rpc-best-practices-will-burn-enterprises-8ef85ce5bc9b
153•yodon•5h ago

Comments

al2o3cr•5h ago
IMO worrying about type-safety in the protocol when any string field in the reply can prompt-inject the calling LLM feels like putting a band-aid on a decapitation, but YMMV
ComputerGuru•4h ago
They’re 100% orthogonal issues.
gjsman-1000•2h ago
… or we’ll just invent MCP 2.0.

On that note: some of these “best practices” arguably haven’t worked out. “Be conservative with what you send, liberal with what you receive” has turned even decent protocols into a dumpster fire, so why keep the charade going?

rcarmo•2h ago
I’d rather we ditched MCP and used something that could leverage Swagger instead…
jmull•2h ago
Right...

Failed protocols such as TCP adopted Postel's law as a guiding principle, and we all know how that worked out!

dragonwriter•1h ago
A generalized guiding principle works in one particular use case, so this proves it is a good generalized guiding principle?
gjsman-1000•41m ago
Survivor’s bias.
mockingloris•2h ago
I read this thrice: ...When OpenAI bills $50,000 for last month’s API usage, can you tell which department’s MCP tools drove that cost? Which specific tool calls? Which individual users or use cases?...

It seems to be a game of catch-up for most things AI. That said, my school of thought is that certain technologies are just too big to be figured out early on - web frameworks, blockchain, ...

- the gap starts to shrink eventually. With AI, we'll just have to keep sharing ideas and caution like you have here. Very interesting times we live in.

zorked•2h ago

  CORBA emerged in 1991 with another crucial insight: in heterogeneous environments, you can’t just “implement the protocol” in each language and hope for the best. The OMG IDL generated consistent bindings across C++, Java, Python, and more, ensuring that a C++ exception thrown by a server was properly caught and handled by a Java client. The generated bindings guaranteed that all languages saw identical interfaces, preventing subtle serialization differences.
Yes, CORBA was such a success.
antonymoose•2h ago
To be charitable, you can look at a commercially unsuccessful project and appreciate its technical brilliance.
cortesoft•1h ago
Yeah, the modern JSON-centered API landscape came about as a response to the failures of CORBA and SOAP. It didn’t forget the lessons of CORBA, it rejected them.
cyberax•1h ago
And now we're getting a swing back to sanity. OpenAPI is an attempt to formally describe the Wild West of JSON-based HTTP interfaces.

And its complexity and size now rival the specs of the good old XML-infused times.

pjmlp•1h ago
And then rediscovered why we need schemas in CORBA and SOAP, or orchestration engines.
EdiX•55m ago
It didn't, though. JSON Schema is basically dead in the water.
stouset•34m ago
Just because they discovered it doesn’t mean they fixed it.
pjmlp•15m ago
Yet I keep seeing it across several repos.
cyberax•1h ago
CORBA got a lot of things right. But it was, unfortunately, a child of late-'80s telecom networks mixed with OOP hype.

It baked in core assumptions that the network is transparent, reliable, and symmetric, so you could create an object on one machine, pass a reference to it to another machine, and everything was supposed to just work.

Which is not what happens in the real world, with timeouts, retries, congested networks, and crashing computers.

Oh, and the CORBA C++ bindings had been designed before the STL was standardized, so they are a crawling horror; other languages fared better.

sudhirb•1h ago
I've worked somewhere where CORBA was used very heavily and to great effect - though I suspect the reason for our successful usage was that one of the senior software engineers worked on CORBA directly.
abtinf•2h ago
I wish someone would write a clear, crisp explanation for why MCP is needed over simply supporting swagger or proto.
nikanj•2h ago
MCP is new
dragonwriter•2h ago
OpenAPI (or its Swagger predecessor) or Proto (I assume by this you mean protobuf?) don't cover what MCP does. It could have layered over them instead of using JSON-RPC, but I don't see any strong reason why they would be better than JSON-RPC as the basis (Swagger has communication assumptions that don't work well with MCP's local use case; protobuf doesn't cover communication at all and would require additional consideration in the protocol layered over it.)

You'd still need basically the entire existing MCP spec to cover the use cases if it replaced JSON-RPC with Swagger or protobuf, plus additional material to cover the gaps and complications that that switch would involve.
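To make the layering concrete, the JSON-RPC framing looks roughly like this on the wire (shapes paraphrased from the MCP spec as I read it; the lookup_city tool is invented):

  // Rough shape of MCP's JSON-RPC framing (illustrative, not normative):
  const listRequest = { jsonrpc: "2.0", id: 1, method: "tools/list" };
  const listResult = {
    tools: [{
      name: "lookup_city",
      description: "Look up facts about a city",
      inputSchema: {                      // plain JSON Schema, discovered at runtime
        type: "object",
        properties: { city: { type: "string" } },
        required: ["city"],
      },
    }],
  };
  const callRequest = {
    jsonrpc: "2.0", id: 2, method: "tools/call",
    params: { name: "lookup_city", arguments: { city: "Berlin" } },
  };
  const callResult = { content: [{ type: "text", text: "Berlin: pop. ~3.7M" }] };

Swagger or protobuf could carry the same envelope, but you'd still need all of the discovery and capability semantics on top.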

vineyardmike•1h ago
Proto has a full associated spec (gRPC) covering communication protocols and structured definitions for them. MCP could easily have built upon these and gotten a lot “for free”. Generally gRPC is better than JSON-RPC (see below).

I agree that swagger leaves a lot unplanned. I disagree about the local use case because (1) we could just run local HTTP servers easily and (2) I frankly assume the future of MCP is mostly remote.

Returning to JSON-RPC: it’s a poorly executed RPC protocol. Here is an excellent Hacker News thread on it, but the TL;DR is that parsing JSON is expensive and complex; modern services are built from tons of tools (e.g. load balancers), and making those tools parse JSON is very expensive. Many people in the thread below mention alternative ways to implement JSON-RPC, but those depend on new clients.

https://news.ycombinator.com/item?id=34211796
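To make the tooling point concrete, here is the canonical JSON-RPC 2.0 exchange (example values from the JSON-RPC 2.0 spec). The routing information lives in the body, so generic HTTP infrastructure has to parse JSON just to see what is being called:

  // JSON-RPC 2.0: "method" is inside the body, not the URL or headers, so an
  // L7 load balancer must parse the payload just to route or meter the call.
  const request = {
    jsonrpc: "2.0",
    id: 1,
    method: "subtract",
    params: { minuend: 42, subtrahend: 23 },
  };
  const response = { jsonrpc: "2.0", id: 1, result: 19 };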

nurettin•1h ago
MCP supports streaming responses. You could implement that by polling and a session state, but that's an inefficient hack.
lsaferite•51m ago
Eh... No, it does not support streaming responses.

I know this because I wish it did. You can approximate streaming responses by using progress notifications. If you want something like the LLM partial response streaming, you'll have to extend MCP with custom capabilities flags. It's totally possible to extend it in this way, but then it's non standard.

Perhaps you are alluding to the fact that it's a bidirectional protocol (by spec at least).
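For reference, the progress-notification approximation looks roughly like this (field names from my reading of the spec; the long_running_tool is invented):

  // The caller attaches a progress token to the request's metadata...
  const callRequest = {
    jsonrpc: "2.0", id: 3, method: "tools/call",
    params: {
      name: "long_running_tool",
      arguments: {},
      _meta: { progressToken: "tok-1" },
    },
  };
  // ...and the server emits notifications referencing that token. You can
  // smuggle partial output into these, but it's a workaround, not streaming.
  const progressNotification = {
    jsonrpc: "2.0",
    method: "notifications/progress",
    params: { progressToken: "tok-1", progress: 40, total: 100 },
  };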

input_sh•56m ago
"We couldn't be fucked supporting vastly different APIs so we wrote a new definition of an API that now everyone has to follow if they want to integrate with our tool."

For what it's worth, IFTTT did the same thing like a decade ago. It started by just using existing APIs until that became unmaintainable, then they forced everyone to create a separate API that follows their specific instructions.

self_awareness•2h ago
What's new?

- Electron disregards 40 years of best deployment practices,

- Web disregards 40 years of best GUI practices,

- Fast CPUs and lots of RAM disregard 40 years of best software optimization techniques,

there are probably many more examples.

xg15•1h ago
Yeah, and all three have evidently made software more shitty. More profitable and easier to develop, sure, but also much more unpleasant to use.
rickcarlino•2h ago
> SOAP, despite its verbosity, understood something that MCP doesn’t

Unfortunately, no one understood SOAP back.

(Additional context: I maintain a legacy SOAP system. I have nothing good to say about SOAP, and it should serve as a role model for no one.)

SoftTalker•2h ago
I have found that any protocol whose name includes the word "Simple" is anything but. So, waiting for SMCP to appear…
divan•1h ago
No, the letter S in MCP is reserved for "Security")
yjftsjthsd-h•1h ago
I dunno, SMTP wasn't bad last time I had to play with it. In actual use it wasn't entirely trivial, but most of that happened at layers that weren't really the mail transfer protocol's fault (SPF et al.). Although, I'm extremely open to that being one exception in a flood of cases where you are absolutely correct :)
sirtaj•1h ago
I recall two SOAP-based services refusing to talk to each other because one nicely formatted the XML payload and the other didn't like that one bit. There is a lot we lost when we went to JSON, but no, I don't look back at that stuff with any fondness.
cyberax•1h ago
This is a very hilarious but apt SOAP description: https://harmful.cat-v.org/software/xml/soap/simple

And I actually like XML-based technologies. XML Schema is still unparalleled in its ability to compose and verify the format of multiple document types. But man, SOAP was such a beast for no real reason.

Instead of a simple spec for remote calls, it turned into a spec that described everything and nothing at the same time. SOAP supported all kinds of transport protocols (SOAP over email? Sure!), RPC with remote handles (like CORBA), regular RPC, self-describing RPC (UDDI!), etc. And nothing worked out of the box, because the nitty-gritty details of authentication, caching, HTTP response code interoperability and other "boring" stuff were just left as an exercise to the reader.

AnotherGoodName•1h ago
I'll give a different viewpoint and it's that I hate everything about XML. In fact one of the primary issues with SOAP was the XML: it never worked well across SOAP libraries. E.g. the .NET and Java SOAP libraries have huge Stack Overflow threads asking "why is this incompatible", and a whole lot of needing to very tightly specify the schema. That was itself a flaw: it might sound reasonable to tightly specify something, but it got to the point where there were no reasonable common defaults, hence our complaints about SOAP verbosity and the work needed to make it function.

Part of this is the nature of XML. There's a million ways to do things. Should some data be parsed as an attribute of the tag? Should it be another tag? Perhaps the data should be in the body between the tags? HTML, a close cousin of XML, has this problem; e.g. you can seriously specify <font face="Arial">text</font> rather than have the font as a property of the wrapping tag. There's a million ways to specify everything and anything, and that's why it makes a terrible data format. The reader and writer must have the exact same schema in mind, and there's no way to have a default when there's simply no particular correct way to do things in XML. So everything had to be very, very precisely specified, to the point it added huge amounts of work where a non-XML format with decent defaults would not.

This became a huge problem for SOAP and is why I hate it. Every implementation had different default ways of handling even the simplest data structures passing between them, and they were never compatible unless you took weeks of time to specify the schema down to a fine-grained level.

In general XML is problematic due to the lack of clear canonical ways of doing pretty much anything. You might say "but I can specify it with a schema", and to that I say: my problem with XML is that you need a schema for even the simplest use case in the first place.
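To illustrate: the same single field, three perfectly legal encodings (hypothetical document, no schema to arbitrate):

  <user name="Ada"/>                  <!-- as an attribute -->
  <user><name>Ada</name></user>       <!-- as a child element -->
  <user>Ada</user>                    <!-- as text content -->

A reader and a writer that pick different rows never interoperate, which is exactly why every SOAP integration started with weeks of schema work.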

cyberax•1h ago
Yes, XML has way too much flexibility, with some very dark corners like custom entities, DTDs, and BOMs (byte order marks). It's clearly a child of the '90s, conceived before UTF-8 and before the corrosive world of modern networks.

But parts of XML infrastructure were awesome. I could define a schema for the data types, and have my IDE auto-complete and validate the XML documents as I typed them. I could also validate the input/output data and provide meaningful errors.

And yeah, I also worked with XML and got burned many times by small incompatibilities that always happen due to its inherent complexity. If XML were just a _bit_ simpler, it could have worked so much better.

pjmlp•1h ago
I have plenty of good stuff to say, especially since REST (really JSON-RPC in practice) and GraphQL seem to always be catching up to features the whole SOAP and SOA ecosystems already had.

Unfortunately as usual when a new technology cycle comes, everything gets thrown away, including the good parts.

jchw•50m ago
Agreed. In practice, SOAP was a train wreck. It's amazing how overly complicated they managed to make concepts that should've been simple, all the way from XML itself somehow being radically more complex than it looks, to the wacky world of ill-defined standards for things like WSDLs, to weird usage of multipart HTTP. And to top it all off, it was all for nothing, because you couldn't guarantee that a SOAP server written in one language would be interoperable with clients in other languages. (I don't remember exactly what went wrong, but I hit issues trying to use a SOAP API powered by .NET from a Java client. I feel like that should be a pretty good case!)

It doesn't take very long for people to start romanticizing things as soon as they're not in vogue. Even when the painfulness is still fresh in memory, people lament over how stupid new stuff is. Well I'm not a fan of schemaless JSON APIs (I'm one of those weird people that likes protobufs and capnp much more) but I will take 50 years of schemaless JSON API work over a month of dealing with SOAP again.

chasd00•24m ago
It’s been a while but isn’t soap just xml over http-post? Seems like all the soap stuff I’ve done is just posting lots of xml and getting lots of xml back.

/“xml is like violence, if it’s not working just use more!”

rcxdude•17m ago
If it was some vaguely sensibly defined XML, it wouldn't be quite as bad. But it's a ludicrously over-complicated mapping between the service definition and the underlying XML, often auto-generated by a bunch of not very well designed nor compatible tooling.
dragonwriter•12m ago
> It’s been a while but isn’t soap just xml over http-post?

No.

SOAP uses that, but SOAP involves a whole lot of spec about how you do that, and that's even before (as the article seems to) treat SOAP as meaning SOAP + the set of WS-* standards built around it.

ohdeargodno•18m ago
Parsing SOAP responses on memory limited devices is such a fun experiment in just how miserable your life can get.
btown•1h ago
If you want the things mentioned in this article, I highly recommend looking at https://github.com/modelcontextprotocol/modelcontextprotocol... and https://modelcontextprotocol.io/community/sep-guidelines and participating in the specification process.

Point-by-point for the article's gripes:

- distributed tracing/telemetry - open discussion at https://github.com/modelcontextprotocol/modelcontextprotocol...

- structured tool annotation for parallelizability/side-effects/idempotence - this actually already exists at https://modelcontextprotocol.io/specification/2025-06-18/sch... but it's not well documented in https://modelcontextprotocol.io/specification/2025-06-18/ser... - someone should contribute to improving this! (See the sketch after this list.)

- a standardized way in which the costs associated with an MCP tool call can be communicated to the MCP Client and reported to central tracking - nothing here I see, but it's a really good idea!

- serialization issues e.g. "the server might report a date in a format unexpected by the client" - this isn't wrong, but since the consumer of most tool responses is itself an LLM, there's a fair amount of mitigation here. And in theory an MCP Client can use an LLM to detect under-specified/ambiguous tool specifications, and could surface these issues to the integrator.
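Here is a sketch of those annotations on a tool definition (field names as I recall them from the current schema; the tool itself is hypothetical). Note they are explicitly hints, not enforced contracts:

  const tool = {
    name: "delete_record",                // hypothetical tool
    description: "Delete a record by id",
    inputSchema: {
      type: "object",
      properties: { id: { type: "string" } },
      required: ["id"],
    },
    annotations: {
      readOnlyHint: false,                // has side effects
      destructiveHint: true,              // may irreversibly delete data
      idempotentHint: true,               // repeating the call adds no new effect
      openWorldHint: false,               // doesn't reach arbitrary external systems
    },
  };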

Now, I can't speak to the speed at which Maintainers and Core Maintainers are keeping up with the community's momentum - but I think it's meaningful that the community has momentum for evolving the specification!

I see this post in a highly positive light: MCP shows promise because you can iterate on these kinds of structured annotations, in the context of a community that is actively developing their MCP servers. Legacy protocols aren't engaging with these problems in the same way.

calvinmorrison•1h ago
MCP, aka, WSDL for REST
SigmundA•56m ago
I thought that was OpenAPI?
zwaps•1h ago
The author seems to fundamentally misunderstand how MCPs are going to be used and deployed.

This is really obvious when they talk about tracing and monitoring, which seem to be the main points of criticism anyway.

They bemoan that they can't trace across MCP calls, assuming somehow there would be a person administering all the MCPs. Of course each system has tracing in whatever fashion fits it. They are just not the same system, nor owned by the same people, let alone the same companies.

Same with monitoring cost. Oh, you can’t know who racked up the LLM costs? Well of course you can; these systems are already in place and there are a million ways to do this. It has nothing to do with MCP.

Reading this, I think it's rather a blessing to start fresh and without the learnings of 40 years of failed protocols or whatever.

ComplexSystems•1h ago
I thought this article was going to be a bunch of security theater nonsense - maybe because of the relatively bland title - but after reading I found it to be incredibly insightful, particularly this:

> MCP discards this lesson, opting for schemaless JSON with optional, non-enforced hints. Type validation happens at runtime, if at all. When an AI tool expects an ISO-8601 timestamp but receives a Unix epoch, the model might hallucinate dates rather than failing cleanly. In financial services, this means a trading AI could misinterpret numerical types and execute trades with the wrong decimal precision. In healthcare, patient data types get coerced incorrectly, potentially leading to wrong medication dosing recommendations. Manufacturing systems lose sensor reading precision during JSON serialization, leading to quality control failures.

Having worked with LLMs every day for the past few years, it is easy to see every single one of these things happening.
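The timestamp case doesn't even need an LLM to go wrong; the substrate is already ambiguous. In JavaScript, for instance:

  // Unix timestamps are usually seconds; JavaScript's Date constructor takes
  // milliseconds. A schemaless handoff can't tell you which one you received.
  const t = 1723161600;                 // seconds: 2024-08-09T00:00:00Z
  new Date(t);                          // 1970-01-20T... read as ms, silently wrong
  new Date(t * 1000);                   // 2024-08-09T... what was meant
  new Date("2024-08-09T00:00:00Z");     // ISO-8601: unambiguous

An LLM sitting between two such systems just adds a confident narrator to the confusion.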

I can practically see it playing out now: there is some huge incident of some kind, in some system or service with an MCP component somewhere, with some elaborate post-mortem revealing that some MCP server somewhere screwed up and output something invalid, the LLM took that output and hallucinated god knows what, its subsequent actions threw things off downstream, etc.

It would essentially be a new class of software bug caused by integration with LLMs, and it is almost sure to happen when you combine it with other sources of bug: human error, the total lack of error checking or exception handling that LLMs are prone to (they just hallucinate), a bunch of gung-ho startups "vibe coding" new services on top of the above, etc.

I foresee this being followed by a slew of Twitter folks going on endlessly about AGI hacking the nuclear launch codes, which will probably be equally entertaining.

throwawaymaths•1h ago
I mean, isn't all this stuff up to the MCP author: return a reasonable error to the agent and ask it to repeat the call with amendments to the JSON?
dotancohen•1h ago
Yes. And this is where culture comes in. The cultures of discipline in the C++ and JavaScript communities sit at opposite ends of the spectrum. The concern here is that the culture of interfacing with AI tools, such as MCP, is far closer to the discipline of the JavaScript community than to that of the C++ community.
nativeit•1h ago
What's your point? It's up to a ship's captain to keep it afloat, doesn't mean the hundreds of holes in the new ship's hull aren't relevant.
stouset•43m ago
This is no different than the argument that C is totally great as long as you just don’t make mistakes with pointers or memory management or indexing arrays.

At some point we have to decide as a community of engineers that we have to stop building tools that are little more than loaded shotguns pointed at our own feet.

andersa•31m ago
It's clearly a much better design if the shotguns are pointed at someone else's feet.
cwilkes•23m ago
This implies that the input process did a check when it imported the data from somewhere else.

GIEMGO: garbage in, even more garbage out.

tomrod•1h ago
We already have PEBKAC - problem exists between chair and keyboard.

LLMs are basically automating PEBKAC

cle•55m ago
I don't understand this criticism by the author. MCP supports JSON Schema, and server responses must conform to the schema. If the schema requires an ISO-8601 timestamp (e.g. by specifying a "date" format in the schema) but the server sends a Unix epoch timestamp, then it is violating the protocol.

The author even later says that MCP supports JSON Schema, but also claims "you can't generate type-safe clients". Which is plainly untrue; there exist plenty of JSON Schema code generators.
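And validation is cheap to actually run. A minimal sketch with Ajv, a widely used JSON Schema validator (note "date-time" checking requires the formats plugin):

  import Ajv from "ajv";
  import addFormats from "ajv-formats";

  const schema = {
    type: "object",
    properties: { executedAt: { type: "string", format: "date-time" } },
    required: ["executedAt"],
  };

  const ajv = new Ajv();
  addFormats(ajv);                        // without this, "format" is ignored
  const validate = ajv.compile(schema);

  validate({ executedAt: "2025-08-09T12:00:00Z" }); // true
  validate({ executedAt: 1723161600 });             // false: epoch int rejected

If a client skips this step, that's a client bug, not a hole in the protocol.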

ohdeargodno•26m ago
Except that any properly written software will respond to protocol and schema violations by throwing an error.

Claude will happily cast your int into a 2023 Toyota Yaris and keep on hallucinating things.

starkparker•11m ago
For the curious:

> Cast an integer into the type of a 2023 Toyota Yaris using Javascript

(GPT-4o mini)

> To cast an integer into the type of a 2023 Toyota Yaris in JavaScript, you would typically create a class or a constructor function that represents the Toyota Yaris. Then, you can create an instance of that class using the integer value. Here's an example of how you might do this:

  // Define a class for the 2023 Toyota Yaris
  class ToyotaYaris {
      constructor(modelYear, value) {
          this.modelYear = modelYear;
          this.value = value;
      }
  
      displayInfo() {
          console.log(`Model Year: ${this.modelYear}, Value: ${this.value}`);
      }
  }
  
  // Function to cast an integer into a Toyota Yaris object
  function castToYaris(integerValue) {
      const modelYear = 2023; // Set the model year
      return new ToyotaYaris(modelYear, integerValue);
  }
  
  // Example usage
  const integerValue = 20000; // Example integer value
  const yaris = castToYaris(integerValue);
  yaris.displayInfo(); // Output: Model Year: 2023, Value: $20000
cle•10m ago
I just tried this in Claude Code. I made an MCP server whose tool output is declared as an integer but it returns a string at runtime.

Claude Code validated the response against the schema and did not accept the response.

     test - test_tool (MCP)(input: "foo")
      ⎿  Error: Output validation error: 'bar' is not of type 'integer'
cookiengineer•36m ago
Let's put it this way:

Before 2023 I always thought that all the bugs and glitches of technology in Star Trek were totally made up and would never happen this way.

Post-LLM I am absolutely certain that they will happen exactly that way.

I am not sure what LLM integrations have to do with engineering anymore, or why it makes sense to essentially put all your company's infrastructure under external control. And that is not even scratching the surface of the lack of reproducibility at every single step of the way.

"It somehow works" isn't engineering.

avereveard•12m ago
MCP focuses on transport and managing context, and doesn't absolve the user from sensibly implementing the interface (i.e. defining a schema and doing schema validation).

this is like saying "HTTP doesn't do json validation", which, well, yeah.

hinkley•9m ago
> In healthcare, patient data types get coerced incorrectly, potentially leading to wrong medication dosing recommendations.

May have changed, but unlikely. I worked with medical telemetry as a young man and it was impressed upon me thoroughly how important parsing timestamps correctly was. I have a faint memory, possibly false, of this being the first time I wrote unit tests (and without the benefit of a test framework).

We even accounted for lack of NTP by recalculating times off of the timestamps in their message headers.

And the reasons I was given were incident review as well as malpractice cases. A drug administered three seconds before a heart attack starts is a very different situation than one administered eight seconds after the patient crashed. We saw recently with the British postal service how lives can be ruined by bad data, and in medical data a minute is a world of difference.

oblio•8m ago
We keep repeating this.

When desktop OSes came out, hardware resources were scarce, so all the desktop OSes (DOS, Windows, MacOS) forgot the lessons from Unix: multi-user support, preemptive multitasking, etc. 10 years later PC hardware was faster than workstations from the 90s, yet we're still stuck with OSes riddled with limitations that stopped making sense in the 80s.

When smartphones came out there was this gold rush and hardware resources were scarce so OSes (iOS, Android) again forgot all the lessons. 10 years later mobile hardware was faster than desktop hardware from the 00s. We're still stuck with mistakes from the 00s.

AI basically does the same thing. It's all led by very bright 20- and 30-year-olds who weren't even born when Windows was first released.

Our field is doomed under a Cascade of Attention-Deficit Teenagers: https://www.jwz.org/doc/cadt.html (copy paste the link).

It's all gold rushes, and nobody does Dutch urban infrastructure design over decades. Which makes sense, as this is all driven by the US, where long-term planning is anathema.

BLanen•1h ago
As I've been saying.

MCP is not a protocol. It doesn't protocolize anything of use. It's just "here's some symbols, do with them whatever you want", leaving it at that but then advertising that as a feature of its universality. It provides almost just as much of a protocol as TCP, but rebuilt on top of five OSI layers, again.

It's not a security issue, it's an ontological issue.

lsaferite•30m ago
And yet, TCP powers the Internet.

That being said, MCP as a protocol has a fairly simple niche: provide context that can be fed to a model to perform some task. MCP covers the discovery process around presenting those tools and resources to an Agent in a standardized manner. And it includes several other aspects that are useful in this niche, things like "sampling" and "elicitations". Is it perfect? Not at all. But it's a step in the right direction.

The crowd saying "just point it at an OpenAPI service" does not seem to fully understand the current problem space. Can many LLMs extract meaning from un-curated API response messages? Sure. But they are also burning up context holding junk that isn't needed. Part of MCP is the acknowledgement that general API responses aren't the right way to feed the model the context it needs. MCP is supposed to take a concrete task, perform all the activities needed to gather the info or effect the change, then generate clean context meant for the LLM. If you design an OpenAPI service around those same goals, then it could easily be added to an Agent. You'd still need to figure out all the other aspects, but you'd be close. But at that point you aren't pointing an Agent at a random API, you're pointing it at a purpose-made API. And then you have to wonder: why not something like MCP that's designed for that purpose from the start?
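To sketch what I mean by purpose-made (the API and its fields here are invented):

  // Curate, don't proxy: call the upstream API, then hand the model only what
  // it needs as clean text, instead of burning context on the raw payload.
  async function getOrderStatus(orderId: string) {
    const res = await fetch(`https://api.example.com/orders/${orderId}`);
    const order = await res.json();   // dozens of fields: internal ids, audit noise...
    return {
      content: [{
        type: "text",
        text: `Order ${orderId}: status=${order.status}, eta=${order.eta}`,
      }],
    };
  }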

I'll close by saying there are an enormous number of MCP Servers out there that are poorly written, thin wrappers on general APIs, or have some other bad aspects. I attribute a lot of this to the rise of AI Coding Agents allowing people with poor comprehension of the space to crank out this... noise.

There are also great examples of MCP Servers to be found. They are the ones that have thoughtful designs, leverage the spec fully, and provide nice clean context for the Agent to feed to the LLM.

I can envision a future where we can simply point an agent at a series of OpenAPI services and the agent uses its models to self-assemble what we consider the MCP server today. Basically it would curate access to the APIs into a set of focused tools and the code needed to generate the final context. That's not quite where we are today. It's likely not far off though.

dragonwriter•1h ago
> MCP discards this lesson, opting for schemaless JSON with optional, non-enforced hints.

Actually, MCP uses a normative TypeScript schema (and, from that, an autogenerated JSON Schema) for the protocol itself, and the individual tool calls also are specified with JSON Schema.

> Type validation happens at runtime, if at all.

That's not a consequence of MCP "opting for schemaless JSON" (which it factually does not do); for tool calls, it's a consequence of MCP being a discovery protocol where the tools, and thus the applicable schemas, are discovered at runtime.

If you are using MCP as a way to wire up highly static components, you can do discovery against the servers once they are wired up, statically build the clients around the defined types, and build your toolchain to raise errors if the discovery responses change in the future. But that's not really the world MCP is built for. Yes, that means that a toolchain concerned about schema enforcement needs to use and apply the relevant schemas at runtime. So, um, do that?
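Concretely, "do that" is about a dozen lines in a client (a sketch; rpc is a stand-in for whatever JSON-RPC transport you use, and Ajv is one validator choice among many):

  import Ajv from "ajv";

  const ajv = new Ajv();

  // Discover the tools once, compile each advertised inputSchema, and refuse
  // any tools/call whose arguments fail validation.
  async function buildValidators(
    rpc: (method: string, params?: unknown) => Promise<any>,
  ) {
    const { tools } = await rpc("tools/list");
    const validators = new Map<string, ReturnType<typeof ajv.compile>>();
    for (const tool of tools) {
      validators.set(tool.name, ajv.compile(tool.inputSchema));
    }
    return validators;
  }
  // Before dispatching tools/call: if the tool's validator rejects the
  // arguments, raise an error instead of sending.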

GeneralMayhem•1h ago
> MCP promises to standardize AI-tool interactions as the “USB-C for AI.”

Ironically, it's achieved this - but that's an indictment of USB-C, not an accomplishment of MCP. Just like USB-C, MCP is a nigh-universal connector with very poorly enforced standards for what actually goes across it. MCP's inconsistent JSON parsing and lack of protocol standardization is closely analogous to USB-C's proliferation of cable types (https://en.wikipedia.org/wiki/USB-C#Cable_types); the superficial interoperability is a very leaky abstraction over a much more complicated reality, which IMO is worse than just having explicitly different APIs/protocols.

cnst•17m ago
I'd like to add that the culmination of USB-C failure was Apple's removal of USB-A ports from the latest M4 Mac mini, where identical-looking ports on the exact same device now have vastly different capabilities, opaque to the final user of the system once we're months past the initial hype of the release date.

Previously, you could reasonably expect a USB-C port on an Apple Silicon desktop or laptop to be USB4 40Gbps Thunderbolt, capable of anything and everything you might want to use it for.

Now, some of them are USB3 10Gbps. Which ones? Gotta look at the specs or tiny icons, I guess?

Apple could have chosen to have the self-documenting USB-A ports to signify the 10Gbps limitation of some of these ports (conveniently, USB-A is limited to exactly 10Gbps, making it perfect for the use-case of having a few extra "low-speed" ports at very little manufacturing cost), but instead, they've decided to further dilute the USB-C brand. Pure innovation!

With the end user likely still having to use USB-C to USB-A adapters anyway, because the majority of thumb drives, keyboards and mice still require a USB-A port — even the USB-C ones that use USB-C on the other end of the USB-A connector. (But, of course, that's irrelevant for an Apple fan, because you can always spend 10x as much for a USB-C version of any of these devices, and the fact that the USB-C variants are less common or inferior to USB-A is of course irrelevant when hype and fanaticism are more important than utility, usability and logic.)

afeuerstein•16m ago
Yeah, I laughed out loud when I read that line. Mission accomplished, I guess?
zombiwoof•1h ago
Basically a bunch of vibe coders at an Anthropic hackathon used Claude to poop out MCP.
ipython•59m ago
I am torn. I see this argument and intellectually agree with it (that interfaces need to be more explicit). However it seems that every time there is a choice between “better” design and “good enough”, the “good enough” wins handily.

Multics vs Unix, XML-based SOAP vs JSON-based REST APIs, XHTML's failure, JavaScript itself, … I could keep going.

So I’ve resigned myself to admitting that we are doomed to reimplement the “good enough” every time, and continue to apply bandaid after bandaid to gradually fix problems after we rediscover them, slowly.

antonvs•24m ago
It's the old Worse is Better observation, which is 36 years old now:

https://en.m.wikipedia.org/wiki/Worse_is_better

It's been confirmed over and over since then. And I say that as someone who naturally gravitates towards "better" solutions.

cookiengineer•11m ago
Obligatory minute of silence for XForms 2.0.

The world we could have lived in... working web form validation, working microdata...

mac-mc•54m ago
You're missing the most significant lesson of all, the one MCP knew: all of those featureful things are way too overcomplicated for most places, so people will gravitate to the simple thing. It's why JSON-over-HTTP blobs are king today.

I've been on the other side of high-feature serialization protocols, and even at large tech companies, something like migrating to gRPC is a multi-year slog that can even fail a couple of times because it asks so much of you.

MCP, at its core, is a standardization of a JSON API contract, so you don't have to do as much post-training to generate various tool calling style tokens for your LLM.
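Roughly: one descriptor shape that every client can mechanically translate into its model's native tool-calling format (the tool is invented; the OpenAI-style shape is paraphrased):

  // One MCP-style descriptor...
  const weatherTool = {
    name: "get_weather",
    description: "Current weather for a city",
    inputSchema: {
      type: "object",
      properties: { city: { type: "string" } },
      required: ["city"],
    },
  };

  // ...mechanically mapped to an OpenAI-style function tool. Other vendors'
  // formats are similar renames, which is the point of the shared contract.
  const openAiStyle = {
    type: "function",
    function: {
      name: weatherTool.name,
      description: weatherTool.description,
      parameters: weatherTool.inputSchema, // JSON Schema passes straight through
    },
  };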

prerok•47m ago
What are HTTP blobs?

I think you meant that is why JSON won instead of XML?

mac-mc•34m ago
JSON-over-HTTP blobs. Or blobs of schemaless JSON.

Not just XML, but a lot of other serialization formats and standards, like SOAP, protobuf in many cases, YAML, REST, etc.

People say REST won, but tell me how many places actually implement REST, rather than just using it as a stand-in term for sending casual JSON blobs to HTTP URLs?

upghost•37m ago
So I'm in the "MCP is probably not a great idea" camp, but I couldn't say "this is how it SHOULD be done", and the author makes great criticisms but falls short of actual suggestions. I'm assuming the author is not seriously recommending we go back to SOAP, and I've never heard of CORBA. I've heard of gRPC, but I can't tell if the author is saying it is good or bad.

Also, Erlang uses RPCs for pretty much all "synchronous" interactions, but it's pretty minimal in terms of ceremony. Seems pretty reliable.

So this is a serious question, because hand-rolling "40 years" of best practices seems hard: what should we be using for RPC?