RIIJ was justified too, because people believed the web would end up being Java applets all the way down.
I think Java helped mainstream memory-safe, garbage-collected languages, especially in more corporate spaces where C/C++ was still mostly the norm.
Certainly sysadmins were using a lot of Perl during that time, but for "real" enterprise software, I don't think dynamic-ish languages were as accepted. The use of Perl and rise of Python widened the Overton window.
The hype accomplished something that would be otherwise impossible: it established Java as a language in what was likely record time. Consider another popular language: Python. It was created about 5 years earlier, yet it rose to prominence about a decade later. Or consider Rust. It is, in many respects, as significant as Java. While major developers were shipping significant applications written in Java within 5 years, Rust is only creeping into important software a decade after its introduction.
The second point is it's easy to underestimate the dominance of Microsoft in those days. You think that Microsoft is dominant today? Well, that's nothing compared to the late 1990s. Microsoft's market share was closer to 95% of the PC market. The workstation market was starting to crumble due to competition from Microsoft and Intel. About the only thing that was safe was mainframes, and that was highly dependent upon one's definition of safe. Nearly everyone who was competing against Microsoft wanted to see a chunk taken out of them, which meant pretty much everyone, since Microsoft had its fingers in so many markets. And, as it turns out, nearly everything did have to be rewritten. Sometimes it was to deliver SaaS over the Internet and sometimes it was to target mobile devices.
> The second point is it's easy to underestimate the dominance of Microsoft in those days. You think that Microsoft is dominant today? Well, that's nothing compared to the late 1990s. Microsoft's market share was closer to 95% of the PC market.
By the late 1990s Linux had become a viable platform for a whole lot of things, and people were beginning to take notice. Most obviously, that probably put a big dent into the adoption of Windows NT as a server OS on x86 machines, which had been progressing quite well until the mid 1990s. That also probably helped Java because it meant you could seamlessly run your server workloads on "toy" x86 machines or on more "serious" platforms, without changing anything else.
I predict we will be having buffer overrun CVEs in C/C++ code for as long as we have C/C++ code.
The realities of writing safe, multithreaded C/C++ on processors with out-of-order execution, context switching, and branch prediction are simply too complex to get right 100% of the time. Rust makes writing certain code difficult because it is difficult to do; C/C++ fools you into believing something is safe because you've never encountered the circumstances where it isn't.
We have tools like valgrind to try and identify such issues. They're certainly useful. But you'll be constantly chasing rabbits.
I've seen thread and memory bugs in production code written by smart, highly-paid engineers at big tech companies that have lain dormant for the better part of a decade.
That's why Rust exists.
There's another perspective. Many people were looking for something like Java well before it was released: VM-based, portable, modern object-orientation features, etc.
Case in point: databases. In the early 1990s I worked on a project at Sybase that attempted to rewrite SQL Server from the ground up to bring in object [relational] support. The team focused on VM-based languages as a foundation, which were an area of active academic research at the time. Built-in object support, portability, and ability to support code generation for queries were among the attractions. The project started with Smalltalk (slow!), then moved to an acquired VM technology (it was bad!), and finally a VM we designed and built ourselves. These gyrations were a key reason why the project failed, though not the only one.
When Java came out in 1995--I got access to the alpha release in September--it met virtually every requirement we were trying to fulfill. At that point most attempts to build new databases on other VM tech became instantly obsolete. (Other vendors were looking at VMs as well.)
Not coincidentally Nat Wyatt and Howard Torf, a couple of key engineers from our project, founded a start-up called Cloudscape to pursue the Java route. They created the database we know today as Derby.
Somewhat more coincidentally, Java became dominant in American DBMS development after 2000; Hadoop, Druid, Pinot, and HBase are just a few of the examples. I say "somewhat more coincidentally" because at that point most of us saw Java as simply more productive than C/C++ alternatives for building reliable, high-performance, distributed systems. That view has obviously evolved over time, but between JIT, dev tooling, and libraries it was definitely true in the early 2000s. It helps to remember how difficult C++ was to use at that time to understand this perspective.
In summary, a lot of the hype was the usual new technology craziness, but Java also met the needs of a population of people that went far beyond databases. There was a basis for our excitement. Just my $0.02.
Edit: typo
And it was just about shoved down our throats. They paid to get it into schools. They paid for ads on TV that just vaguely said something about Java being good, because they didn't really have anything concrete they could point to yet. They paid to have really bad enterprise software written in it and then jammed into schools just to make sure we had a bad experience, like Rational Rose [1]... my memory may be failing me but I think it was implemented in Java at the time, because it was a Swing app (another Java thing shoved down our throats but not ready for prime time even by the standards of 1997). I was using it as an undergrad student in 1999 or so and I could hardly click on a thing without crashing it. Not the best look for Java, though I'm sure it was not Java qua Java's fault.
Still, it fits the pattern I'm trying to show here of it being grotesquely hyped beyond its actual capabilities.
They shoved enough money at it that they did eventually fix it up, and the hardware caught up in the 2000s, so it became a reasonable choice. Java isn't my favorite language and I still try to avoid it, but in 2025 that's just taste and personal preference, not because I think it's completely useless. But I feel bad for anyone in the 1990s ordered by corporate mandate to write their servers in Java because the ads looked cool or because Sun was paying them off. It must have been a nightmare.
In fact, you can understand the entire Dot Com era hype as selling the internet of about 2007 in 1997, or in some cases even 2017. It all happened, but it didn't all happen in the "year or two" that the stock valuations implied.
Outside of Perl's CPAN, library support in 1997 sucked for all languages. Being able to write a hash table or linked list in C was a valuable commercial skill, as nearly every code base would include custom versions of these very basic data structures rather than pull them from a commonly used library.
“Using a 3rd party library” meant copying a bunch of source code downloaded from who-knows-where into your source control repo and hacking it to work with whatever funky compiler and/or linker your project used.
I know that it wasn't like it is today, where a casual weekend hobby project can easily pull in a few hundred libraries with just a couple of shell commands, but you still needed some things to get going. It was theoretically possible to sit down with a blank text editor and write assembly code that functioned as a GUI app; the last few dying gasps of that philosophy were still around, but it's not what most people did.
I think in the context of the time, Java was simply following the least-common-denominator approach common for cross-platform GUI toolkits back then - Tcl/Tk was hot, as were commercial products like PowerBuilder.
This approach was to only include features/functionality which mapped directly to native widgets on all supported platforms - Mac, Windows, and some X11 toolkit. There were convincing arguments for it at the time - you got native look and behavior from the same code on every platform Java ran on, like with Tk - but it quickly became apparent that this approach was a technological cul-de-sac, limited to simple dialogs and form-based GUIs.
I know you didn't say otherwise, but for anyone who wasn't there it should be emphasised that many of the deficiencies were core-language deficiencies, not just library issues. Java people would blame the customers https://people.csail.mit.edu/gregs/ll1-discuss-archive-html/... https://people.csail.mit.edu/gregs/ll1-discuss-archive-html/... and a rush to market http://www.blinkenlights.com/classiccmp/javaorigin.html (in a self-congratulatory way, of course), but it's also pretty clear that the Java team itself had significantly overestimated how capable and sufficient core Java 1 was. In fact, writing out the standard library was evidently an important learning experience there, one which gave birth to the Java-Hater's Handbook https://wiki.c2.com/?EffectiveJava . And before things eventually got better there was of course lots of hype first, about how Java was a shining jewel of minimalism, and then about how it was a fine language for plain everyday people who had no truck with fancy abstractions.
Did you just look three years into the future and write about the GenAI hype?
definition #1 is about Java features: The original "Java is criminally underhyped" essay by Jackson Roberts is talking about "not over-hyped" in terms of Java's technical capabilities ... such as types and package manager, etc. E.g. Java has types, which Javascript/Python do not, and typing is a positive thing that helps prevent errors -- therefore -- Java is "underhyped". The particular language capability not being used as much as the author thinks it should be is the basis for defining what "hype" is.
definition #2 is about Java's marketplace effect: The "overhype" of Java in the 1990s was extrapolating and predicting Java's effect on the whole computing landscape. This type of "hype" is overestimating the benefits of Java and making bold proclamations. Examples:
- Java and the JVM's WORA "Write Once Run Anywhere" will kill Microsoft's Evil Empire because it will render Windows irrelevant. (This didn't happen; MS Windows still has 70+% desktop market share today. 30 years later, Microsoft is one of the top 3 tech companies with a $3+ trillion market cap, while Sun Microsystems was acquired at a discount by Oracle.)
- Java will make lower level languages with manual memory allocation like C/C++ obsolete because CPUs are getting faster. Let the extra "unused" cpu cycles do automatic garbage collection in Java instead of C programmers manually managing memory with malloc()/free(). (C/C++ is still used today for games, and tight loops of machine learning libs underneath Python.)
- Java plugins will enable "rich" web experiences. (It turns out that Javascript and not Java plugins won the web. Java also didn't win on desktop apps. Javascript+Electron is more prevalent.)
That's the type of overhype that Java failed to deliver.
Same situation with today's AI. Some aspects of AI will absolutely be useful but some are making extravagant extrapolations (i.e. "AI LLM hype") that will not come true.
Except that this actually happened wrt. a whole lot of application code. Sure, Java was slow and clunky but at least it was free of the memory unsafety that plagued C/C++. What was the mainstream "safe" alternative? There was no Rust back then, even Cyclone (the first memory-safe C style language) was only released in the mid-2000s.
Before Sun Java in 1995, companies built enterprise CRUD apps with "safe" memory languages using MS Visual Basic, PowerBuilder, and xBase languages like dBASE and FoxPro. This allowed them to develop line-of-business apps without manually managing memory in C/C++.
They claimed they fixed it a dozen times, starting with JDK 1.4, and continued to claim that with every major release since.
There was that sentiment, but this doesn't fully capture what the hype was about. Java out of the gate had Sun's network vision built in via JNDI, RMI, and object serialization. The hype was about moving applications onto the network and off of Windows or any particular vendor's OS.
And this did come to pass, just not how Sun was selling it with Java. For example, Office, Microsoft's crown jewel and anchor into a lot of organizations, is now almost entirely web-based.
It's kind of obvious since having a standard, platform-neutral virtual machine doesn't just enable WORA; it enables sending binary code and program state over the network, which is quite handy for all sorts of distributed computing flows. We'll probably do the same things using Wasm components once that tech stack becomes established.
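As a concrete illustration, here's a minimal sketch using plain Java serialization; the Task class and the sender/receiver split are made up for the example, and real RMI layers stubs, registries, and remote class loading on top of this:

```java
import java.io.*;

public class StateShipping {
    // Hypothetical bit of "program state" to ship between JVMs.
    static class Task implements Serializable {
        final String query;
        final int priority;
        Task(String query, int priority) { this.query = query; this.priority = priority; }
    }

    public static void main(String[] args) throws Exception {
        // "Sender" side: turn the object into a platform-neutral byte stream.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(new Task("select count(*) from orders", 5));
        }
        byte[] wire = buf.toByteArray(); // in real life this goes over a socket

        // "Receiver" side: rebuild the same state on any JVM, on any OS.
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(wire))) {
            Task t = (Task) in.readObject();
            System.out.println(t.query + " @ priority " + t.priority);
        }
    }
}
```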
Office web is comically slow, even when I have $5K of machines and 100ish GB of DRAM lying around my house.
In the Java vision, it'd transparently offload to that network of machines. Also, the user could decide not to offload to untrusted hardware (e.g., I don't want to trust Microsoft's cloud compute).
1. Desktop applications;
2. Server applications; and
3. Browser applications.
We had more platforms then. On the desktop front, Mac was in decline but still existed and was strong in certain niches. On the server front, there were many UNIX variants (e.g. Solaris, HP-UX, Digital Unix, etc.). Cross-platform really was a big deal and much more relevant.
We still had desktop apps then. Being able to write a Swing app and run it "everywhere" was a big deal. Otherwise you were writing things in things like Visual Basic (and thus Windows-only) or I don't even know what Mac used at the time.
On the server, this was the very early days of Web servers. Netscape still existed and sold Web server software. The most common form of HTTP serving was CGI, which had a significant start-up cost because each request spawned a new process. There were other solutions like ISAPI/NSAPI. Java brought in servlets, which were persistent between requests. That was massive at the time.
It creates problems too but it's all tradeoffs.
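To make that contrast concrete, here's a minimal servlet sketch (a hypothetical example using the modern jakarta.servlet packages; in the late 1990s it would have been javax.servlet). One instance is loaded once and stays resident, so requests skip the per-hit process start that CGI paid; the shared counter also hints at the problems it creates, since state shared across requests means thinking about concurrency:

```java
import java.io.IOException;
import java.util.concurrent.atomic.AtomicLong;

import jakarta.servlet.http.HttpServlet;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;

// One servlet instance serves many requests, so anything stored in a
// field survives between hits (unlike a CGI process, which starts cold
// every time). The atomic counter is needed precisely because requests
// can arrive concurrently.
public class HitCounterServlet extends HttpServlet {
    private final AtomicLong hits = new AtomicLong();

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.setContentType("text/plain");
        resp.getWriter().println("Hits since this servlet was loaded: " + hits.incrementAndGet());
    }
}
```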
And the last is Web applications. This was the newest area and had the most uncertain future. Java applets were pushed as a big deal and were ultimately displaced by Macromedia (then Adobe) Flash, which itself is now (thankfully) dead and we have Javascript applications. That future was far from certain in the 1990s.
I remember seeing demos of Java applets with animations and Microsoft doing the same thing with a movie player of all things.
Single-page applications simply didn't exist yet. If you wanted that, and honestly nobody did, it was a Java applet or maybe a Flash app. The Web was still much more "document" oriented. Serve a page, click something, serve a page for that, and so on.
I still wrote this form of website in the 2000s and it could be incredibly fast: PHP5 + MySQL and a sub-30ms load time. You could barely tell there was a server round trip at all.
So Java still exists, but almost entirely in the server space. It has probably lost ground to other platforms like PHP, Python, Node.js, etc. But it absolutely changed the direction of tech. Java has left a lasting legacy.
I would go as far as saying that Java democratized the Internet. Prior to Java, every solution was commercial and proprietary.
Even now in the Fortune 100 I don't think they use anything other than Java to perform daily tasks. Yeah, they are now more open to using Python for ML-related tasks and Node.js for compiling React and all, but for anything backend it's Java.
To help folks understand your perspective, what would you replace it with?
Combined with a massive branding push — Sun doubled its ad spend from 1995 to 1997 — Java ended up everywhere in CS education. By the late ’90s, first-year courses using Java weren’t a coincidence; they were the result of a planned, top-down push.
I also don't think we should blame Java the language for the OOP insanity that also infected C++, Delphi, etc. It was an industry-wide insanity that thought you could replace pesky programmers with a single god architect.
Also, most large enterprises need distributed transactions, as they use multiple databases and message queues in a monolith architecture - and no other language has libraries/tooling for that which can beat Java's.
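For what it's worth, a hedged sketch of what that looks like on the JVM: the standard JTA UserTransaction API spanning two data sources and a message queue managed by the application server. The JNDI name follows the Java EE/Jakarta EE convention, and the three helper methods are hypothetical stand-ins for real JDBC/JMS calls:

```java
import javax.naming.InitialContext;
import jakarta.transaction.UserTransaction;

public class TransferService {

    public void transfer() throws Exception {
        // One JTA transaction can span multiple XA data sources and a JMS
        // broker, with the app server coordinating a two-phase commit.
        UserTransaction tx = (UserTransaction)
                new InitialContext().lookup("java:comp/UserTransaction");
        tx.begin();
        try {
            updateLedger();   // write to database A (hypothetical JDBC work)
            updateReplica();  // write to database B (hypothetical JDBC work)
            enqueueEvent();   // send a message      (hypothetical JMS work)
            tx.commit();      // all three commit together, or none do
        } catch (Exception e) {
            tx.rollback();    // nothing is partially applied
            throw e;
        }
    }

    private void updateLedger()  { /* ... */ }
    private void updateReplica() { /* ... */ }
    private void enqueueEvent()  { /* ... */ }
}
```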
One factor in that choice is that Java can run seamlessly and with official support on mainframe and midrange compute platforms that are still quite popular among Fortune 100 and 500 companies. (Many of them are building "private clouds" as a replacement but it's a slow transition.) While you might be able to get other languages to run, sticking to Java is a broadly sensible choice.
Could you elaborate a bit further? People at consulting companies don’t use Go or Rust? Also, do these top Fortune companies recruit from consulting companies often?
https://digitalcareers.infosys.com/infosys/global-careers?lo...
Just search for Rust or Go and you can see why. Infosys employs 350,000 people, almost all of them working for Fortune 500 companies. There is not a single Rust or Go opening from what I can see. Go and Rust did not even make it into the dropdown.
> top Fortune companies recruit from consulting companies often
If you have worked in large banks, pharma, automobile (IT), or FMCGs, you know. There will be a couple of middle managers onsite (i.e. in the US) and the rest of the devs, often hundreds of them, are located offshore (Asia/South America).
The choice of having unsigned types or not is always one of the lesser evil, and in a language where emitting signals directly to hardware ports is not a primary use case, the argument that not having these types is the lesser evil carries a lot of merit.
Java in the meantime has gained all the unsigned operations as methods in the Integer and Long classes, so the relatively rare cases when you need them are straightforward to handle.
The only real annoyance is that byte is signed. At least there’s a bit of unsigned support in the Byte class now.
Lastly, minor point, Java actually has an unsigned 16-bit integer type, called char.
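A small sketch of those helpers in practice; all of these are standard JDK methods since Java 8, and char behaves as the unsigned 16-bit type mentioned above:

```java
public class UnsignedDemo {
    public static void main(String[] args) {
        int x = 0xFFFFFFF0;                        // -16 as a signed int

        // Treat the same 32 bits as unsigned where it matters:
        System.out.println(Integer.toUnsignedString(x));   // "4294967280"
        System.out.println(Integer.toUnsignedLong(x));     // 4294967280
        System.out.println(Integer.divideUnsigned(x, 3));  // 1431655760, not -5
        System.out.println(Integer.compareUnsigned(x, 1)); // > 0: x is "bigger" unsigned

        // char really is an unsigned 16-bit integer type:
        char c = 0xFFFF;
        System.out.println((int) c);               // 65535, never negative
    }
}
```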
Unsigned integer types are only really necessary when dealing with low-level bit manipulation, but most programs don't do this. The lack of unsigned integers makes low-level stuff a bit more difficult, but it makes the language as a whole much easier. It's a good tradeoff.
Keeping unsigned integer types out of the language makes things much simpler, and keeping things simple was an original design goal of Java.
Mandate that two's complement be used.
> Unsigned integer types are only really necessary when dealing with low-level bit manipulation
They also give one more bit of precision, useful when dealing with 32-bit integers (or below)
The risk of unsigned types (even without the C or C++ issues around mixing with signed types) is that too many people make the mistake of using them to express the invariant of "a number that must be positive", which modular arithmetic types are a really bad fit for.
One possible use is for a memory-efficient storage of small positive values, say in a byte. But then you have to make a choice between forcing the value into a signed type for arithmetic (which Java easily lets you do with Byte.toUnsignedInt) and allowing signed and unsigned types to be mixed.
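As a minimal illustration of that first choice (forcing the value into a signed int for arithmetic):

```java
public class ByteStorage {
    public static void main(String[] args) {
        // Compact storage: one byte per value, even for values up to 255.
        byte[] counts = new byte[3];
        counts[0] = (byte) 200;                 // stored as the signed byte -56

        // Force it back into a signed int for arithmetic, as Java makes easy:
        int c = Byte.toUnsignedInt(counts[0]);  // 200 again
        System.out.println(c + 1);              // prints 201
    }
}
```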
However, since then I've seen several surveys of JVM programmers, and apparently Spring is used in something like 80% of Java projects, so it's not surprising that a majority of people, even Java developers, think Spring is somehow mandatory if you're a Java programmer. But of course it's just a framework, one of many; it's just the most popular one. Can you "do JS" professionally without knowing React today? I'd think so. I guess React is about as dominant in the JS world as Spring is in the Java world.
try the inverse and you will cry razor blade tears.
And then, of course, I woke up and smelled the roses, and realized the mess I was making.
Follow the money.
Initially it was from VM licenses: from Netscape for browsers, Oracle for databases, and Borland for IDEs (Borland also wrote the first JIT). But except for databases, they were non-exclusive, and JavaSoft's free offerings undercut their licensees.
Then IBM cut a 10-year deal while Microsoft's license went to court over its attempts to add features to get lock-in. Around this time IBM created the free Eclipse to undercut the IDE market (Borland), with SWT as an alternative to Swing to capture developers for leverage.
But the big money was in enterprise, so J2EE licensing was much more airtight for Oracle, BEA, et al. That was a successful and long-lived franchise that also drove storage and compute revenues.
But people hated the complexity and compute resources and Google and Apple both decided to build rather than buy, so we got Spring, Swift, Go, and the whole native and container ecosystem.
You have to be aggressive and strict in building a monopoly, but you should be gentle and forgiving in maintaining it. Both Microsoft and AWS learned this lesson.
It sounds good.