> Roughly every two years, the density of transistors that can be fit onto a silicon chip doubles.
No. Moore's law is not about density; it's about the number of transistors on a chip. Yes, density increases, but so does die size. Anyway, in Moore's own words: "The complexity for minimum component costs has increased at a rate of roughly a factor of two per year."
http://cva.stanford.edu/classes/cs99s/papers/moore-crammingm...
Effectively, it was always more of a "marketing law" than an engineering one. Semiconductor chips only had 18-36 months to reap big profits, so Intel tried to stay ahead of that curve.
https://hopefullyintersting.blogspot.com/2019/03/what-is-moo...
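For what it's worth, the doubling framing is easy to sanity-check. Here's a rough sketch projecting from the Intel 4004 (1971, ~2,300 transistors) with a two-year doubling period -- the starting count and the modern GPU figure are from memory, so treat the numbers as illustrative:

    # Rough sanity check: transistor count doubling every two years,
    # starting from the Intel 4004 (1971, ~2,300 transistors).
    start_year, start_count = 1971, 2_300
    doubling_period = 2  # years

    for year in (1991, 2011, 2021):
        doublings = (year - start_year) / doubling_period
        projected = start_count * 2 ** doublings
        print(f"{year}: ~{projected:,.0f} transistors")

    # The 2021 projection lands around 7.7e10 -- the same order of
    # magnitude as the largest contemporary GPUs (~8e10 transistors).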
Dual-ported TCAM memory isn't getting faster, we've reached 1,000,000 prefixes in the Internet routing table, and IPv6 prefixes are four times bigger. Memory speed is a real issue.
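Back-of-the-envelope on what that means for raw TCAM capacity (simplified: each entry is modeled as value + mask bits only, ignoring next-hop data and other overhead):

    # Rough routing-table arithmetic. Simplified: a TCAM entry is modeled
    # as prefix value + mask bits only, ignoring next-hop data and overhead.
    def tcam_bits(prefixes, addr_bits):
        return prefixes * addr_bits * 2  # value + mask per entry

    v4 = tcam_bits(1_000_000, 32)    # full IPv4 table, ~1M prefixes
    v6 = tcam_bits(1_000_000, 128)   # same prefix count, 4x wider entries
    print(f"IPv4: {v4 / 8 / 2**20:.1f} MiB of TCAM")
    print(f"IPv6: {v6 / 8 / 2**20:.1f} MiB of TCAM ({v6 // v4}x IPv4)")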
Re garage invention: lithography is probably too big a hurdle for that. It's important to keep in mind the sheer volume of transistors we're currently producing with today's tech. Any alternative would have to match that (e.g. stamping technologies).
(I work on lithography optics)
Ultimately, there's a cap. As far as I know, the universe is finite.
I don't think we know that. We don't even know how big the universe really is - we can only see so far. All we have is a best guess.
There may also be a multiverse out there (or right beside us).
And, creating universes might be a thing.
... I don't expect Moore's law to hold forever either, but I don't believe in creating unnecessary caps.
As I understand it, Moore's Law doesn't address any fundamental physical limitations, other than perhaps an absolute limit on how small an object can be; it's just an observation of the doubling of transistor density over a consistent period of time.
It seems more like an economic or social observation than a physical one to me.
So we may end up with Apple and Nvidia as the only ones that can afford to build a fab. Edit: correction, Microsoft is the current number-two company by market cap.
That's... not how this works at all. As the gate voltage rises, the depletion region -- where the majority charge carriers (positive or negative, for p- or n-doped silicon) are pushed out -- extends far enough that, at the threshold voltage, inversion happens: the opposite sort of charge carrier starts to accumulate along the oxide and allows conduction. By surrounding the channel there's less space for a depletion region, so inversion happens at lower voltages, leading to higher performance. Same as people used to do with silicon-on-oxide.
The Wikipedia article has nice diagrams.
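To make the threshold argument concrete, here's a back-of-the-envelope sketch using the textbook long-channel formulas. The doping, oxide thickness, body thickness, and flat-band voltage are all made-up illustrative values, not real process numbers, and a thin fully-depleted body is only a crude stand-in for a surrounded (gate-all-around) channel:

    import math

    q = 1.602e-19               # electron charge (C)
    kT = 0.0259                 # thermal voltage at 300 K (V)
    eps_si = 11.7 * 8.854e-12   # permittivity of silicon (F/m)
    eps_ox = 3.9 * 8.854e-12    # permittivity of SiO2 (F/m)
    n_i = 1.0e16                # intrinsic carrier density of Si (m^-3)

    N_a = 1e24                  # body doping (m^-3), illustrative
    t_ox = 2e-9                 # gate oxide thickness (m), illustrative
    V_fb = -0.9                 # flat-band voltage (V), illustrative

    phi_f = kT * math.log(N_a / n_i)   # Fermi potential (V)
    c_ox = eps_ox / t_ox               # oxide capacitance per area (F/m^2)

    # Bulk device: the depletion region grows freely until strong inversion.
    q_dep_bulk = math.sqrt(2 * eps_si * q * N_a * 2 * phi_f)

    # Thin, fully surrounded body: "less space for a depletion region" --
    # the depletion charge is capped by the body thickness.
    t_body = 5e-9
    q_dep_thin = q * N_a * t_body

    for label, q_dep in (("bulk", q_dep_bulk), ("thin body", q_dep_thin)):
        v_t = V_fb + 2 * phi_f + q_dep / c_ox
        print(f"{label}: V_T ~ {v_t:.2f} V")

With these numbers the thin body comes out around 0.3 V lower, which is the direction the parent comment describes: constraining the depletion region lowers the voltage at which inversion happens.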
That... isn't Moore's law; it's about count/complexity, not density. And larger chips are a valid way to fulfill it.
https://hasler.ece.gatech.edu/Published_papers/Technology_ov...
https://www.eng.auburn.edu/~agrawvd/COURSE/E7770_Spr07/READ/...
> the density of transistors that can be fit onto a silicon chip doubles
The whole article takes off from a flawed and fanciful misinterpretation and argues against that self-created windmill.
This seems improbable.
50-year-old technology works because 50 years ago, transistors were micron-scale.
Nanometer-scale nodes wear out much more quickly. Modern GPUs have a rated lifespan in the 3-7 year range, depending on usage.
One of my concerns is that we're reaching a point where the loss of a fab due to a crisis -- war, natural disaster, etc. -- may cause systemic collapse. You can plot the lifespan of chips against the time to bring a new fab online. Those lines are just around the crossing point; modern electronics would start to fail before we could produce more.
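The crossing-point claim, using the 3-7 year lifespan figure from above and an assumed (not sourced) ~4 years to bring a new fab online:

    # Illustrative only: do deployed chips outlive the time needed to
    # bring a new fab online? Both numbers are rough assumptions.
    fab_rebuild_years = 4                  # assumed, not sourced
    for lifespan in range(3, 8):           # the 3-7 year range from above
        outcome = ("survives until new supply" if lifespan >= fab_rebuild_years
                   else "fails before new supply")
        print(f"chip lifespan {lifespan}y vs fab rebuild {fab_rebuild_years}y: {outcome}")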
I recently bought a new MacBook, my previous one having lasted me for over 10 years. The big thing that pushed me to finally upgrade wasn’t hardware (which as far as I could tell had no major issues), it was the fact that it couldn’t run latest macOS, and software support for the old version it could run was increasingly going away.
The battery and keyboard had been replaced, but (AFAIK) the logic board was still the original.
Which is very annoying, as none of the newer OS versions has anything that warrants dumping working hardware and buying brand new to run them! The exception is security updates, which I find it dubious for a company to stop producing (they need to create them for their newer OS versions anyway, so the marginal cost of maintaining security patches ought not to be much, if anything). It's more likely a dark pattern to force hardware upgrades.
Also, he said that third-party software doesn't support the older OS either, so even if Apple did provide security updates, he would still be in the same place.
That statement absolutely needs a source. Is "usage" 100% load 24/7? What is the failure rate after 7 years? Are the failures unrepairable, i.e. not just a broken fan?
https://www.livescience.com/technology/electronics/what-is-m...
Since then, there have been some adjustments, but it still holds as a prediction of a general trend, since, as noted in that article:
>One reason for the success of Moore’s prediction is that it became a guide — almost a target — for chip designers.
but as noted:
>The days when we could double the number of transistors on a chip every two years are far behind us. However, Moore’s Law has acted as a pacesetter in a decades-long race to create chips that perform more complicated tasks quicker, especially as our expectations for continual progress continue.
jama211•4h ago
As interesting as this breakdown of the current state of things is, it neither tells us much we didn't already know nor predicts much about the future, and that's what I most wanted to hear from an expert article on the subject, even if we'd take it with a large pinch of salt.
WillAdams•18m ago
>software is getting slower more rapidly than hardware is becoming faster.
>Wirth attributed the saying to Martin Reiser, who in the preface to his book on the Oberon System wrote: "The hope is that the progress in hardware will cure all software ills. However, a critical observer may observe that software manages to outgrow hardware in size and sluggishness."
I wish there were more instances of developments like Mac OS X 10.6, where, rather than new features, the software was simply optimized for a given CPU architecture and the focus was on improving performance.