So if I'm streaming a movie, it could be that the video is actually literally visible inside the datacenter?
Photonics has definitely proved itself in communications and linear computing, but it still has a way to go in terms of general (nonlinear) compute.
I wonder if metamaterials might provide such nonlinearities in the future.
That's a big part of it. I remember in the early Pentium 4 days starting to see a lot more visible 'squiggles' on PCB traces on motherboards; the squiggles are essentially a case of 'these lines need more length to be about as long as the other lines so they don't skew timing'.
In the case of what the article is describing, I'm imagining a sort of 'harness cable' that has a connector on each end for all the fibers. If the fibers in the cable itself are all the same length, there wouldn't be a skew timing issue. (Instead, you worry about bend radius limitations.)
> Would the SerDes be the new bottleneck in the approach
I'd think yes, but at the same time in my head I can't really decide whether it's a harder problem than normal mux/demux.
Things get interesting if the losses are high and there needs to be a DFE (decision-feedback equalizer). This limits speed a lot, but then copper solutions moved to sending multi-bit symbols (PAM-3, 4, 5, 6, 8, 16, ...), which can also be done in the optical domain. One can even send multiple wavelengths in the optical domain, so there are ways to boost the bit rate without requiring high clock frequencies.
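As a rough illustration of the symbol-rate math (illustrative numbers only, not necessarily what any given link uses):

    import math

    def baud_rate_gbd(bit_rate_gbps, pam_levels):
        # Each PAM-N symbol carries log2(N) bits, so the required symbol rate drops accordingly.
        return bit_rate_gbps / math.log2(pam_levels)

    print(baud_rate_gbd(10, 2))  # 10.0 GBd with simple on-off keying (PAM-2)
    print(baud_rate_gbd(10, 4))  #  5.0 GBd with PAM-4 -> same 10 Gb/s at half the symbol rate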
Semi-accurate. For example, PCIe remains dominant in computing. PCIe is technically a serial protocol, as new versions of PCIe (7.0 is releasing soon) increase the serial transmission rate. However, PCIe also scales in parallel based on performance needs through "lanes", where one lane is a total of four wires arranged as two differential pairs: one pair for receiving (RX) and one for transmitting (TX).
PCIe scales up to 16 lanes, so a PCIe x16 interface will have 64 wires forming 32 differential pairs. When routing PCIe traces, the lengths of all differential pairs must be within about 100 mils of each other (I believe; it's been about 10 years since I last read the spec). That's to address the "timing skew between lanes" you mention, and DRCs in the PCB design software will ensure the trace length skew requirement is respected.
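Back-of-the-envelope for what a 100 mil mismatch means in time, assuming a typical FR-4 propagation delay of about 170 ps/inch (the exact figure depends on the stackup):

    PROP_DELAY_PS_PER_INCH = 170.0      # typical FR-4 stripline figure (assumption)
    mismatch_mils = 100.0               # allowed length mismatch between pairs
    skew_ps = (mismatch_mils / 1000.0) * PROP_DELAY_PS_PER_INCH
    print(f"{skew_ps:.0f} ps of lane-to-lane skew")  # ~17 ps, vs. a 62.5 ps unit interval at 16 GT/s (PCIe 4.0)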
> how can this be addressed in this massively parallel optical interface?
From a hardware perspective, reserve a few "pixels" of the article's microLED transmitter array for link control rather than data transfer. Examples might be a clock or a data-frame synchronization signal. From the software side, design a communication protocol which negotiates a stable connection between the endpoints and incorporates checksums.
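A toy sketch of that software side (entirely hypothetical framing, just to make the idea concrete): a frame with a sync word and a CRC that lets the receiver detect a broken or misaligned lane.

    import struct, zlib

    SYNC = b"\xAA\x55\xAA\x55"  # hypothetical frame-sync pattern

    def make_frame(payload):
        # sync word + payload length + payload + CRC32 over the payload
        return SYNC + struct.pack(">I", len(payload)) + payload + struct.pack(">I", zlib.crc32(payload))

    def parse_frame(frame):
        if frame[:4] != SYNC:
            raise ValueError("lost frame synchronization")
        (length,) = struct.unpack(">I", frame[4:8])
        payload = frame[8:8 + length]
        (crc,) = struct.unpack(">I", frame[8 + length:12 + length])
        if crc != zlib.crc32(payload):
            raise ValueError("checksum mismatch")
        return payload

    print(parse_frame(make_frame(b"hello link")))  # b'hello link'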
Abstractly, the serial vs. parallel dynamic shifts as technology advances. Raising clock rates to shove more data down the line faster (serial improvement) works to a point, but you'll eventually hit the limits of your current technology. Still need more bandwidth? Just add more lines to meet your needs (parallel improvement). Eventually the technology improves, and the dynamic continues. A perfect example of that is PCIe.
Also, transmitting 10 Gb/s with an LED seems challenging. The spectral bandwidth (linewidth) of an incoherent LED is large, so are they doing significant DSP (which costs money and energy and introduces latency), or are they restricting themselves to very short (tens of meters) links?
Sure, yes, optical might use expensive longer-range optics today! But using that framing to assess new technologies and what help they could be may be folly.
The cable is just a 2D parallel optical bus. With a bundle like this, you can wrap it in a nice, thick PVC (or whatever) jacket and employ a small, square connector that matches the physical scheme of the 2D planar microLED array.
It's a brute-force, simple-minded approach enabled by high-speed, low-cost microLED arrays. Pretty cool, I think.
The ribbon concept could be applicable to PCBs though.
What I'm getting at is that I don't see any advantage over VCSEL arrays. I'm not convinced that the price point is that different.
The caption of the image of the cable and connector reads: "CMOS ASIC with microLEDs sending data with blue light into a fiberbundle." So yes, fibre bundles.
> I don't see any advantage over vcsel arrays
They claim the following advantages:
1. Low energy use
2. Low "computational overhead"
3. Scalability
All of these at least pass the smell test. LEDs are indeed quite efficient relative to lasers. They cite about an order of magnitude "pJ/bit" advantage for the system over laser-based optics, and I presume that comparison includes VCSELs. When you're trying to wheedle nuclear reactor restarts to run your enormous AI clusters, saving power is nice. The system has a parallel "conductor" design that likely employs high-speed parallel CMOS latches, so the "computational overhead" claim could make sense: all you're doing is latching bits to/from PCB traces or IC pins, so all the SerDes and multiplexing cost is gone. They claim that it can easily be scaled to more pixels/lines. Sure, I guess: low power makes that easier. There you are. All pretty simple.
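To put the energy claim in perspective with some purely illustrative numbers (not from the article):

    def link_power_watts(pj_per_bit, throughput_tbps):
        # energy per bit (J) times bits per second gives watts
        return pj_per_bit * 1e-12 * throughput_tbps * 1e12

    # Hypothetical 10 Tb/s of off-chip interconnect per accelerator:
    print(link_power_watts(5.0, 10))   # 50.0 W at 5 pJ/bit (laser-optics ballpark)
    print(link_power_watts(0.5, 10))   #  5.0 W at 0.5 pJ/bit -- an order of magnitude less

Multiply that across thousands of accelerators and the difference is the kind of power budget that currently has people asking for reactor restarts.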
I think there is a use case for this outside data centers. We're at the point where copper transmission lines are a real problem for consumers. Fiber can solve the signal integrity problem for such use cases; however--despite several famous runs at it (Thunderbolt, FireWire)--the cost has always precluded widespread adoption outside niche, professional, or high-end applications. Maybe LED-based optics can make fiber cost-competitive with copper for such applications: one imagines a very small, very low power microLED-based transceiver costing only slightly more than a USB connector on each end of such a cable with maybe 4-8 parallel fibers. Just spit-balling here.
And given the talk about this as a CPO alternative, I was assuming this was for back plane and connections of a few metres, not components on the same PCB.
Indeed they do. I overlooked that.
I know little about microLED arrays and their reliability, so I won't guess about how credible this is: LED reliability has a lot of factors. The cables involved will probably be less reliable than conventional laser fiber optics due to the much larger number of fibers that have to be precision assembled. Likely to be more fragile as well.
On-site fabricating or repairing such cables likely isn't feasible.
Yes. I've replaced my share of dead transceivers, and I suspect the laser drivers were the failure mode of most of them.
That doesn't fill in the blank for me though: how reliable are high speed, dense microLEDs?
This new TSMC work with parallel incoherent optics is altogether distinct. No DSP. No SerDes. Apples and oranges.
And I'm not sure how much of this is actually TSMC's work; the title is misleading.
Edit: actually, they are working on the detector side.
See my other comment about non-datacenter applications. There is a serious opportunity here for fixing signal integrity problems with contemporary high bandwidth peripherals. Copper USB et al. are no good and in desperate need of a better medium.
On the distance - exactly right. The real bottleneck now in AI clusters is the interconnect within a rack, or sub-10 m. So that is the market we are addressing.
On your second point - exactly! Normally people think LEDs are slow and suck. That is the real innovation. At Avicena, we've figured out how to make LEDs blink on and off at 10Gb/s. This is really surprising and amazing! So with simple on-off modulation, there is no DSP or excess energy use. The article says TSMC is developing arrays of detectors, based on their camera process, that also receive signals at 10Gb/s. Turns out this is pretty easy for a camera with a small number of pixels (~1000). We use blue light, which is easily absorbed in silicon. BTW, feel free to reach out to Avicena, and happy to answer questions.
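Simple arithmetic on the figures above (assuming roughly 1000 pixels at 10 Gb/s per pixel; the exact array size is an assumption here):

    pixels = 1000           # ballpark emitter/detector array size mentioned above
    per_pixel_gbps = 10     # per-LED on-off data rate mentioned above
    print(pixels * per_pixel_gbps / 1000, "Tb/s aggregate per bundle")  # 10.0 Tb/s, with no SerDes or DSP in the path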
The claimed advantage is a very high aggregate throughput and much less energy per bit than with either copper links or traditional laser-based optical links.
For greater distances, lasers cannot be replaced by anything else.
https://www.nature.com/articles/s41566-020-00754-y
https://www.nature.com/articles/s44172-022-00024-5
As far as I understood, you can only compute quite small neural networks before the noise gets too large relative to the signal, and also only a very limited set of computations works well in photonics.
https://arxiv.org/abs/2208.01623
The real issue is trying to backpropagate those nonlinear optics. You need a second nonlinear optical component that matches the derivative of the first nonlinear optical component. In the paper above, they approximate the derivative by slightly changing the parameters, but that means the training time scales linearly with the number of parameters in each layer.
Note: the authors claim it takes O(sqrt(N)) time, but they're forgetting that the learning rate mu = o(1/sqrt(N)) if you want to converge to a minimum:

    Loss(theta + dtheta) = Loss(theta) + dtheta . dLoss(theta) + O(|dtheta|^2)
                         = Loss(theta) + mu * sqrt(N) * C    (assuming Lipschitz continuity)
    => min(Loss) = mu * sqrt(N) * C / 2
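Going back to the finite-difference point above, here is a minimal sketch of why that training approach scales linearly with parameter count, assuming a generic black-box loss we can't backpropagate through (a stand-in for an optical forward pass): each gradient estimate costs one extra loss evaluation per parameter.

    import numpy as np

    def fd_gradient(loss, theta, eps=1e-4):
        # One extra forward pass per parameter -> cost grows linearly with theta.size.
        base = loss(theta)
        grad = np.zeros_like(theta)
        for i in range(theta.size):
            shifted = theta.copy()
            shifted[i] += eps
            grad[i] = (loss(shifted) - base) / eps
        return grad

    loss = lambda t: float(np.sum(t ** 2))   # toy stand-in for the optical system's loss
    print(fd_gradient(loss, np.ones(8)))     # ~[2, 2, ..., 2]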
With quantum computing, one is forced to use lasers. Basically, we can't transmit quantum information with the classical light from LEDs (hand-wavingly: LEDs emit a distribution of possible photon numbers, not single photons, so you lose control at the quantum level). Moreover, we often also need the narrow linewidth of lasers so that we can interact with atoms in exactly the way we want, that is, without exciting unwanted atomic energy levels. So you see people in trapped-ion quantum computing tripping over themselves to realise integrated laser optics, through fancy engineering that I don't fully understand, like diffraction gratings within the chip that diffract light onto the ions. It's an absolutely crucial challenge to overcome if you want to make trapped-ion quantum computers with more than several tens of ions.
Networking multiple computers via said optical interconnects is an alternative, and also similarly difficult.
What insight do I glean from this IEEE article, then? I believe if this approach with the LEDs works out for this use case, then I'd see it as a partial admission of failure for laser-integrated optics at scale. It is, after all, the claim in the article that integrating lasers is too difficult. And then I'd expect to see quantum computing struggle severely to overcome this problem. It's still research at this stage, so let's see if Nature's cards fall fortuitously.
However, it’s not correct to say lasers are unreliable. That claim is fundamentally false and isn’t supported by field data from today’s pluggable modules. Tens of millions of lasers are deployed in data centers today in pluggable modules.
It’s also useful to remember that an LED is essentially the gain region of a laser without the reflectors. When lasers fail in the field, they fail for the same reasons an LED will fail; moisture or contamination penetration of the semiconductor material.
An LED is not useful for quantum computing. To create a Bell pair (2 qubits) you need a coherent light source to create correlated photons. The photons produced by an incoherent light source like an LED are fundamentally uncorrelated.
But even more than that, this seems to me like a purely on-chip solution. For trapped ions and neutral atoms you really need to translate to free-space optics at some point.
As for fully integrated optics, it's where quantum computers eventually want to be, and there are no known physical limitations currently. But perhaps it's too early to say whether we would absolutely require free-space optics because some optical operation turns out to be impossible any other way.
https://www.businesswire.com/news/home/20250422988144/en/Avi...
Noting also that there have been multiple articles on IEEE Spectrum about this startup in the past, I really hope the journalists don't own the stock and aren't otherwise biased.
presumably also because photons at wavelengths we can work with are BIG
(Strong emphasis on the looseness of the scare quotes.)
[0] - Mind you, some of that for coax is due to other issues around CTB and/or the challenge that in coax you've got many frequencies running alongside one another, each with a different attenuation per 100 feet...
Actually this is true for fibers as well. In DWDM (all internet links are DWDM, including fiber-to-the-home in most places) you have many frequencies running alongside one another, and each frequency has a different attenuation (though generally measured per kilometer, not per 100 feet).
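For scale, some ballpark attenuation figures for the two media (rough numbers, not from the thread):

    # Rough attenuation comparison; real values depend heavily on cable type and frequency.
    fiber_db_per_km = 0.2                      # standard single-mode fiber near 1550 nm
    coax_db_per_100ft = 6.0                    # ballpark for RG-6 coax around 1 GHz
    coax_db_per_km = coax_db_per_100ft * 1000 / 30.48
    print(fiber_db_per_km, "dB/km over fiber")
    print(round(coax_db_per_km, 1), "dB/km over coax")   # ~196.9 dB/km -- orders of magnitude worse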
Light is made of electromagnetic waves that pass through each other without disrupting one another. Electrical signals in a conductor aren't like that: they affect each other.
The difference can be put like this: how many X (electrons, protons, essentially any matter particle) fit on the tip of a needle (or in a cable)?
1) Electrons carrying an electrical signal? Some finite number. It can be large, of course, but...
2) Photons (i.e. fiber signals)? ALL OF THEM. Literally every photon that exists in the entire universe would happily join every other photon on the tip of a needle, and nothing would interfere with anything else.
SNR is obviously an issue for any communication system; however, fiber attenuation is orders of magnitude lower than that of coax.
The bigger issue in this case would be modal dispersion, considering that they are going through "imaging" fibres, i.e. different spatial components of the light walk off from each other, causing temporal spreading of the pulses until they overlap and you can't distinguish 1s and 0s.
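Rough numbers for that effect, assuming a step-index multimode fiber with illustrative parameters (imaging bundles and graded-index fiber will differ):

    # Crude intermodal delay-spread estimate for a step-index multimode fiber.
    c = 3e8           # speed of light, m/s
    n1 = 1.48         # core refractive index (assumption)
    delta = 0.01      # fractional index step (n1 - n2) / n1 (assumption)

    spread_s_per_m = n1 * delta / c            # ~49 ps of modal spread per metre
    bit_period_s = 1 / 10e9                    # 100 ps per bit at 10 Gb/s on-off keying

    print(spread_s_per_m * 1e12, "ps/m of modal spread")
    print(bit_period_s / spread_s_per_m, "m before the spread equals one bit period")

Which is at least consistent with the very short reach being discussed for these links.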
That said, all of that is irrelevant to what the previous poster said about vibration-induced phase variation as an impairment. That's just not an issue; vibrations are way too slow to impair optical comms signals.