That is a nice speed-up compared to generic hardware, but everyone probably wants to know how much slower it is than performing the same operations on plaintext data. I'm sure a 50% penalty is acceptable; 95% probably is not.
This hardware won’t make the technique attractive for ALL computation. But, it could dramatically increase the range of applications.
That rules out anything latency-sensitive, but for batch workloads like aggregating encrypted medical records or running simple ML inference on private data it starts to become practical. The real unlock is not raw speed parity but getting FHE fast enough that you can justify the privacy tradeoff for specific regulated workloads.
There are two non-exclusive paths I'm thinking of at the moment:
1. DRM: Might this enable a next level of DRM?
2. Hardware attestation: Might this enable a deeper level of hardware attestation?
It's not related to DRM or trusted computing.
A: "Intel/AMD is adding instructions to accelerate AES"
B: "Might this enable a next level of DRM? Might this enable a deeper level of hardware attestation?"
A: "wtf are you talking about? It's just instructions to make certain types of computations faster, it has nothing to do with DRM or hardware attestation."
B: "Not yet."
I'm sure in some way it probably helps DRM or hardware attestation to some extent, but not any more than say, 3nm process node helps DRM or hardware attestation by making it faster.
Same here.
Can't wait to KYC myself in order to use a CPU.
We are no longer their clients; we are just another product to sell. So they don't design chips for us, but for the benefit of other corporations.
3. Unskippable ads with data gathering at the CPU level.
I think eGovernment is the main use case: not super high traffic (we're not voting every day), but very high privacy expectations.
[1] https://confer.to/blog/2025/12/confessions-to-a-data-lake/
If computation can happen directly on encrypted data, does that reduce the need for trusted environments like SGX/TEE, or does it mostly complement them?
esseph•1h ago
If you need to trust the encryption and trust the hardware itself, it may not be suitable for your environment/threat model.
gruez•45m ago
Are we reading the same article? It's talking about homomorphic encryption, i.e. doing mathematical operations on already-encrypted data, without being aware of its cleartext contents. It's not related to SGX or other trusted computing technologies.
u1hcw9nx•40m ago
First you encrypt the data. Then you send it to the hardware to compute, get the result back, and decrypt it.
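The shape of that workflow can be sketched with a toy example. Textbook (unpadded) RSA happens to be multiplicatively homomorphic: E(a) * E(b) mod n = E(a * b), so a server can multiply ciphertexts it cannot read. This is only an illustration of the protocol shape, not real FHE (schemes like BGV/CKKS/TFHE support much richer circuits) and certainly not secure as written:

```python
# Client side: textbook RSA key generation (tiny primes, demo only).
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+)

def encrypt(m):
    # Client encrypts before sending data out.
    return pow(m, e, n)

def decrypt(c):
    # Client decrypts the result that comes back.
    return pow(c, d, n)

def server_multiply(c1, c2):
    # Server side: multiplies ciphertexts without seeing plaintexts,
    # since (a^e * b^e) mod n == (a*b)^e mod n.
    return (c1 * c2) % n

c = server_multiply(encrypt(6), encrypt(7))
print(decrypt(c))                   # 42 -- the server never saw 6 or 7
```

The same three-step pattern (encrypt locally, compute remotely on ciphertext, decrypt locally) is what the accelerator speeds up, just with vastly more expensive lattice-based operations instead of one modular multiply.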