As for the matter at hand: which specification, exactly, is such an implementation validated against?
Glad to be educated in my cluelessness.
That aside: to make a full implementation of an Apple product, a specification for that product must exist in some form.
With that in mind, does this scheme offer any advantage over the much simpler setup of a user sending an inference request:
- directly to an inference provider (no API router middleman)
- that accepts anonymous crypto payments (I believe such things exist)
- using a VPN to mask their IP?
Folks may underestimate the difficulty of providing compute whose contents the provider “cannot”* reveal, even at gunpoint.
BYOK covers most of it, but then: oh look, you brought your key to me and my code, thanks… Apple's approach, and systems such as AWS's Nitro Enclaves, aim at this last step of the problem (a rough client-side sketch follows the links below):
- https://security.apple.com/documentation/private-cloud-compu...
- https://aws.amazon.com/confidential-computing/
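To make that last step concrete, here's a rough sketch of the general enclave pattern from the client's side. The attestation-document fields and helper names are my assumptions, not Apple's or AWS's actual API: the client checks the attested code measurement against a value published out of band, and only then wraps its data to a key bound to that attestation, so "me and my code" never see the plaintext.

  # Rough sketch only (Python with the `cryptography` package); the attestation
  # document format and field names here are assumptions, not a real vendor API.
  import os
  from cryptography.hazmat.primitives import hashes, serialization
  from cryptography.hazmat.primitives.asymmetric import x25519
  from cryptography.hazmat.primitives.ciphers.aead import AESGCM
  from cryptography.hazmat.primitives.kdf.hkdf import HKDF

  PINNED_MEASUREMENT = "<expected-code-hash>"  # known-good value, published out of band

  def verify_attestation(doc: dict) -> bytes:
      # A real verifier would also check the vendor certificate chain and the
      # signature over the document; here we only pin the code measurement.
      if doc["measurement"] != PINNED_MEASUREMENT:
          raise ValueError("enclave is not running the expected code")
      return bytes.fromhex(doc["enclave_pubkey"])

  def seal_for_enclave(doc: dict, plaintext: bytes) -> dict:
      # Encrypt to a key only the attested enclave can derive (ECDH + AEAD),
      # so the operator's ordinary infrastructure never sees the plaintext.
      enclave_pub = x25519.X25519PublicKey.from_public_bytes(verify_attestation(doc))
      eph = x25519.X25519PrivateKey.generate()
      shared = eph.exchange(enclave_pub)
      key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                 info=b"prompt-sealing").derive(shared)
      nonce = os.urandom(12)
      return {
          "eph_pub": eph.public_key().public_bytes(
              serialization.Encoding.Raw, serialization.PublicFormat.Raw).hex(),
          "nonce": nonce.hex(),
          "ciphertext": AESGCM(key).encrypt(nonce, plaintext, None).hex(),
      }

The point is that the trust decision happens before any secret leaves the client, which is exactly the gap plain BYOK leaves open.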
NCC Group verified AWS's approach and found:
1. There is no mechanism for a cloud service provider employee to log in to the underlying host.
2. No administrative API can access customer content on the underlying host.
3. There is no mechanism for a cloud service provider employee to access customer content stored on instance storage and encrypted EBS volumes.
4. There is no mechanism for a cloud service provider employee to access encrypted data transmitted over the network.
5. Access to administrative APIs always requires authentication and authorization.
6. Access to administrative APIs is always logged.
7. Hosts can only run tested and signed software that is deployed by an authenticated and authorized deployment service. No cloud service provider employee can deploy code directly onto hosts.
- https://aws.amazon.com/blogs/compute/aws-nitro-system-gets-i...
Points 1 and 2 are more unusual than 3 - 7.
Folks who enjoy taking things apart to understand them can hack at Apple's here:
https://security.apple.com/blog/pcc-security-research/
* Except by, say, withdrawing the system (see Apple in the UK) so users have to fall back to something less secure, observably changing the system, or otherwise tripping the transparency measures.
It's even harder to do this while also meeting a hard requirement to give the NSA access.
Or alternatively, give the user a verifiable guarantee that nobody has access.
Service: https://www.privatemode.ai/ Code: https://github.com/edgelesssys/privatemode-public
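For what it's worth, a minimal sketch of what "verifiable" can mean on the client side (my assumption of the shape, not Privatemode's actual protocol): only talk to backends whose attested measurement appears in a signed, publicly auditable manifest, and record what you saw so silent changes become observable.

  # Sketch only; the manifest format and field names are assumptions.
  import json, time

  def load_allowed_measurements(manifest_json: str) -> set:
      # In practice the manifest's signature would be checked against a vendor
      # key and/or a transparency log before its contents are trusted.
      return set(json.loads(manifest_json)["allowed_measurements"])

  def check_backend(attested_measurement: str, manifest_json: str, audit_log: list) -> bool:
      allowed = load_allowed_measurements(manifest_json)
      accepted = attested_measurement in allowed
      # Keeping a local record of every measurement seen makes a quiet swap of
      # the backend code observable after the fact.
      audit_log.append({"ts": time.time(),
                        "measurement": attested_measurement,
                        "accepted": accepted})
      return accepted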
kiwicopple•21h ago
We are working on a challenge that is somewhat like a homomorphic encryption problem, and I'm wondering if OpenPCC could help in some way:
When developing websites/apps, developers generally use logs to debug production issues. With wearables, however, logs can be a privacy issue: imagine AR glasses logging visual data (like someone's face). Would OpenPCC help extract/clean/anonymize this sort of data so developers can still debug?
jmort•20h ago
If it's possible to anonymize on the wearable, that would be simpler.
The challenge is: what does the anonymizer have to "do" to be perfect? (A rough sketch of one approach is below.)
As an aside, IMO homomorphic encryption (still) isn't ready...
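To make "anonymize on the wearable" concrete, here is a rough sketch of a heuristic on-device scrubber. This is my own guess at the shape of such a pipeline, not OpenPCC's API: drop raw sensor payloads, pseudonymize stable IDs with a per-device secret, and redact obvious identifiers in free text before a record leaves the device. Each of these steps is heuristic, which is why "perfect" is the hard part.

  # Sketch only; field names and the per-device secret are assumptions.
  import hashlib, hmac, re

  DEVICE_SALT = b"per-device-secret"   # hypothetical; provisioned per device
  EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

  def pseudonymize(value: str) -> str:
      # Keyed hash: IDs stay linkable for debugging but aren't reversible off-device.
      return hmac.new(DEVICE_SALT, value.encode(), hashlib.sha256).hexdigest()[:16]

  def scrub(record: dict) -> dict:
      out = {}
      for key, val in record.items():
          if key in {"frame", "image", "audio"}:       # never ship raw sensor data
              out[key] = "<dropped %d bytes>" % len(val)
          elif key in {"user_id", "device_id"}:        # stable IDs -> pseudonyms
              out[key] = pseudonymize(str(val))
          elif isinstance(val, str):
              out[key] = EMAIL_RE.sub("<email>", val)  # obvious PII in free text
          else:
              out[key] = val
      return out

  # Example: timings and error codes survive, the face crop does not.
  print(scrub({"ts": 1700000000, "err": "decode_fail", "frame": b"\x89PNG...",
               "user_id": "alice", "note": "reported by alice@example.com"}))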