It was a real moment with objects that Bishop Berkeley could have kicked.
Interestingly it wasn't the OCR that was the problem but the JBIG2 compression.
Seems like it.
> a photo scanner that makes the zero knowledge proofs
Presumably at some point the intention is to add other sensors to the camera e.g. for depth information.
[0]: https://authenticity.sony.net/camera/en-us/
[1]: https://petapixel.com/2023/10/26/leica-m11-p-review-as-authe...
https://spec.c2pa.org/specifications/specifications/2.2/inde...
* https://petapixel.com/2010/12/01/russian-software-firm-break...
* https://www.elcomsoft.com/presentations/Forging_Canon_Origin...
Other than that, it's a 16MP Sony CMOS; I'd expect a pretty noisy picture...
How do I get my photos off the camera?
Coming soon. We're working on export functionality to get your photos off the camera.
It would be more interesting if the software were open source.
https://sfconservancy.org/blog/2021/mar/25/install-gplv2/
https://sfconservancy.org/blog/2021/jul/23/tivoization-and-t...
https://events19.linuxfoundation.org/wp-content/uploads/2017...
> What are the camera's specs?
> The camera has a 16MP resolution, 4656 x 3496 pixels. It uses a Sony IMX519 CMOS sensor.
This attitude really rubs me the wrong way, especially on a site called Hacker News.
I think we absolutely should be supporting projects like this (if you think they're worth supporting), else all we're left with is giant corporation monoculture. Hardware startups are incredibly difficult, and by their nature new hardware products from small companies will always cost more than products produced by huge companies that have economies of scale and can afford billions of losses on new products.
So yes, I'm all for people taking risks with new hardware, and even if it doesn't have the most polished design, if it's doing something new and interesting I think it's kinda shitty to just dismiss it as looking like "a 3D printed toy with some cute software".
It’s possible that this could have value in journalism or law enforcement.
Just make it look the part: make it black and put a decent lens on it.
I guess you could have a unique signing key per camera and blacklist known leaked keys.
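For illustration, here's roughly what that could look like in Python with the `cryptography` package (the key IDs and the revocation-list format are made up, not anything this camera actually ships):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Hypothetical list of key IDs known to have leaked.
REVOKED_KEY_IDS = {"camera-0042"}

def sign_photo(device_key: ed25519.Ed25519PrivateKey, pixels: bytes) -> bytes:
    # Each camera signs with its own unique key.
    return device_key.sign(pixels)

def verify_photo(key_id: str, pub: ed25519.Ed25519PublicKey,
                 pixels: bytes, sig: bytes) -> bool:
    if key_id in REVOKED_KEY_IDS:
        return False  # key leaked: signatures from it prove nothing
    try:
        pub.verify(sig, pixels)
        return True
    except InvalidSignature:
        return False
```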
They got cracked within a year or two. Not sure if they still offer the capability.
But I feel like the only way to accomplish fool-proof photos we can trust in a trustless way (i.e. without relying on e.g. the Press Association to vet) is to utterly PACK the hardware with sensors and tamper-proof attestation so the capture can’t be plausibly faked: multi-spectral (RGB + IR + UV) imaging, depth/LiDAR, stereo cameras, PRNU fingerprinting, IMU motion data, secure GPS with attested fix, a hardware clock and secure element for signing, ambient audio, lens telemetry, environmental sensors (temperature, barometer, humidity, light spectrum) — all wrapped in cryptographic proofs that bind these readings to the pixels.
In the meantime however, I'd trust a 360° GoPro with some kind of signature of manufacture. Or just a LOT of people taking photos in a given vicinity. Hard to fake that.
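A minimal sketch of that "bind the readings to the pixels" idea, assuming a secure element holding a per-device key (the field names are illustrative; a real design would look more like a C2PA manifest):

```python
import hashlib
import json

from cryptography.hazmat.primitives.asymmetric import ed25519

def build_signed_capture(device_key: ed25519.Ed25519PrivateKey,
                         pixels: bytes, sensors: dict) -> dict:
    # Hash the pixels and bundle them with every sensor reading, so no
    # reading can be swapped out after the fact without breaking the signature.
    manifest = {
        "pixel_sha256": hashlib.sha256(pixels).hexdigest(),
        "sensors": sensors,  # e.g. {"depth": ..., "imu": ..., "gps_fix": ...}
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    return {"manifest": manifest, "signature": device_key.sign(payload).hex()}
```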
Before long, it might be somewhat "easy" to prove anything.
It's not feasible or desirable for our hardware devices to verify the information they record autonomously. A real solution to the problem of attribution in the age of AI must be based on reputation. People should be able to vouch for information in verifiable ways with consequences for being untrustworthy.
The problem is quality takes time, and therefore loses relevance.
We need a way to break people out of their own human nature and reward delayed gratification by teaching critical thinking skills and promoting thoughtfulness.
I sadly don't see an exciting technological solution here. If anything it's tweaks to the funding models that control the interests of businesses like Instagram, Reddit, etc.
Attestation systems are not inherently in conflict with repurposability. If they let you install user firmware, the device simply won't produce attestations linked to their signed builds, assuming you retain any of that functionality at all. If you want attestations under their key instead of yours, you just reinstall their signed OS, the HSM boot attests to whoever's OS signature it finds using its unique hardware key, and everything works fine (even in a dual-boot scenario).
What this does do is prevent you from altering their integrity-attested operating system to misrepresent that photos were taken by their operating system. You can, technically, mod it all you want — you just won’t have their signature on the attestation, because you had to sign it with some sort of key to boot it, and certainly that won’t be theirs.
They could even release their source code under BSD, GPL, or AGPL and it would make no difference to any of this; no open source license compels producing the crypto private keys you signed your build with, and any such argument for that applying to a license would be radioactive for it. Can you imagine trying to explain to your Legal team that you can’t extract a private key from an HSM to comply with the license? So it’s never going to happen: open source is about releasing code, not about letting you pass off your own work as someone else’s.
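To make the boot-attestation argument above concrete, a rough sketch (the attestation layout and VENDOR_BUILD_HASHES are my assumptions, not any vendor's actual scheme):

```python
import hashlib

from cryptography.hazmat.primitives.asymmetric import ed25519

# Hashes of OS images the vendor has signed; illustrative placeholder.
VENDOR_BUILD_HASHES = {"<sha256 of a vendor-signed OS build>"}

def attest_boot(hw_key: ed25519.Ed25519PrivateKey, os_image: bytes):
    # The hardware key signs whatever OS hash it booted -- vendor's or yours.
    os_hash = hashlib.sha256(os_image).hexdigest()
    return os_hash, hw_key.sign(os_hash.encode())

def is_vendor_build(os_hash: str) -> bool:
    # A modded OS still boots and still gets attested; it just won't match
    # any vendor-signed hash, so verifiers can tell the two apart.
    return os_hash in VENDOR_BUILD_HASHES
```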
> must be based on reputation
But it is already. By example:
Is this vendor trusted in a court of law? Probably, I would imagine, it would stand up to the court’s inspection; given their motivations they no doubt have an excellent paper trail.
Are your personal attestations, those generated by your modded camera, trusted by a court of law? Well, that’s an interesting question: Did you create a fully reproducible build pipeline so that the court can inspect your customizations and decide whether to trust them? Did you keep record of your changes and the signatures of your build? Are you willing to provide your source code and build process to the court?
So, your desire for reputation is already satisfied, assuming that they allow OS modding. If they do not, that's a voluntary business decision, not a mandatory technical one! Nothing in cryptography or reputation justifies theoretical plans that lock users out of repurposing their device.
Like, how is this any different from having each camera equipped with a vendor-controlled key and then having it sign every photo?
If you can spoof the sensor enough to reuse the key, couldn't you spoof the sensor enough to fool a verifier into believing your false proof?
The only real solution I can think of is just to have multiple independent parties photograph the same event and use social trust. Luckily this solution is getting easier now that almost everyone is generally no further than 3 feet away from multiple cameras.
Both cameras still allow "staging" a scene and taking a shot of that. Both will say that the scene was shot in the physical world, but that's it.
I would argue that slide film is more "verifiable" in the ways that matter: it's easier to explain to laypeople how slide film works, and laypeople are the ones you need to convince.
If I were a film or camera manufacturer, I would try to go for this angle in marketing.
I think the point of this movement toward cryptographically signing image sensors is so people can confidently prove images on the internet are real with a single click, without having to get hold of the physical original and hire a forensic lab to analyze it.
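In other words, verification collapses to something like this sketch (assuming the maker's public key is published; the names are illustrative):

```python
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

def one_click_verify(pixels: bytes, sig: bytes,
                     maker_pub: ed25519.Ed25519PublicKey) -> bool:
    # All a verifier needs: the image bytes, the embedded signature,
    # and the maker's public key. No forensic lab, no physical original.
    try:
        maker_pub.verify(sig, hashlib.sha256(pixels).digest())
        return True
    except InvalidSignature:
        return False
```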
Not trolling. Genuinely don’t understand.
https://www.amazon.com/Camera-Digital-Toddler-Christmas-Birt...
This is one attempt.
How do you stop someone from taking a picture of an AI picture? It will still come from the sensor.
But a fixture that pairs a good enough screen with enough distance to make the photographed pixels imperceptible is likely only a medium hurdle for a motivated person.
You probably can't fully avoid it, but adding more sensors (e.g. depth) will make such a fixture quite a bit more expensive.
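As a toy example of how depth could raise the cost, the firmware might flag captures whose depth map is nearly planar (the threshold and method here are my assumptions, not this product's):

```python
import numpy as np

def looks_like_flat_screen(depth: np.ndarray, tol_m: float = 0.01) -> bool:
    # Fit a plane to the depth map; a photographed screen is nearly planar,
    # so a tiny residual spread is suspicious. Real scenes vary in depth.
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coeffs, *_ = np.linalg.lstsq(A, depth.ravel(), rcond=None)
    residual = depth.ravel() - A @ coeffs
    return residual.std() < tol_m
```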