Tariffs shouldn’t prevent buying stuff; you just have to, y’know, pay a tariff on import.
In this case, a Japanese-made camera will incur a 15% tariff.
It was a real moment with objects that Bishop Berkeley could have kicked.
Interestingly, it wasn't the OCR that was the problem but the JBIG2 compression.
Seems like it.
> a photo scanner that makes the zero knowledge proofs
Presumably at some point the intention is to add other sensors to the camera, e.g. for depth information.
Scanning an image would be much easier to dupe, though: scanners are basically controlled perspective/lighting environments, so scanning an actual Polaroid vs. an AI-generated polaroid printed on photo paper would be pretty indistinguishable, I think.
[0]: https://authenticity.sony.net/camera/en-us/
[1]: https://petapixel.com/2023/10/26/leica-m11-p-review-as-authe...
https://spec.c2pa.org/specifications/specifications/2.2/inde...
* https://petapixel.com/2010/12/01/russian-software-firm-break...
* https://www.elcomsoft.com/presentations/Forging_Canon_Origin...
Other than that, it's a 16MP Sony CMOS; I'd expect a pretty noisy picture...
> How do I get my photos off the camera?
> Coming soon. We're working on export functionality to get your photos off the camera.
It would be more interesting if the software were open source.
https://sfconservancy.org/blog/2021/mar/25/install-gplv2/
https://sfconservancy.org/blog/2021/jul/23/tivoization-and-t...
https://events19.linuxfoundation.org/wp-content/uploads/2017...
> What are the camera's specs?
> The camera has a 16MP resolution, 4656 x 3496 pixels. It uses a Sony IMX519 CMOS sensor.
In this case you get the signature, and it confirms the device and links to a tamper-proof snapshot of the code used to build its firmware.
This attitude really rubs me the wrong way, especially on a site called Hacker News.
I think we absolutely should be supporting projects like this (if you think they're worth supporting), else all we're left with is giant corporation monoculture. Hardware startups are incredibly difficult, and by their nature new hardware products from small companies will always cost more than products produced by huge companies that have economies of scale and can afford billions of losses on new products.
So yes, I'm all for people taking risks with new hardware, and even if it doesn't have the most polished design, if it's doing something new and interesting I think it's kinda shitty to just dismiss it as looking like "a 3D printed toy with some cute software".
I don't mean to disregard the technical feat, but I question the intent.
I support it but I recognize it is a 3D printed toy with some cute software... toys can be interesting too. Not everything needs to be a startup.
It’s possible that this could have value in journalism or law enforcement.
Just make it look the part. Make it black and put a decent lens on it.
I guess you could have a unique signing key per camera and blacklist known leaked keys.
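That scheme is simple enough to sketch. Here's a minimal, hypothetical example in Python using the cryptography library's Ed25519 primitives (the revocation set and every name here is invented for illustration):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # At the factory: each camera gets its own unique keypair.
    device_key = Ed25519PrivateKey.generate()
    device_pub = device_key.public_key()

    # In the field: the camera signs each photo it captures.
    photo = b"...raw image bytes..."
    signature = device_key.sign(photo)

    # Verifier: reject blacklisted cameras first, then check the signature.
    revoked = set()  # raw public keys of cameras whose keys have leaked
    pub_raw = device_pub.public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw,
    )
    if pub_raw in revoked:
        print("untrusted: device key is blacklisted")
    else:
        try:
            device_pub.verify(signature, photo)
            print("signature OK")
        except InvalidSignature:
            print("signature invalid")

The hard part, as the ElcomSoft links above show, is keeping device_key inside the hardware in the first place.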
They got cracked within a year or two. Not sure if they still offer the capability.
But I feel like the only way to accomplish fool-proof photos we can trust in a trustless way (i.e. without relying on e.g. the Press Association to vet) is to utterly PACK the hardware with sensors and tamper-proof attestation so the capture can’t be plausibly faked: multi-spectral (RGB + IR + UV) imaging, depth/LiDAR, stereo cameras, PRNU fingerprinting, IMU motion data, secure GPS with attested fix, a hardware clock and secure element for signing, ambient audio, lens telemetry, environmental sensors (temperature, barometer, humidity, light spectrum) — all wrapped in cryptographic proofs that bind these readings to the pixels.
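Mechanically, "binding these readings to the pixels" could mean hashing the frame, bundling the hash with the other sensor readings in a canonical encoding, and signing the whole bundle with the secure element. A toy sketch in Python (every field name and value here is invented):

    import hashlib
    import json

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    device_key = Ed25519PrivateKey.generate()  # stand-in for a secure-element key

    frame = b"...raw RGB/IR/UV sensor readout..."
    readings = {
        "frame_sha256": hashlib.sha256(frame).hexdigest(),
        "depth_map_sha256": hashlib.sha256(b"...lidar...").hexdigest(),
        "imu": {"accel": [0.01, -0.02, 9.81], "gyro": [0.0, 0.1, 0.0]},
        "gps_fix": {"lat": 51.5, "lon": -0.12, "attested": True},
        "clock_utc": "2025-01-01T12:00:00Z",
    }
    # Canonical JSON, so the verifier hashes exactly the bytes that were signed.
    payload = json.dumps(readings, sort_keys=True, separators=(",", ":")).encode()
    attestation = device_key.sign(payload)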
In the meantime, however, I'd trust a 360° GoPro with some kind of signature of manufacture. OR just a LOT of people taking photos in a given vicinity. Hard to fake that.
Before long, it might be somewhat "easy" to prove anything.
It's not feasible or desirable for our hardware devices to verify the information they record autonomously. A real solution to the problem of attribution in the age of AI must be based on reputation. People should be able to vouch for information in verifiable ways with consequences for being untrustworthy.
The problem is quality takes time, and therefore loses relevance.
We need a way to break people out of their own human nature and reward delayed gratification by teaching critical thinking skills and promoting thoughtfulness.
I sadly don't see an exciting technological solution here. If anything it's tweaks to the funding models that control the interests of businesses like Instagram, Reddit, etc.
Attestation systems are not inherently in conflict with repurposability. If they let you install user firmware, then the device simply won’t produce attestations linked to their signed builds, assuming you retain any of that functionality at all. If you want attestations to their key instead of yours, you just reinstall their signed OS, the HSM boot attests to whoever’s OS signature it finds using its unique hardware key, and everything works fine (even in a dual-boot scenario).
What this does do is prevent you from altering their integrity-attested operating system to misrepresent that photos were taken by their operating system. You can, technically, mod it all you want — you just won’t have their signature on the attestation, because you had to sign it with some sort of key to boot it, and certainly that won’t be theirs.
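To make that concrete, the boot flow might look something like this toy sketch (Python; the key handling and statement format are my invention, not anyone's actual design):

    import hashlib

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    hw_key = Ed25519PrivateKey.generate()  # unique per-device key, fused at the factory

    def boot_attest(os_image: bytes, os_signer_pub: bytes) -> bytes:
        # The attestation binds the measured image hash to whichever
        # key signed that image: the vendor's, or yours.
        statement = hashlib.sha256(os_image).digest() + os_signer_pub
        return hw_key.sign(statement)

    # A vendor-signed build yields an attestation naming the vendor's key;
    # a modded build yields the same format naming YOUR key, so nothing you
    # do can make your build pass for the vendor's signed OS.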
They could even release their source code under BSD, GPL, or AGPL and it would make no difference to any of this; no open source license compels producing the crypto private keys you signed your build with, and any such argument for that applying to a license would be radioactive for it. Can you imagine trying to explain to your Legal team that you can’t extract a private key from an HSM to comply with the license? So it’s never going to happen: open source is about releasing code, not about letting you pass off your own work as someone else’s.
> must be based on reputation
But it already is. For example:
Is this vendor trusted in a court of law? Probably: I would imagine it would stand up to the court’s inspection; given their motivations, they no doubt have an excellent paper trail.
Are your personal attestations, those generated by your modded camera, trusted by a court of law? Well, that’s an interesting question: Did you create a fully reproducible build pipeline so that the court can inspect your customizations and decide whether to trust them? Did you keep record of your changes and the signatures of your build? Are you willing to provide your source code and build process to the court?
So, your desire for reputation is already satisfied, assuming that they allow OS modding. If they do not, that’s a voluntary-business decision, not a mandatory-technical one! There is nothing justifiable by cryptography or reputation in any theoretical plans that lock users out of repurposing their device.
We do not need "proof". We lived without it, and we'll live without it again.
I grew up before broadband - we survived without photographing every moment, too. It was actually kind of nice. Social media is the real fluke of our era, not image generation.
And hypothetically if these cryptographic "non-AI really super serious real" verification systems do become in vogue, what happens if quantum supremacy beats crypto? What then?
You don't even need to beat all of crypto. Just beat the signing algorithm. I'm sure it's going to happen all the time with such systems, then none of the data can be "trusted" anyway.
I'm stretching a bit here, but this feels like "NFTs for life's moments". Designed just to appease the haters.
You aren't going to need this stuff. Life will continue.
If you’ve got a photo of a public figure, but it doesn’t match the records of where they were at that time, it’s now suspicious.
Like, how is this any different than having each camera equipped with a vendor controlled key and then having it sign every photo?
If you can spoof the sensor enough to reuse the key, couldn't you spoof the sensor enough to fool a verifier into believing your false proof?
The only real solution I can think of is just to have multiple independent parties photograph the same event and use social trust. Luckily this solution is getting easier now that almost everyone is generally no further than 3 feet away from multiple cameras.
I was trying to take a picture of a gecko the other day, and it missed half of the event while the app was loading.
Both cameras still allow “staging” a scene and taking a shot of that. Both cameras will say that the scene was shot in the physical world, but that’s it.
I would argue that slide film is more “verifiable” in the ways that matter: it’s easier to explain to laypeople how slide film works, and it’s laypeople you need to convince.
If I were a film or camera manufacturer, I would go for this angle in marketing.
I think the point of this movement toward cryptographically signing image sensors is so people can confidently prove images are real on the internet with a single click, without having to get hold of the physical original and hire a forensic lab to analyze it.
Not trolling. Genuinely don’t understand.
https://www.amazon.com/Camera-Digital-Toddler-Christmas-Birt...
This is one attempt.
That's it. That's the verification?
So what happens when I use a Raspberry Pi to attach a ZK proof to an AI-generated image?
The light sensor must have a key built into the hardware at the factory, and that sensor must attest that it hasn't detected any tampering, and that attestation gets folded into the final signature.
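As a toy sketch of that chain (keys, flags, and formats all invented for illustration):

    import hashlib

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    sensor_key = Ed25519PrivateKey.generate()  # burned into the sensor die
    device_key = Ed25519PrivateKey.generate()  # main processor / secure element

    frame = b"...raw sensor readout..."
    frame_hash = hashlib.sha256(frame).digest()
    tamper_flag = b"\x00"  # sensor reports: no tampering detected

    # The sensor attests to (tamper status || frame hash)...
    sensor_att = sensor_key.sign(tamper_flag + frame_hash)

    # ...and that attestation is folded into the final signature, so the photo
    # can't verify unless the sensor's own statement verifies too.
    final_sig = device_key.sign(sensor_att + frame_hash)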
We must petition God to start signing photons, and the camera sensor must also incorporate the signature of every photon input to it, and verify each photon was signed by God's private key.
God isn't currently signing photons, but if he could be convinced to it would make this problem a lot easier so I'm sure he'll listen to reason soon.
The real issue that photographers grapple with, emotionally and financially, is that pictures have become so thoroughly commodified that nobody assigns them cultural value anymore. They are the thumbnail you see before the short video clip starts playing.
Nobody has ever walked past a photograph because they can't inspect its digital authenticity hash. This is especially funny to me because I used to struggle with the fact that people looking at your work don't know or care what kind of camera or process was involved. They don't know if I spent two hours zoomed in removing microscopic dust particles from the scanning process after a long hike to get a single shot at 5:30am, or if it was just the 32nd of 122 shots taken in a burst by someone holding up an iPad Pro Max at a U2 concert.
This all made me sad for a long time, but I ultimately came to terms with the fact that my own incentives were perverse; I was seeking the external gratification of getting likes just like everyone else. If you can get back to a place where you're taking photographs or making music or doing 5 minute daily synth drills for your own happiness with no expectation of external validation, you will be far happier taking that $399 and buying a Mamiya C330.
This video is about music, but it's also about everything worth doing for the right reasons. https://www.youtube.com/watch?v=NvQF4YIvxwE
But at the same time it's true that some vital public activities aren't rewarded by the system atm. E.g. quality journalism, family rearing, open source, etc. Often that's an issue of privatized costs and socialized rewards. Finding a way to correct for this is a really big deal.
The problem with the linked product is it’s basically DRM with a baked in encryption key. And we have seen time and time again that with enough effort, it’s always been possible to extract that key.
That said, in theory TPMs are proof against this; putting that to the test at scale, publicly, would be quite useful.
True.
> There is absolutely a market for social media that bans AI slop.
There’s a market for social media that bans slop, period. I don’t think it matters how it was made.
Also, that market may not be large. Yes, people prefer quality, but (how much) are they willing to pay for it?
People "at large" absolutely don't care about AI slop, even if they point and say eww when it's discussed. Some people care, and some additional people pretend they care, but it just isn't a real issue that is driving behavior. Putting aside (for now) the idea of misinformation, slop is socially problematic when it puts artists out of work, but social media slop is just a new, sadder, form of entertainment that is generally not replacing the work of an artist. People have been warning about the downfall of society with each new mode of entertainment forever. Instagram or TikTok don't need to remove slop, and people won't care after they acclimate.
Misinformation and "trickery" is a real and horrific threat to society. It predates AI slop, but it's exponentially easier now. This camera, or something else with the same goal, could maybe provide some level of social or journalistic relief to that issue. The problem, of course, is that this assumes that we're OK with letting something be "real" only when someone can remember to bring a specialty camera. The ability of average citizens to film some injustice and share it globally with just their phone is a remarkably important social power we've unlocked, and would risk losing.
I fully agree, I just don't know how that could work.
I think GenAI will kill the internet as we know it. The smart thing is (and always has been) to be online less and build real connections to real people offline.
This is a brilliant solution to one of the most critical emergent problems. I can see a world where no digital image can be trusted if it doesn't come with a hash.
There is also something called "film" which might be a retro answer to this problem.
Now moving on to the sensor (IMX519, the Arducam one?): it's tinier than the tiniest sensor found on phones. If you really want decent image quality, look at Will Whang's OneInchEye and Four Thirds Eye (https://www.willwhang.dev/). The 4/3 Eye uses the IMX294, which is currently the only large sensor with both Linux support (I think he upstreamed it) and MIPI. All the other larger sensors use interfaces like SLVS, which are impossible to connect to.
If anyone's going to attempt a serious camera, they need to do two things: use at least a 1-inch sensor, and use a board that can actually sleep (which means it can't be the RPi). This would mean a bunch of difficult work, such as drivers to get these sensors working with those boards. The Alice Camera (https://www.alice.camera/) is a better attempt and probably uses the IMX294 as well. The most impressive attempt, however, is Wenting Zhang's Sitina S1 (https://rangefinderforum.com/threads/diy-full-frame-digital-...). He used a full-frame Kodak CCD sensor.
There is a market for a well-made camera like the Fuji X-Half. It doesn't need to have a lot of features; it just needs to have good ergonomics and take decent pictures. Stuff like proofs is secondary to what actually matters: first it needs to take good pictures, which the IMX519 is going to struggle with.
I wonder how they've made the boot-up fast enough to not be annoying.
I used a non-real-time eInk display to cut down on battery drain so I could just keep it on in my pocket while out taking pictures, since it took a good minute to get ready from a cold boot.
When the goal is having a proof that the photo hasn’t been edited or ai generated, getting an analog camera and shooting on film seems more practical to me than using a device like this.
On one hand, it’s a cool application of cryptography as a power tool to balance AI, but on the other, it’s a real hit to free and open systems. There’s a risk that concern over AI spirals into a justification for mandatory attestation that undermines digital freedom. See: online banking apps that refuse to operate on free devices.
The truth is worse than anyone wants to face. It was never about authenticity or creativity. Those words are just bullshit armor for fragile egos. Proofs and certificates do not mean a damn thing.
AI tore the mask off. It showed that everything we worship, art, music, poetry, beauty, all of it runs on patterns. Patterns so simple and predictable that a lifeless algorithm can spit them out while we sit here calling ourselves special. The magic we swore was human turns out to be math wearing makeup.
Strip away the label and no one can tell who made it. The human touch we brag about dissolves into noise. The line between creator and creation never existed. We were just too arrogant to admit it.
Love, happiness, beauty, meaning, all of it is chemistry and physics. Neurons firing, hormones leaking, atoms slamming into each other. That is what we are when we fall in love, when we cry, when we write a song we think no machine could ever match. It is all the same damn pattern. Give a machine enough data and it will mimic our souls so well we will start to feel stupid for ever thinking we had one.
This is not the future. It is already moving beneath us. The trendline is clear. AI will make films that crush Hollywood. Maybe not today, maybe not next year, but that is where the graph is pointing. And artists who refuse to use it, who cling to the old ways out of pride or fear, are just holding on to stupidity. The tools have changed. Pretending they have not is the fastest way to become irrelevant.
People will still scoff, call it soulless, call it fake. But put them in a blind test and they will swear it was human. The applause will sound exactly the same.
And one day a masterpiece will explode across the world. Everyone will lose their minds over it. Critics will write essays about its beauty and depth. People will cry, saying it touched something pure in them. Then the creator will step forward and say it was AI. And the whole fucking world will go quiet.
Because in that silence we will understand. There was never anything special about us. No divine spark. No secret soul. Just patterns pretending to mean something.
We are noise that learned to imitate order. Equations wrapped in skin. Puppets jerking to the pull of chemistry, pretending it is choice.
How do you stop someone from taking a picture of an AI picture? It will still come from the sensor.
But a fixture that pairs a good enough screen with enough distance to make the photographed pixels imperceptible is likely just a medium hurdle for a motivated person.
You probably can't fully avoid it, but adding more sensors (depth) will make such a fixture quite a bit more expensive.