I’ve always been an expert at negotiating with myself to skip a workout. To fix this, I built FitVow: a system where I lock real money into a smart contract, and if I miss my weekly goals, I get fined.
The engineering challenge: how do I prevent my future, lazy self from lying to the contract?
I tried to solve this by anchoring trust in three places:
1. Hardware-backed Proofs: The Android app reads from Health Connect (Galaxy Watch data) and signs a summary using a non-exportable key in the phone's TEE (Trusted Execution Environment). The smart contract verifies this signature on-chain.
2. Amnesiac APK Signing: To prevent myself from just "updating" the app to a version that fakes data, I sign the release with an ephemeral key and permanently delete the keystore. Android ties app identity to the signing key, so I can't push an update; replacing the app means uninstalling it, which destroys the hardware-bound keys and triggers an expensive on-chain penalty.
3. Permissionless Enforcement: The contract is "ownerless" regarding enforcement. If I fail a week, anyone can trigger the penalty function. The enforcer gets a small bounty, and the rest goes to charity (Giveth).
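The sign-then-verify flow in point 1 looks roughly like this Python sketch. Loud caveat: the real system uses an asymmetric, non-exportable TEE keypair, with the contract verifying against the public key on-chain; here a stdlib HMAC stands in purely so the canonicalize → sign → verify shape is runnable. All field names and the key value are illustrative, not from the actual app.

```python
import hmac
import hashlib
import json

# In the real system this key lives in the phone's TEE and never leaves it;
# the contract checks an asymmetric signature. A shared HMAC secret is used
# here ONLY so this sketch runs with the standard library.
DEVICE_KEY = b"tee-bound-key-placeholder"

def sign_summary(summary: dict) -> str:
    """Canonicalize a weekly Health Connect summary and sign it on-device."""
    payload = json.dumps(summary, sort_keys=True).encode()
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()

def verify_summary(summary: dict, signature: str) -> bool:
    """What the contract does: recompute over the claimed data and compare."""
    return hmac.compare_digest(sign_summary(summary), signature)

week = {"week": 3, "workouts": 4, "active_minutes": 180}
sig = sign_summary(week)
assert verify_summary(week, sig)        # honest report passes
week["workouts"] = 7                    # tamper after signing...
assert not verify_summary(week, sig)    # ...and verification fails
```

The important property is that the summary is canonicalized (sorted keys) before signing, so the same data always produces the same signature regardless of field order.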
This is currently a 12-week experiment with ~$235 at stake. It’s not a product—it's a system design puzzle to see if I can make cheating more effort than just doing the workout.
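The permissionless enforcement from point 3 can be modeled in a few lines. This is a sketch, not the actual Solidity: the 10% enforcer bounty and the equal weekly tranches are assumed figures for illustration; only the ~$235 total stake and the 12-week duration come from the post.

```python
# Toy model of the penalty function: after a failed week, ANYONE may call it
# (that is what makes enforcement permissionless). The caller keeps a small
# bounty and the remainder is donated to charity (Giveth in the real system).
BOUNTY_RATE = 0.10  # assumed split; the post does not publish the real rate

def enforce_penalty(stake: float, week_passed: bool):
    """Return (bounty_to_caller, donation_to_charity, remaining_stake)."""
    if week_passed:
        raise ValueError("nothing to enforce: the week's goal was met")
    weekly_slash = stake / 12        # 12-week experiment, equal tranches (assumed)
    bounty = weekly_slash * BOUNTY_RATE
    donation = weekly_slash - bounty
    return bounty, donation, stake - weekly_slash

bounty, donation, remaining = enforce_penalty(235.0, week_passed=False)
```

The bounty is what aligns incentives: strangers have a reason to watch the dashboard and police the vow, so the author can't quietly let a failed week slide.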
Live Dashboard: http://fitvow.pedroaugusto.dev/
Source Code: https://github.com/pedrooaugusto/fitness-unbreakable-vow/
I’d love to hear your thoughts on the security model—especially how you’d try to 'break' the vow without actually exercising.