I've built a Cursor for business users in Rust. Spreadsheets, slideshows, and an agentic loop.
If you're up for it, it would be nice to chat and share stories and vision.
Email is andy at inboard dot ai
From the linked blog.
Gmail, Notion, and Facebook are painfully slow on my high-end laptop with gigabit ethernet. Something is wrong in our modern engineering culture.
The others, probably. VCs are incentivized to fund the people who allocate the most resources towards growth and marketing; as long as the app isn't actively on fire, investors will actively push you away from allocating resources to making your tech good.
What is surprising is that a few years ago, these apps weren’t so terrible on this exact hardware.
I’m convinced that there’s an enormous amount of bloat right at the application framework level.
I finally caved and bought a new M series Mac and the apps are much snappier. But this is simply because the hardware is wicked fast and not because the software got any better.
I really wish consumer apps cared less about user retention and focused more on user empowerment.
Technically, yes. But for many large tech companies it would require a large organisational mindset shift to go from "more features means more promotions means more money" to "a good, stable product with a well-maintained codebase is better", and THAT would require a dramatic shift away from "line must go up" to something more sustainable and less investor/stock obsessed.
The only one I'm currently looking forward to is the next version of Logseq, which will enable collaboration on their existing block-based authoring model.
Here's an alternative.
Assuming your backing store is Postgres, I’d experiment a lot with the various column storage strategies, at various sizes of documents and varying amounts of writes. The TOAST overhead can become a huge bottleneck.
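A rough harness for that kind of experiment might look like the sketch below. Purely illustrative: the table name, document size, iteration counts, and connection string are all made up, and it assumes the Rust `postgres` crate against a local instance.

```rust
// Illustrative benchmark harness only: table name, document size, and
// connection string are invented. Assumes the `postgres` crate and a
// local Postgres instance.
use std::time::Instant;
use postgres::{Client, NoTls};

fn main() -> Result<(), postgres::Error> {
    let mut client = Client::connect("host=localhost user=postgres dbname=bench", NoTls)?;

    // EXTENDED (default): compress, then move the value out of line into TOAST.
    // EXTERNAL: store out of line but uncompressed (cheaper for update-heavy columns).
    // MAIN: prefer keeping the compressed value inline in the heap tuple.
    for strategy in ["EXTENDED", "EXTERNAL", "MAIN"] {
        client.batch_execute(&format!(
            "DROP TABLE IF EXISTS docs;
             CREATE TABLE docs (id bigserial PRIMARY KEY, body text);
             ALTER TABLE docs ALTER COLUMN body SET STORAGE {strategy};"
        ))?;

        // ~512 KiB document: comfortably past the TOAST threshold.
        let payload = "x".repeat(512 * 1024);
        let start = Instant::now();
        for _ in 0..200 {
            client.execute("INSERT INTO docs (body) VALUES ($1)", &[&payload])?;
        }
        println!("{strategy}: 200 inserts took {:?}", start.elapsed());
    }
    Ok(())
}
```

Repeating the same run with smaller documents and with UPDATE-heavy workloads is where the differences between strategies really show up.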
I'd really like to see the team get rewarded for their work, too. I'd be sad if it went 100% open and they didn't so much as draw a market salary from it.
I think if it went open, they'd get nothing. That's the one thing I strongly dislike about open source: only hyperscalers really benefit from it economically.
They've done a remarkable service for all of us.
99.9% of the internet is closed source and we don't ask for it to be opened. From our ISPs, to Google, to the hyperscalers.
If anything, I think we should be asking those things to be open. If we're only asking the little guys, the big guys with trillion dollar market caps skate by. This is exactly how they want it. Fewer gradients for small players to grow.
1) Modern, 2010s-era "OSI Approved open source" is a meme built by hyperscalers to get free work, poach the efforts of others (Amazon makes hundreds of millions on Redis, Elasticsearch, etc.), and eliminate the threat of smaller players.
There are great things like Linux and Blender and ffmpeg. But there is also a concerted battle waged by trillion-dollar companies against us, using "open" to salt the field against any economically salient growth.
By being completely open and not keeping some leverage, you ensure you cannot make the same revenues the big companies can. And they will outspend and outgrow you. They will encircle and even find a way to grow off of your labor while you don't see so much as a dime.
2) You wouldn't be on the internet right now if you really refused to use closed source. The binary blobs in your hardware, your ISP, your wifi. Not even Stallman can do it.
I love open source. But I hate how difficult it is to make money. And I hate how the big players have used it to enrich and entrench themselves by making it just the crust of their closed source empires.
Wow, I didn't know the team was so small - go them!
Except that making their client FOSS would help a lot to replicate the APIs and create a FOSS server, which would definitely make a difference to how they make money.
This.
I spent 6 months exporting 100K notes from Evernote, mostly because they intentionally throttle exports and you can extract your data only in their proprietary format, which truncates some of it.
It's open source and as far as I can tell uses a database.
From past experience, it's even pretty simple to host your own sync server to get away from their account/storage limits.
In any case, I don't particularly enjoy AnyType, despite coming back to it a few times to test it out (and still maintaining my own sync server, despite not actively using it, in case I go back to try it out again after some demonstrable updates). Just pointing out that it's a less restrictive alternative.
I use Zim wiki for everything just now and I don't like it. I'm in the market for a replacement, and would even pay for it, the way Immich does it.
Unless the source code is available or you put it into legal escrow for when you go bust/abandon the software†, I will not invest my time and data into a system where I am entirely dependent on another organisation or service.
† And you will go bust or abandon the software before I die!
Distinctive points:
- It exposes the "database metaphor": your data is organized in collections of documents, each collection having a well-defined schema (see the sketch after this list).
- It's all local in an app (no server component to self-host).
- It has an AI assistant on top that you can use to explore / create / update.
- It allows you to create small personal apps (e.g., a custom dashboard).
- It allows you to sync data from external sources (Strava, Google Calendar, Google Contacts.)
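Purely as an illustration of that metaphor (a hypothetical sketch, not the app's actual code or data model), a collection with a schema validating documents might look like:

```rust
// Hypothetical sketch of the "database metaphor": documents validated
// against a per-collection schema. None of these names come from the app.
use std::collections::HashMap;

enum FieldType { Text, Number, Date }

enum Value { Text(String), Number(f64), Date(String) }

type Document = HashMap<String, Value>;

struct Collection {
    name: String,
    schema: HashMap<String, FieldType>, // field name -> expected type
}

impl Collection {
    // Every schema field must be present in the document with the right type.
    fn validate(&self, doc: &Document) -> Result<(), String> {
        for (field, expected) in &self.schema {
            match (doc.get(field), expected) {
                (Some(Value::Text(_)), FieldType::Text)
                | (Some(Value::Number(_)), FieldType::Number)
                | (Some(Value::Date(_)), FieldType::Date) => {}
                (Some(_), _) => return Err(format!("field `{field}` has the wrong type")),
                (None, _) => return Err(format!("missing field `{field}`")),
            }
        }
        Ok(())
    }
}

fn main() {
    let runs = Collection {
        name: "runs".into(),
        schema: HashMap::from([
            ("title".into(), FieldType::Text),
            ("distance_km".into(), FieldType::Number),
            ("date".into(), FieldType::Date),
        ]),
    };
    let doc: Document = HashMap::from([
        ("title".into(), Value::Text("Morning run".into())),
        ("distance_km".into(), Value::Number(8.4)),
        ("date".into(), Value::Date("2024-05-01".into())),
    ]);
    println!("{}: {:?}", runs.name, runs.validate(&doc));
}
```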
Cons:
- The database metaphor is quite "technical". A "normal" user is not comfortable with the idea of creating their own collections, defining a schema, etc. In fact, right now I only have developers and techies as a target audience.
- It's not optimized for any one use case. So, for example, as a notes-keeper Notion is obviously much better.
- It's still in early stages (I'm working on it alone), so:
- There's no mobile app yet.
- It doesn't yet support syncing between devices.
- There are just 3 connectors to sync from external sources.

I must admit that I don’t archive things like exercise activity. So maybe the simple mindset won’t work then.
First, lots of server-side code is IO-bound; writing it in Rust vs. Java/C# would barely show any difference in a monitoring tool in a real-life scenario.
His authorization system is very limited in scope; of course it can be fast! Get real users and we will see if it stays fast.
When you are running it in production, even if using Zanzibar's approach of loading everything into memory, you'd still need to handle many aspects he didn't think of, like updates to such permissions, and dealing with sharding etc. Things are always more complex in real life.
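To make that concrete: a "load everything into memory" checker is basically a set lookup, which is why it benchmarks well. The sketch below (an illustration, not the linked post's code) shows exactly the write path that concurrent updates, invalidation, and sharding complicate in a real deployment.

```rust
// Naive "everything in memory" relation-tuple store. check() is a hash-set
// membership test, which is why such demos look fast; the hard production
// parts (making writes visible everywhere, invalidation, sharding the set
// across machines) are exactly what this sketch glosses over.
use std::collections::HashSet;
use std::sync::RwLock;

/// (object, relation, user), e.g. ("doc:readme", "viewer", "user:alice")
type Tuple = (String, String, String);

struct PermissionStore {
    tuples: RwLock<HashSet<Tuple>>,
}

impl PermissionStore {
    fn new() -> Self {
        Self { tuples: RwLock::new(HashSet::new()) }
    }

    // Fast path: a single set lookup.
    fn check(&self, object: &str, relation: &str, user: &str) -> bool {
        self.tuples
            .read()
            .unwrap()
            .contains(&(object.to_string(), relation.to_string(), user.to_string()))
    }

    // The part benchmarks usually skip: every write must become visible
    // (and, in a real system, replicated/sharded) without serving stale checks.
    fn write(&self, object: &str, relation: &str, user: &str) {
        self.tuples
            .write()
            .unwrap()
            .insert((object.to_string(), relation.to_string(), user.to_string()));
    }
}

fn main() {
    let store = PermissionStore::new();
    store.write("doc:readme", "viewer", "user:alice");
    assert!(store.check("doc:readme", "viewer", "user:alice"));
    assert!(!store.check("doc:readme", "viewer", "user:bob"));
}
```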
And last but not least, Notion is really fast as it is. I never knew it was slow.
Without bringing any new concept to "Notion", I find it hard to believe this will ever work.
I hope he finds happiness building it though, building is fun!
Notion is a great product for corporations, and I get why companies are jumping on this bandwagon so fast; however, as a consumer, I wouldn't consider it, or any option priced per seat (like Outcrop), or any that wouldn't give me a binary I can use on whatever machine I want.
(On outcrop.app)
How are changes to permissions managed, I wonder.
- new e-mail from client comes in which can't be matched to an existing project? New page in the knowledgebase
- second e-mail from client comes in w/ an attachment? It's stripped off and added to that page in the kb
- employee sends out e-mail with link to the initial version of the project? The link is added to that page
&c.
Maybe AI could make something like that work now?
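For what it's worth, the rule-based half of that flow is easy to sketch; the hard part has always been reliably matching a mail to the right project. A toy sketch with invented types (nothing here reflects any existing product's API):

```rust
// Toy version of the described flow: unmatched mail opens a new page,
// and attachments/links from later mail get filed on the matched page.
struct Email {
    subject: String,
    attachments: Vec<String>,
    links: Vec<String>,
}

#[derive(Default)]
struct Page {
    title: String,
    attachments: Vec<String>,
    links: Vec<String>,
}

struct KnowledgeBase {
    pages: Vec<Page>,
}

impl KnowledgeBase {
    fn ingest(&mut self, mail: Email) {
        // Crude matching: does the subject mention an existing page title?
        let existing = self.pages.iter().position(|p| mail.subject.contains(&p.title));
        let idx = match existing {
            Some(i) => i,
            None => {
                // No match: new page in the knowledgebase.
                self.pages.push(Page { title: mail.subject.clone(), ..Default::default() });
                self.pages.len() - 1
            }
        };
        // File attachments and links on that page.
        let page = &mut self.pages[idx];
        page.attachments.extend(mail.attachments);
        page.links.extend(mail.links);
    }
}

fn main() {
    let mut kb = KnowledgeBase { pages: Vec::new() };
    kb.ingest(Email {
        subject: "Project Orion kickoff".into(),
        attachments: vec!["brief.pdf".into()],
        links: vec![],
    });
    kb.ingest(Email {
        subject: "Re: Project Orion kickoff".into(),
        attachments: vec!["scope-v2.pdf".into()],
        links: vec!["https://example.com/draft".into()],
    });
    println!("{} page(s), first has {} attachment(s)", kb.pages.len(), kb.pages[0].attachments.len());
}
```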
It's missing collaboration at its core. Although it's currently possible to achieve this with third-party solutions, the next major update should also include it, as it's the "multiplayer" update.
We have a novel architecture where you can optionally register a self-hosted relay server with our control plane for complete privacy for all of your docs and attachments.
We know that people typically prefer to have a unified vault, so you can share individual folders with different groups of people within your vault.
Relay is free for markdown docs up to 3 users, and then we have a hobby plan which includes attachment storage (especially popular with D&D and TTRPG players), as well as per-seat plans for businesses and universities. There are a couple of cloud-only alternatives like peerdraft and screen garden as well.
[0] https://relay.md
Maybe Devin can make some GH sync utils for oxide
Preloading authorization data into memory does not, by itself, provide the specific security guarantee (consistency) that defines Zanzibar.
The Zanzibar model is famous not just because it is fast, but because it solves the "New Enemy" problem (or causal consistency). Simple in-memory caching (preloading) often fails this test unless it is paired with complex invalidation logic that mimics Zanzibar's Zookies.
Generally, the solution is to keep a timestamp of when the data changed (Zookies as you mentioned) or you can proactively reload or recompute the cache when the underlying data changes (sometimes in very smart ways), but yeah: it adds significant complications over a "simplified" approach to Zanzibar.
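As a rough illustration of that zookie/timestamp idea (not SpiceDB's actual API), a check can carry a minimum revision, and a replica whose cached snapshot is older than that revision has to refresh before answering rather than serve potentially revoked permissions:

```rust
// Hedged sketch of a zookie-style consistency token. A replica with a
// snapshot older than the caller's token must not answer, because its
// cached ACL data may predate a revocation (the "new enemy" problem).
#[derive(Clone, Copy, PartialEq, PartialOrd, Debug)]
struct Revision(u64); // monotonically increasing ACL version

struct Replica {
    snapshot_revision: Revision, // how fresh this replica's cached ACL data is
}

#[allow(dead_code)]
enum CheckOutcome {
    Allowed,
    Denied,
    /// Snapshot is older than the caller's zookie: answering now could
    /// miss a recent permission change.
    StaleSnapshot { have: Revision, need: Revision },
}

impl Replica {
    fn check(&self, _object: &str, _user: &str, at_least: Revision) -> CheckOutcome {
        if self.snapshot_revision < at_least {
            return CheckOutcome::StaleSnapshot {
                have: self.snapshot_revision,
                need: at_least,
            };
        }
        // Evaluate against the (sufficiently fresh) cached tuples here;
        // a hardcoded "deny" stands in for the real lookup.
        CheckOutcome::Denied
    }
}

fn main() {
    // The caller's zookie says "evaluate at revision 42 or newer" (e.g. the
    // document was saved right after a permission was revoked at revision 42).
    let replica = Replica { snapshot_revision: Revision(41) };
    match replica.check("doc:plans", "user:mallory", Revision(42)) {
        CheckOutcome::StaleSnapshot { have, need } => {
            println!("stale snapshot ({have:?} < {need:?}): refresh before answering");
        }
        CheckOutcome::Allowed => println!("allowed"),
        CheckOutcome::Denied => println!("denied"),
    }
}
```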
Disclaimer: I'm the cofounder and CTO of AuthZed, and we develop SpiceDB [2] and Materialize [3], which have quite a bit of logic around these exact problems.
[1]: https://authzed.com/blog/new-enemies#the-new-enemy-problem [2]: https://spicedb.io [3]: https://authzed.com/docs/authzed/concepts/authzed-materializ...
Then, I'd take a look at the history of SpiceDB [2] for how we developed the system over time.
Finally, if you have any questions, feel free to jump into our Discord [3] and ask: we're happy to answer!
[1]: https://zanzibar.tech/ [2]: https://spicedb.io [3]: https://discord.gg/spicedb