https://www.reddit.com/r/SQL/comments/10s4608/comment/j722e1...
We could think big and someday do that within the py-pglite project itself.
Let me put it on the v2 roadmap (much more work to do!).
You will need to use an unrolled main loop similar to what we have in PGlite, using "single user mode" (you likely don't want to fork subprocesses like a normal Postgres). The problems come when you then try to run multiple instances in a single process: Postgres makes heavy use of global vars for state (it can, because it forks for each session), and these would clash if you had multiple instances. There is work happening to possibly make Postgres multi-threaded, which will solve that problem.
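The global-state clash is easy to see with a toy sketch (a Python stand-in, not Postgres internals): two "backend instances" in one process share the same module-level variable, so the second one clobbers the first's state.

```python
# Toy illustration (not real Postgres code) of why process-global state
# prevents multiple in-process instances.

current_database = None  # stands in for one of Postgres's many global vars

class ToyBackend:
    def __init__(self, dbname):
        global current_database
        current_database = dbname  # every new instance overwrites the global

    def whoami(self):
        return current_database

a = ToyBackend("alpha")
b = ToyBackend("beta")
print(a.whoami())  # instance `a` now sees instance `b`'s state: "beta"
```

Forked processes each get their own copy of the globals, which is why normal Postgres gets away with this; a single-process embedding does not.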
The long-term ambition of the PGlite project is to create a libpglite, a low-level embedded Postgres with a C API, which will enable all of this. We're quite far off though - happy to have people join the project to help make it happen!
Anything that won't work if you tried this as a drop-in replacement for a full PG instance in tests? Maybe extensions? Any benchmarks?
For now, it was more approachable for me to hack it together in a few hours, and it works.
Would love to see a benchmark between this, pglite and testcontainers.
If that feels hard to you (to set up your app pointing to another DB, run a single E2E-testable part of your app with its own DB connection, etc.), fix that.
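For illustration, a minimal sketch of that kind of indirection, assuming the app reads its DB target from an environment variable (the names `DATABASE_URL`, `get_connection`, and `add_user` are hypothetical, not py-pglite's API; stdlib sqlite3 stands in for the database):

```python
import os
import sqlite3

# Hypothetical app helper: the app resolves its database from an env var,
# so a test can repoint it at an isolated throwaway DB.
def get_connection():
    return sqlite3.connect(os.environ.get("DATABASE_URL", "app.db"))

def add_user(name):
    conn = get_connection()
    conn.execute("CREATE TABLE IF NOT EXISTS users (name TEXT)")
    conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
    conn.commit()
    count = conn.execute("SELECT count(*) FROM users").fetchone()[0]
    conn.close()
    return count

# A test simply repoints the app at its own database before exercising it:
os.environ["DATABASE_URL"] = ":memory:"
```

Once the app has this one seam, swapping a real Postgres URL in for E2E runs is the same one-line change.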
And actually, for the more e2e-style cases, I think it's way better not to use the lite backend.
The non-container solutions tend to do more of the lifecycle management / isolated env prep / tear-down with elegantly designed abstractions, though I think similar abstractions could be built on top of containers.
Ideally, maybe we could eventually have unified abstractions over both container-based backends and WASM to boost DX, with different expectations around speed vs. compatibility.
The last GH Actions jobs with SQLite and Python 3.9 took 3m 41s, and the same tests with Postgres took 4m 11s. Running a single test locally in PyCharm also executes in less than 1 second. You notice some bootstrap happening, but once the container image is downloaded locally, it's really quite fast.
I work on PGlite, we have an experimental WASI build that can run in a WASI runtime, which should enable dropping the Node requirement. It lacks error handling at the moment (WASM has no longjmp, and Postgres uses that for error handling - Emscripten has hacks that fix this via JS), and so we haven't yet pushed it far.
Do ping me on our Discord and I can point you towards it.
Happy to answer any PGlite questions while I'm here!
It does depend on SQLAlchemy. Can this also be used with asyncpg? Is it on the roadmap?
And now this.
Going to use it right away.
> The library automatically manages PGlite npm dependencies.
I'm sorry what? Please don't do this. There is no way you can do this in a way that:
(a) is compatible with every pre-existing in-repo NodeJS setup
(b) plays nicely with all SCA automation configurations (Dependabot, etc.)
---
Edit:
After closer inspection there's a config[1] to disable this, which defaults to True, but no documentation on how to manage the required Node deps when setting that value to False.
I would strongly suggest the authors default this to False, with accompanying docs in the README.
Overall though I do like the simplicity & readability of the codebase - it makes these gotchas really easy to find & verify even without docs - will definitely be using this.
[1] https://github.com/wey-gu/py-pglite/blob/b821cf2cfeb2e4bc58b...
> No PostgreSQL install needed—just Node.js
Postgres is 32MB, Node.js is 63MB. I know that 31MB isn't a huge deal for most folks, but it's hard to see "doesn't require Postgres" as a selling point when you need something else that's twice the size instead.
> Effortless Setup: No PostgreSQL install needed—just Node.js(I know)!
It was just to get a kind of SQLite DX within an hour, so I did it this way.
And then I thought: why not open source it?
Maybe in v2 I could abstract over an actual Postgres binary with the same DX.
Because if you're really interested in testing postgres you can test PG in PG: https://pgtap.org/
So often, what you do when unit testing is test CRUD-type operations between the application and the database. You generally have to spin up a temporary database just to delete it at the end of the tests. Commonly this is done with SQLite during testing, even if you built the application for Postgres in production, because it is fast and easy and you can interact with it as a file, not requiring connection configuration and added bloat.
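The throwaway-database pattern being described looks roughly like this (a sketch using stdlib sqlite3; table and function names are illustrative):

```python
import sqlite3

# Spin up a throwaway in-memory SQLite database for a test, run CRUD
# against it, and let it vanish when the connection closes - no server,
# no connection config, nothing to clean up.
def make_test_db():
    conn = sqlite3.connect(":memory:")  # exists only for this test
    conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
    return conn

def crud_roundtrip(conn):
    conn.execute("INSERT INTO items (name) VALUES (?)", ("widget",))
    name = conn.execute("SELECT name FROM items WHERE id = 1").fetchone()[0]
    conn.execute("UPDATE items SET name = ? WHERE id = 1", ("gadget",))
    conn.execute("DELETE FROM items WHERE id = 1")
    remaining = conn.execute("SELECT count(*) FROM items").fetchone()[0]
    return name, remaining
```

This convenience is exactly what breaks down once the app starts relying on Postgres-only features, which is the gap described next.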
But then sometimes your app gets complicated enough that you are using Postgres features that SQLite has no equivalents for. So now you need a PG instance just for testing, which is a headache.
So this project bridges the gap, giving you the feature completeness and cross-env consistency of using Postgres, but with the convenience of using SQLite.
This is an example of a unit test of an API route on a fully isolated WASM backed Postgres - very few lines of code, and all your API unit tests can run fully in parallel without any shared state: https://github.com/ben-pr-p/bprp-react-router-starter/blob/m...
This is all of the code needed to use Postgres in prod and PGLite in test/dev: https://github.com/ben-pr-p/bprp-react-router-starter/blob/m...
I like it because I can do full stack development, including the database, with a single system level dependency (Bun).
This is a nice project idea. But you should use a Python WASM interpreter to run the PostgreSQL WASM build.