
Show HN: Remove water from speaker using specific frequency

https://speakercleaner.org/
1•artiomyak•57s ago•0 comments

Vercel Acquired Nuxt

https://vercel.com/blog/nuxtlabs-joins-vercel
1•carlual•7m ago•1 comments

Lump of labour fallacy

https://en.wikipedia.org/wiki/Lump_of_labour_fallacy
1•mfiguiere•14m ago•0 comments

The Sequoia Investor Whose Anti-Mamdani Posts Set Off a Silicon Valley Storm

https://www.wsj.com/tech/the-sequoia-investor-whose-anti-mamdani-posts-set-off-a-silicon-valley-storm-56965cdf
2•KnuthIsGod•15m ago•0 comments

The Dark Side of Apple Development

https://www.magiclasso.co/insights/apple-development/
1•ustad•16m ago•0 comments

Psilocybin treatment extends cellular lifespan, improves survival of aged mice

https://www.nature.com/articles/s41514-025-00244-x
1•XzetaU8•21m ago•1 comments

Full event page with photo sharing

https://pixdrop.com
1•yevo_a•28m ago•1 comments

Proving P ≠ NP via Categorical and Graph-Theoretic 3-SAT

https://www.texstr.org/a/naddr1qvzqqqr4gupzqwe6gtf5eu9pgqk334fke8f2ct43ccqe4y2nhetssnypvhge9ce9qqxnzde4xgcrxdfj8qmnvwfc69lg5m
1•vmstabile•30m ago•0 comments

Advancing Protection in Chrome on Android

https://security.googleblog.com/2025/07/advancing-protection-in-chrome-on.html
2•mfrw•31m ago•0 comments

TimescaleDB helped us scale analytics and reporting

https://blog.cloudflare.com/timescaledb-art/
1•arunmu•32m ago•1 comments

Microsoft Music Producer (1996)

https://archive.org/details/microsoft-music-producer_202210
2•CharlesW•34m ago•0 comments

How to stop a bear in a big city: Japan issues shoot-to-kill guide

https://www.thetimes.com/world/asia/article/japan-offers-citizens-instructions-on-how-to-kill-a-bear-dpml5msf3
1•petethomas•34m ago•0 comments

Ask HN: What Problem Would You Solve with Unlimited Resources?

3•hedayet•39m ago•6 comments

What Next?

https://www.rxjourney.net/what-next
1•daviddbrainz•40m ago•0 comments

Nginx-micro: Ultra-minimal, statically-linked, multi-architecture Nginx container

https://github.com/johnnyjoy/nginx-micro
1•thunderbong•52m ago•0 comments

Enlightenment as the Great Filter

https://www.aru.ai/essays/great-filter/
2•aru•52m ago•0 comments

TOML v0.9

https://epage.github.io/blog/2025/07/toml-09/
2•Bogdanp•54m ago•0 comments

Pattern-wishcast: enum pattern types in 2025 rust

https://lunnova.dev/articles/pattern-wishcast/
1•nalllar•55m ago•0 comments

Sia X HackerNoon: Inviting Devs to Build the Decentralized Cloud of the Future

https://sia.tech/blog/sia-x-hackernoon-inviting-developers-to-build-the-future-of-decentralized-cloud-storage
1•smooke•59m ago•0 comments

Benchmark for Evaluating Text Embeddings

https://huggingface.co/spaces/embedding-benchmark/RTEB
1•fzliu•1h ago•0 comments

I'm a 16-Year-Old Self-Taught Developer – Built 700 Projects

2•RajGuruYadav•1h ago•0 comments

Comparing the Climate and Productivity Impacts of a Shrinking Population

https://www.nber.org/papers/w33932
3•alphabetatango•1h ago•0 comments

LM Studio is free for use at work

https://lmstudio.ai/blog/free-for-work
2•CharlesW•1h ago•0 comments

Huawei Whistleblower Alleges Pangu AI Model Plagiarized from Qwen and DeepSeek

https://github.com/HW-whistleblower/True-Story-of-Pangu
1•zero_kool•1h ago•1 comments

Myth of the Brown Recluse: Fact, Fear, and Loathing

https://spiders.ucr.edu/myth-brown-recluse-fact-fear-and-loathing
2•indigodaddy•1h ago•2 comments

Jagadish Chandra Bose

https://en.wikipedia.org/wiki/Jagadish_Chandra_Bose
2•Bluestein•1h ago•0 comments

Bash-5.3-Release Available

https://lwn.net/Articles/1029079/
2•ossusermivami•1h ago•0 comments

Quick web stack for vanilla JavaScript

https://www.npmjs.com/package/instaserve
1•throwaway20174•1h ago•0 comments

Mattel unveils first Barbie doll with type 1 diabetes

https://www.yahoo.com/news/mattel-unveils-first-barbie-doll-with-type-1-diabetes-we-knew-the-time-was-right-200026414.html
3•hbcondo714•1h ago•4 comments

Convert JSON -> SQL with a handy web tool

https://widgita.xyz/jsonsql
1•fairlight1337•1h ago•1 comments

Instant SQL for results as you type in DuckDB UI

https://motherduck.com/blog/introducing-instant-sql/
378•ryguyrg•2mo ago

Comments

ryguyrg•2mo ago
In DuckDB UI and MotherDuck.

Awesome video of feature: https://youtu.be/aFDUlyeMBc8

Disclaimer: I’m a co-founder at MotherDuck.

rancar2•2mo ago
Thanks for sharing this update with the world and including it on the local ui too.

Feature request: enable the tuning of when Instant SQL is run and displayed. The erroring out with flashing updates at nearly every keystroke while expanding on a query is distracting for me personally (my brain goes into troubleshooting vs thinking mode). I like the feature (so I will keep it on by default), but I’d like to have a few modes for it depending on my working context (specifically tuning of update frequency at separation characters [space, comma], end of statement [semicolon/newline], and injections [paste/autocomplete]).

hamilton•2mo ago
Great feedback! Thanks. We agree w/ the red errors. It's not helpful when it feels like your editor is screaming at you.
theLiminator•2mo ago
Curious if there has been any thought given to open sourcing the UI? Of course there's no obligation to though!
hamilton•2mo ago
We do have plans. It's a question of effort, not business / philosophy.
rastignack•2mo ago
That's good to know. I work in a heavily regulated workplace and our data usage is constantly monitored.

Good to know a totally offline tool is being considered.

Thanks for the great tool BTW.

theLiminator•2mo ago
Thank you, that's awesome to hear!
d0100•2mo ago
That would be nice, as it would spare us the effort of replicating the UI ourselves, however half-baked our version would be.
strgcmc•2mo ago
This is probably stupid, but in the hope of helping others by exposing my own ignorance -- I'm having trouble actually installing and running the preview... I've downloaded the preview release duckdb binary itself, then when I try to run "duckdb -ui", I'm getting this error:

Extension Autoloading Error: An error occurred while trying to automatically install the required extension 'ui': Failed to download extension "ui" at URL "http://extensions.duckdb.org/0069af20ab/osx_arm64/ui.duckdb_..." (HTTP 403) Extension "ui" is an existing extension.

Is it looking to download the preview version of the extension, but getting blocked/unauthorized (hence the 403 forbidden response)? Or is there something about the auto-loading behavior that I'm supposed to disable maybe?

1egg0myegg0•2mo ago
Sorry you hit that! This is actually already working on version 1.2.2. Could you install that version? That should get you going for the moment! We will dig into what you ran into.
strgcmc•2mo ago
All good, v1.2.2 works fine, thank you!
carlineng•2mo ago
I just watched the author of this feature and blog post give a talk at the DataCouncil conference in Oakland, and it is obvious what a huge amount of craft, ingenuity, and care went into building it. Congratulations to Hamilton and the MotherDuck team for an awesome launch!
ryguyrg•2mo ago
wohoo! glad you noticed that. Hamilton is amazing.
wodenokoto•2mo ago
Is that talk available online?
carlineng•2mo ago
Not yet, but I believe the DataCouncil staff recorded it and will post it to their YouTube channel sometime in the next few weeks: https://www.youtube.com/@DataCouncil/videos
XCSme•2mo ago
I hope this doesn't work with DELETE queries.
ryguyrg•2mo ago
ROFL
codetrotter•2mo ago
ROFL FROM jokes WHERE thats_a_new_one;
falcor84•2mo ago
Maybe in the next version they could also implement support for DROP, with autocorrect for the nearest (not yet dropped) table name.
clgeoio•2mo ago
LLM-powered queries that run in Agent mode so it can answer questions about your data before you know what to ask.
XCSme•2mo ago
That's actually not a bad idea, to have LLM autocomplete when you write queries, especially if you first add a comment at the top saying what you want to achieve:

// Select all orders for users registered in last year, and compute average earnings per user

SELECT ...

ako•2mo ago
That already works in Windsurf. I've created unit tests in Go where I just wrote a short comment in the unit test saying what data to query, and Windsurf would autocomplete the full SQL.
XCSme•2mo ago
I mean, all LLMs do this already, but I've never seen LLM autocomplete in a db tool (e.g. phpMyAdmin, MongoDB Compass, etc).
Covenant0028•2mo ago
Vibe SQLing is where it's at
krferriter•2mo ago
DELETED 0 rows. Did you mean `where 1=1`? (click accept to re-run with new where clause)
munk-a•2mo ago
Or, for extra fun, it auto completes to DROP TRIGGER and just drops a single random trigger from your database. It'll help counter automation fears by ensuring your DBAs get to have a wonderful weekend on payroll where, very much in the easter spirit, they can hunt through the DB looking for the one thing that should be there but isn't!
falcor84•2mo ago
Wow, that's perhaps the most nefarious version of chaos engineering I've ever heard of. Kudos!
crmi•2mo ago
Young bobby tables at it again
worldsayshi•2mo ago
Can't it just run inside a transaction that isn't committed?
matsonj•2mo ago
for clarity: Instant SQL won't automatically run queries that write or delete data or metadata. It only runs queries that read data.
d0100•2mo ago
And it's a happy coincidence that json_serialize_sql doesn't work with anything but SELECT queries
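
A minimal sketch of that check, assuming the DuckDB Python package; the is_previewable helper here is hypothetical rather than MotherDuck's actual code:

    import json
    import duckdb

    def is_previewable(sql: str) -> bool:
        # json_serialize_sql only handles SELECT statements; anything else
        # comes back as an error payload (or raises), so refuse to preview it.
        try:
            serialized = duckdb.execute(
                "SELECT json_serialize_sql(?)", [sql]
            ).fetchone()[0]
        except duckdb.Error:
            return False
        return not json.loads(serialized).get("error", False)

    print(is_previewable("SELECT 42"))          # True
    print(is_previewable("DELETE FROM users"))  # False
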
ayhanfuat•2mo ago
CTE inspection is amazing. I spend too much time doing that manually.
hamilton•2mo ago
Me too (author of the post here). In fact, I was watching a seasoned data engineer at MotherDuck show me how they would attempt to debug a regex in a CTE. As a longtime SQL user, I felt the pain immediately; haven't we all been there before? Instant SQL followed from that.
RobinL•2mo ago
Agree, definitely amazing feature. In the Python API you can get somewhere close with this kind of thing:

    input_data = duckdb.sql("SELECT * FROM read_parquet('...')")
    step_1 = duckdb.sql("SELECT ... FROM input_data JOIN ...")
    step_2 = duckdb.sql("SELECT ... FROM step_1")
    final = duckdb.sql("SELECT ... FROM step_2;")

ako•2mo ago
In datagrip you can select part of a query and execute it to see its result.
sannysanoff•2mo ago
Please finally add q language with proper integration to your tables so that our precious q-SQL is available there. Stop reinventing the wheel, let's at least catch up to the previous generation (in terms of convenience). Make the final step.
datadrivenangel•2mo ago
What is q-SQL?
indeyets•2mo ago
https://code.kx.com/q/basics/qsql/
cess11•2mo ago
Maybe they're busy so it might be faster if you do it instead.
sannysanoff•2mo ago
My intentions are good. I'm advising the right approach. No offense intended. (I have my own job to do. They may consider doing what I suggest; it will then be their job to do.)
makotech221•2mo ago
Delete From dbo.users w...

(129304 rows affected)

CurtHagenlocher•2mo ago
The blog specifically says that they're getting the SQL AST so presumably they would not execute something like a DELETE.
hamilton•2mo ago
Correct. We only enable fast previews for SELECT statements, which is the actual hard problem. This said, at some point we're likely to also add support for previewing a CTAS before you actually run it.
buremba•2mo ago
I remember your demos of visualizing the CTEs of a huge query in the editor. I'm looking forward to trying it!
makotech221•2mo ago
Cool. Now, there's this thing called a joke...
wodenokoto•2mo ago
Will this be available in duckdb -ui ?

Are MotherDuck editor features available on-prem? My understanding is that MotherDuck is a data warehouse SaaS.

1egg0myegg0•2mo ago
It is already available in the local DuckDB UI! Let us know what you think!

-Customer software engineer at MotherDuck

ukuina•2mo ago
Does local DuckDB UI work without an internet connection?
wodenokoto•2mo ago
I’m pretty sure it doesn’t. My understanding is it gets downloaded at startup and then runs offline.

Kinda like regex101, draw.io or excalidraw.

jephly•2mo ago
(DuckDB UI developer here)

It doesn't currently - the UI assets are loaded at runtime - but we do have an offline mode planned. See https://github.com/duckdb/duckdb-ui/issues/62.

mritchie712•2mo ago
A fun function in DuckDB (which I think they're using here) is `json_serialize_sql`. It returns a JSON AST of the SQL:

    SELECT json_serialize_sql('SELECT 2');



    [
        {
            "json_serialize_sql('SELECT 2')": {
                "error": false,
                "statements": [
                    {
                        "node": {
                            "type": "SELECT_NODE",
                            "modifiers": [],
                            "cte_map": {
                                "map": []
                            },
                            "select_list": [
                                {
                                    "class": "CONSTANT",
                                    "type": "VALUE_CONSTANT",
                                    "alias": "",
                                    "query_location": 7,
                                    "value": {
                                        "type": {
                                            "id": "INTEGER",
                                            "type_info": null
                                        },
                                        "is_null": false,
                                        "value": 2
                                    }
                                }
                            ],
                            "from_table": {
                                "type": "EMPTY",
                                "alias": "",
                                "sample": null,
                                "query_location": 18446744073709551615
                            },
                            "where_clause": null,
                            "group_expressions": [],
                            "group_sets": [],
                            "aggregate_handling": "STANDARD_HANDLING",
                            "having": null,
                            "sample": null,
                            "qualify": null
                        },
                        "named_param_map": []
                    }
                ]
            }
        }
    ]
hamilton•2mo ago
Indeed, we are! We worked with DuckDB Labs to add the query_location information, which we're also enriching with the tokenizer to draw a path through the AST to the cursor location. I've been wanting to do this since forever, and now that we have it, there's actually a long tail of inspection / debugging / enrichment features we can add to our SQL editor.
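
A rough illustration of the idea (not the actual editor code): in the serialized AST, query_location values are character offsets into the original SQL (18446744073709551615 is the "no position" sentinel), so a client can look for the node that starts closest to, but not after, the cursor. This glosses over the tokenizer enrichment described above:

    import json
    import duckdb

    SQL = "WITH cte AS (SELECT 1 AS a) SELECT a + 1 AS b FROM cte"
    CURSOR = 20                          # an offset inside the CTE's SELECT
    NO_LOCATION = 18446744073709551615   # DuckDB's "no position" sentinel

    ast = json.loads(
        duckdb.execute("SELECT json_serialize_sql(?)", [SQL]).fetchone()[0]
    )

    def walk(node, hits):
        # Collect (offset, node kind) for every AST node that carries a position.
        if isinstance(node, dict):
            loc = node.get("query_location")
            if isinstance(loc, int) and loc != NO_LOCATION:
                hits.append((loc, node.get("type") or node.get("class")))
            for child in node.values():
                walk(child, hits)
        elif isinstance(node, list):
            for child in node:
                walk(child, hits)

    hits = []
    walk(ast, hits)
    # The node starting closest before the cursor is a crude stand-in for
    # "the part of the query the cursor is in".
    print(max((h for h in hits if h[0] <= CURSOR), key=lambda h: h[0]))
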
krferriter•2mo ago
This is a very cool feature. I don't know how useful it is or how I'd use it right now but I think I am going to get into some benchmarking and performance tweaking soon and this could be handy.
RobinL•2mo ago
Can you go the other way? (E.g. edit the above and turn it back into SQL string)

I've used sqlglot to do this in the past, but doing it natively would be nice

hamilton•2mo ago
It can, but it doesn't format. You can even run the AST!
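
For the curious, here's roughly what the round trip looks like from Python. If I'm reading the DuckDB json extension docs right, json_deserialize_sql turns the AST back into (unformatted) SQL text, and json_execute_serialized_sql executes the serialized form directly:

    import duckdb

    con = duckdb.connect()

    # SQL -> JSON AST
    ast = con.execute(
        "SELECT json_serialize_sql('SELECT 1 + 2 AS answer')"
    ).fetchone()[0]

    # JSON AST -> SQL text again (valid, but not pretty-printed)
    print(con.execute("SELECT json_deserialize_sql(?)", [ast]).fetchone()[0])

    # Or skip the text entirely and run the serialized AST
    print(con.execute(
        "PRAGMA json_execute_serialized_sql("
        "json_serialize_sql('SELECT 1 + 2 AS answer'))"
    ).fetchall())
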
hk1337•2mo ago
First time seeing the from at the top of the query and I am not sure how I feel about it. It seems useful but I am so used to select...from.

I'm assuming it's more of a user preference, like commas in front of the field instead of after it?

hamilton•2mo ago
You can use any variation of DuckDB valid syntax that you want! I prefer to put from first just because I think it's better, but Instant SQL works with traditional select __ from __ queries.
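
A quick runnable check of both orderings with DuckDB's Python API (throwaway table and made-up names):

    import duckdb

    con = duckdb.connect()
    con.execute(
        "CREATE TABLE t AS SELECT * FROM (VALUES (1, 'a'), (2, 'b')) AS v(id, tag)"
    )

    # FROM-first and traditional orderings of the same query.
    print(con.sql("FROM t SELECT id, tag"))
    print(con.sql("SELECT id, tag FROM t"))

    # A bare FROM works too; SELECT * is implied.
    print(con.sql("FROM t"))
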
ltbarcly3•2mo ago
Yes it comes from a desire to impose intuition from other contexts onto something instead of building intuition with that thing.

SQL is a declarative language. The ordering of the statements was carefully thought through.

I will say it's harmless though, the clauses don't have any dependency in terms of meaning so it's fine to just allow them to be reordered in terms of the meaning of the query, but that's true of lots and lots of things in programming and just having a convention is usually better than allowing anything.

For example, you could totally allow this to be legal:

  def
      for x in whatever:
          print(x)
  print_whatever(whatever):
There's nothing ambiguous about it, but why? Like if you are used to seeing it one way it just makes it more confusing to read, and if you aren't used to seeing it the normal way you should at least somewhat master something before you try to improve it through cosmetic tweaks.

I think you see this all the time, people try to impose their own comfort onto things for no actual improvement.

whstl•2mo ago
No, it comes from wanting to make autocompletion easier and to make variable scoping/method ordering make sense within LINQ. It is an actual improvement in this regard.

LINQ popularized it and others followed. It does what it says.

Btw: saying that "people try to impose their own comfort" is uncalled for.

ltbarcly3•2mo ago
In that case you are just objectively incorrect: you can build a far, far more efficient autocomplete in the standard query order. I will guess something like half as many keystrokes to type the same select and from clauses. You are imagining a very naive autocomplete that can only guess columns after it knows the tables, but in reality you can guess most of the columns, including the first one, the tables, and the aliases. Names in dbs are incredibly sparse, and duplicate names don't make autocomplete less effective.

If you are right about why they did it, it's even dumber than my reason: they are changing a language grammar to let them make a much worse solution to the same problem.

pests•2mo ago
I don’t want to type any column names. When you start with FROM the only autocomplete suggestions available are the columns from the specific table, not the entire database. How many columns do I need to type before you can narrow it down to a single table? What if you have multiple tables with the same column names?
ltbarcly3•2mo ago
This is extremely easy to check. It depends on the schema.

If your tables have very heterogeneous column names, then on average a single column will identify the table. There will be some duplicates, but the median number of columns needed will be one or two, and generally you can even complete those after a few characters.

If your database has very homogeneous column names, you don't need to identify a single table for autocomplete to be very precise, unless there is no correlation between column name co-occurrence within tables. However, if there is no correlation you are back to a very low number of columns needed to identify the table.

whstl•2mo ago
An autocomplete that shows only the column names of the desired table BEFORE the from clause is typed by the user would require a time machine.

Sure you can do something that is close enough, but the LINQ authors were looking for precision in the autocompletion and for the LINQ query to have the same ordering as expression syntax.

The goals of this syntax are very precise and people seem to like it. Once again: calling it dumb is uncalled for.

ltbarcly3•2mo ago
So you want it to work this way, regardless of how well autocomplete works? Sounds like it's about your personal comfort to make it work like another system you are more familiar with, which is exactly what I suggested.

It doesn't require a time machine, just a basic understanding of statistics or probability.

whstl•2mo ago
I’m comfortable with pretty much anything. It took me like 2 mins to get used to this syntax in LINQ.

On the other hand, statistical autocomplete is not as good as a precise autocomplete that doesn't require jumping around lines.

My point here is that different people enjoy different things. There is no need to shit on other people’s accomplishments or preferences.

ltbarcly3•2mo ago
Name one thing that uses autocomplete like I am describing.
ltbarcly3•2mo ago
This is such a bizarre feature.
hamilton•2mo ago
What about it is bizarre?
pixl97•2mo ago
It's probably different for duckdb, but from something like Microsoft SQL tossing off these random queries at a database of any size could have some weird performance impacts. For example statistics on columns you don't want them on, unindexed queries with slow performance, temp tables being dumped out to disk, etc.
hamilton•2mo ago
I agree; one thing that is neat about Instant SQL is that, for many reasons, you can't do this in any other DBMS. You really need DuckDB's specific architecture and ergonomics.
thenaturalist•2mo ago
On first glance possibly, on second glance not at all.

First, repeat data analyst queries are a usage driver in SQL DBs. Think iterating the code and executing again.

Another huge factor in the same vein is running dev pipelines with limited data to validate a change works when modelling complex data.

This is currently a FE feature, but underneath lies effective caching.

The underlying tech is driving down usage cost which is a big thing for data practitioners.

potatohead24•2mo ago
It's neat but the CTE selection bit errors out more often than not & erroneously selects more than the current CTE
hamilton•2mo ago
Can you say more? Where does it error out? Sounds like a bug; if you could post an example query, I bet we can fix that.
jpambrun•2mo ago
I really like DuckDB's notebooks for exploration, and this feature makes them even more awesome, but the fact that I can't share, export, or commit them into a git repo feels extremely limiting. It's neat-ish that it dogfoods and stores them in a DuckDB database. It even seems to store historical versions, but I can't really do anything with it.
hamilton•2mo ago
Definitely something we want too! (I'm the author / lead for the UI)
RyanHamilton•2mo ago
Local markdown file based sql notebooks: https://www.timestored.com/sqlnotebook Disclaimer: I'm the author
akshayka•2mo ago
You can try marimo notebooks, which are stored as pure Python and support SQL cells through duckdb. (I’m one of its authors.)

https://github.com/marimo-team/marimo

crazygringo•2mo ago
Edit: never mind, thanks for the replies! I had missed the part where it showed visualizing subqueries, which is what I wanted but didn't think it did. This looks very helpful indeed!
Noumenon72•2mo ago
The article says it does subqueries:

> Getting the AST is a big step forward, but we still need a way to take your cursor position in the editor and map it to a path through this AST. Otherwise, we can’t know which part of the query you're interested in previewing. So we built some simple tools that pair DuckDB’s parser with its tokenizer to enrich the parse tree, which we then use to pinpoint the start and end of all nodes, clauses, and select statements. This cursor-to-AST mapping enables us to show you a preview of exactly the SELECT statement you're working on, no matter where it appears in a complex query.

hamilton•2mo ago
You should read the post! This is what the feature does.
geysersam•2mo ago
> What would be helpful would be to be able to visualize intermediate results -- if my cursor is inside of a subquery, show me the results of that subquery.

But that's exactly what they show in the blog post??

jakozaur•2mo ago
It would be even better if SQL had pipe syntax. SQL is amazing, but its ordering isn’t intuitive, and only CTEs provide a reliable way to preview intermediate results. With pipes, each step could clearly show intermediate outputs.

Example:

    FROM orders
    |> WHERE order_date >= '2024-01-01'
    |> AGGREGATE SUM(order_amount) AS total_spent GROUP BY customer_id
    |> WHERE total_spent > 1000
    |> INNER JOIN customers USING(customer_id)
    |> CALL ENRICH.APOLLO(EMAIL > customers.email)
    |> AGGREGATE COUNT(*) high_value_customer GROUP BY company.country

metadata•2mo ago
Google SQL has it now:

https://cloud.google.com/blog/products/data-analytics/simpli...

It's pretty neat:

    FROM mydataset.Produce
    |> WHERE sales > 0
    |> AGGREGATE SUM(sales) AS total_sales, COUNT(*) AS num_sales
       GROUP BY item;
Edit: formatting
ryguyrg•2mo ago
note that DuckDB allows that reverse ordering (FROM-first)

FROM table SELECT foo, bar WHERE zoo='goo'

viggity•2mo ago
it makes intellisense/autocomplete work a hell of a lot easier. LINQ in dotnet does the same thing.
crooked-v•2mo ago
I suspect you'll like PRQL: https://github.com/PRQL/prql
hamilton•2mo ago
Obviously one advantage of SQL is everyone knows it. But conceptually, I agree. I think [1]Malloy is also doing some really fantastic work in this area.

This is one of the reasons I'm excited about DuckDB's upcoming [2]PEG parser. If they can pull it off, we could have alternative dialects that run on DuckDB.

[1] https://www.malloydata.dev/ [2] https://duckdb.org/2024/11/22/runtime-extensible-parsers.htm...

wodenokoto•2mo ago
I haven’t tested but I believe there’s a prql extension for duckdb
tstack•2mo ago
The PRQL[1] syntax is built around pipelines and works pretty well.

I added a similar "get results as you type" feature to the SQLite integration in the Logfile Navigator (lnav)[2]. When entering PRQL queries, the preview will show the results for the current and previous stages of the pipeline. When you move the cursor around, the previews update accordingly. I was waiting years for something like PRQL to implement this since doing it with regular SQL requires more knowledge of the syntax and I didn't want to go down that path.

[1] - https://prql-lang.org [2] - https://lnav.org/2024/03/29/prql-support.html

mritchie712•2mo ago
there's a PRQL extension for duckdb:

https://community-extensions.duckdb.org/extensions/prql.html

RyanHamilton•2mo ago
If you want to get started with PRQL, check out qStudio (https://www.timestored.com/qstudio/prql-ide). It allows running PRQL easily against MySQL, PostgreSQL, DuckDB, etc.
cdchhs•2mo ago
that syntax is horrendous.
da_chicken•2mo ago
While I would certainly agree with you that putting the FROM clause first would be a significant improvement to SQL, and that the original ordering was a genuine design mistake, this otherwise feels more like just wanting SQL to be less declarative and more imperative. Wanting it to be more like LINQ and less like relational algebra.

That, I think, is most developers' real sticking point with SQL. It's not the object-relational impedance mismatch between their application and the data store; it's an imperative-declarative impedance mismatch with their preferred or demonstrated way of thinking. They are used to thinking about problems in exactly one way, so when they struggle to adapt to a different way of thinking about problems, they assume the familiar way is the more correct one.

I think this is why the same developers insist that XML/HTML is "just a markup language." Feeding a document into an executable to produce output isn't really significantly different from feeding an imperative language into a compiler. The only real difference is that one is Turing complete, but Turing completeness is not a requirement of programming languages.

NDizzle•2mo ago
This is the stuff nightmares are made out of. Keep that style of coding out of any project I’m involved in, please.
sidpatil•2mo ago
What do you dislike about that style?
bb86754•2mo ago
He/she isn't used to it. Any R, Elixir, or F# developer would be right at home with this syntax.
Vaslo•2mo ago
I moved from pandas and SQLite to polars and DuckDB. Such an improvement in these new tools.
arsalanb•2mo ago
Check out livedocs.com, we built a notebook around Polars and DuckDB (disclaimer: I'm the founder)
xdkyx•2mo ago
Does it work as fast with more complicated queries with joins/havings and large tables?
porridgeraisin•2mo ago
This is just so good. I wish redash had this...
jwilber•2mo ago
Amazing work. Motherduck and the duckdb ecosystem have done a great job of gathering talented engineers with great taste. Craftsmanship may be the word I’m looking for - I always look forward to their releases.

I spent the first two quarters of 2024 working on observability for a build-the-plane-as-you-fly-it style project. I can’t express how useful the cte preview would have been for debugging.

almosthere•2mo ago
Wow, I used DuckDB in my last job, and have to say it was impressive for its speed. Now it's more useful than ever.
motoboi•2mo ago
DuckDB is missing a killer feature by not having a pipe syntax like Kusto or Google's pipe query syntax.

Why is it a killer feature? First of all, LLMs complete text from left to right. That alone is a killer feature.

But for us meatboxes with less compute power, pipe syntax allows (much better) code completion.

Pipe syntax is delightful to work with and makes going back to SQL a real bummer moment (please insert meme of Katy Perry kissing the earth here).

ergest•2mo ago
There’s an extension for that https://github.com/ywelsch/duckdb-psql
Philpax•2mo ago
Also https://github.com/ywelsch/duckdb-prql (by the same author!)
gervwyk•2mo ago
Nothing comes close to the power of MongoDB aggregation pipelines. When used in production apps, it reduces the amount of code significantly for us by doing data modeling as close as possible to the source.
sterlinm•2mo ago
[grizzled kdb+ user considers starting an argument but then thinks better of it]
hantusk•2mo ago
CTEs go a long way towards left to right readability while keeping everything standard SQL.
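
A small runnable illustration with made-up data (DuckDB's Python API): each CTE is one left-to-right stage, and previewing a stage is just selecting from that CTE, which is roughly what the CTE inspection in the post automates:

    import duckdb

    con = duckdb.connect()
    con.execute("""
        CREATE TABLE orders AS
        SELECT * FROM (VALUES
            (1, DATE '2024-02-01', 600.0),
            (1, DATE '2024-03-15', 700.0),
            (2, DATE '2023-12-31',  50.0)
        ) AS t(customer_id, order_date, order_amount)
    """)

    pipeline = """
        WITH recent AS (            -- stage 1: keep this year's orders
            SELECT * FROM orders WHERE order_date >= DATE '2024-01-01'
        ),
        per_customer AS (           -- stage 2: aggregate per customer
            SELECT customer_id, SUM(order_amount) AS total_spent
            FROM recent GROUP BY customer_id
        )
        SELECT * FROM per_customer WHERE total_spent > 1000  -- stage 3: filter
    """
    print(con.sql(pipeline))
    # To preview stage 2, keep the same WITH block and end with
    # "SELECT * FROM per_customer" instead of the final filter.
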
gitroom•2mo ago
honestly this kind of instant feedback would've saved me tons of headaches in the past - do you think all these layers of tooling are making SQL beginners pick it up faster or just overwhelming them?
arrty88•2mo ago
It looks cool, but I wish I could just see the entire table that I'm about to query. I always start my queries with a quick `select * from table limit 10;` then go about adding the columns and joins.
matsonj•2mo ago
`from my_table`

will do the same!

We are working on how to make it easy to switch from instant sql -> run query -> instant sql

acdanger•2mo ago
Does DuckDB UI support spatial visualizations ? Would be great to be able to use the UI with the spatial extensions.
1egg0myegg0•2mo ago
We support spatial calculations in the UI, but not spatial visualizations just yet. Thanks for the feedback!
acdanger•2mo ago
Just emphasizing that the ability to display a map with geo data on it would be a killer feature for me and for many others I work with! Hope it lands on the roadmap.
r3tr0•2mo ago
We are working on something similar over at yeet.

Except for system performance data.

You can checkout our sandbox at

https://yeet.cx/play

cess11•2mo ago
At times I've done crude implementations of similar functionality, by basically just taking the current string on change and concatenating with " LIMIT 20" before passing it to the database API and then rerendering a table if the result is an associative array rather than an error message.

I think this would be better if it was combined with information about valid words in the cursor position, which would likely be a bit more involved but achievable through querying the schema and settling on a subset of SQL. It would help people that aren't already fluent in SQL to extract the data they want. Perhaps allow them to click the suggestions to add them to the query.

I've done partial implementations of this too, that query the schema for table or column names. It's very cheap even on large, complex schemas, so it's fine to just throw every change at the database and check what drops out. In practice I didn't get much out of either beyond the fun of hacking up an ephemeral tool, or I would probably have built some small product around it.
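
That crude approach fits in a few lines with DuckDB's Python API. This is only a sketch: the preview helper and the sample table are made up, and it wraps the draft in a subquery rather than plain string concatenation so an existing LIMIT or trailing semicolon doesn't break it:

    import duckdb

    con = duckdb.connect()
    con.execute(
        "CREATE TABLE people AS "
        "SELECT * FROM (VALUES ('Ada', 36), ('Linus', 55)) AS t(name, age)"
    )

    def preview(draft: str, limit: int = 20):
        # Called on every editor change: cap the row count, then either return
        # rows to render or the error message to display instead of a table.
        try:
            rows = con.execute(
                f"SELECT * FROM ({draft.rstrip('; ')}) AS p LIMIT {limit}"
            ).fetchall()
            return {"ok": True, "rows": rows}
        except duckdb.Error as exc:
            return {"ok": False, "error": str(exc)}

    print(preview("SELECT name FROM people WHERE age > 40"))
    print(preview("SELECT name FROM peple"))   # typo: the error text comes back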

owlstuffing•2mo ago
Cool tool, even cooler when paired with the manifold project for SQL[1], which has fantastic support for type-safe, native DuckDB syntax.

1. https://github.com/manifold-systems/manifold/blob/master/man...

biophysboy•2mo ago
If there are any DuckDB engineers here, I just want you to know that your tool has been incredible for my work in bioinformatics/biotech. It has the flexibility/simplicity that biological data (messy, changing constantly) requires.
Jgrubb•2mo ago
There's something about this commercial company embracing this OSS project that I love that I very much don't love.