frontpage.

Show HN: MCP to get latest dependency package and tool versions

https://github.com/MShekow/package-version-check-mcp
1•mshekow•1m ago•0 comments

The better you get at something, the harder it becomes to do

https://seekingtrust.substack.com/p/improving-at-writing-made-me-almost
2•FinnLobsien•3m ago•0 comments

Show HN: WP Float – Archive WordPress blogs to free static hosting

https://wpfloat.netlify.app/
1•zizoulegrande•5m ago•0 comments

Show HN: I Hacked My Family's Meal Planning with an App

https://mealjar.app
1•melvinzammit•5m ago•0 comments

Sony BMG copy protection rootkit scandal

https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootkit_scandal
1•basilikum•7m ago•0 comments

The Future of Systems

https://novlabs.ai/mission/
2•tekbog•8m ago•1 comments

NASA now allowing astronauts to bring their smartphones on space missions

https://twitter.com/NASAAdmin/status/2019259382962307393
2•gbugniot•13m ago•0 comments

Claude Code Is the Inflection Point

https://newsletter.semianalysis.com/p/claude-code-is-the-inflection-point
2•throwaw12•14m ago•1 comments

Show HN: MicroClaw – Agentic AI Assistant for Telegram, Built in Rust

https://github.com/microclaw/microclaw
1•everettjf•14m ago•2 comments

Show HN: Omni-BLAS – 4x faster matrix multiplication via Monte Carlo sampling

https://github.com/AleatorAI/OMNI-BLAS
1•LowSpecEng•15m ago•1 comments

The AI-Ready Software Developer: Conclusion – Same Game, Different Dice

https://codemanship.wordpress.com/2026/01/05/the-ai-ready-software-developer-conclusion-same-game...
1•lifeisstillgood•17m ago•0 comments

AI Agent Automates Google Stock Analysis from Financial Reports

https://pardusai.org/view/54c6646b9e273bbe103b76256a91a7f30da624062a8a6eeb16febfe403efd078
1•JasonHEIN•20m ago•0 comments

Voxtral Realtime 4B Pure C Implementation

https://github.com/antirez/voxtral.c
2•andreabat•23m ago•1 comments

I Was Trapped in Chinese Mafia Crypto Slavery [video]

https://www.youtube.com/watch?v=zOcNaWmmn0A
2•mgh2•29m ago•0 comments

U.S. CBP Reported Employee Arrests (FY2020 – FYTD)

https://www.cbp.gov/newsroom/stats/reported-employee-arrests
1•ludicrousdispla•31m ago•0 comments

Show HN: I built a free UCP checker – see if AI agents can find your store

https://ucphub.ai/ucp-store-check/
2•vladeta•36m ago•1 comments

Show HN: SVGV – A Real-Time Vector Video Format for Budget Hardware

https://github.com/thealidev/VectorVision-SVGV
1•thealidev•38m ago•0 comments

Study of 150 developers shows AI generated code no harder to maintain long term

https://www.youtube.com/watch?v=b9EbCb5A408
1•lifeisstillgood•38m ago•0 comments

Spotify now requires premium accounts for developer mode API access

https://www.neowin.net/news/spotify-now-requires-premium-accounts-for-developer-mode-api-access/
1•bundie•41m ago•0 comments

When Albert Einstein Moved to Princeton

https://twitter.com/Math_files/status/2020017485815456224
1•keepamovin•42m ago•0 comments

Agents.md as a Dark Signal

https://joshmock.com/post/2026-agents-md-as-a-dark-signal/
2•birdculture•44m ago•0 comments

System time, clocks, and their syncing in macOS

https://eclecticlight.co/2025/05/21/system-time-clocks-and-their-syncing-in-macos/
1•fanf2•45m ago•0 comments

McCLIM and 7GUIs – Part 1: The Counter

https://turtleware.eu/posts/McCLIM-and-7GUIs---Part-1-The-Counter.html
2•ramenbytes•48m ago•0 comments

So whats the next word, then? Almost-no-math intro to transformer models

https://matthias-kainer.de/blog/posts/so-whats-the-next-word-then-/
1•oesimania•49m ago•0 comments

Ed Zitron: The Hater's Guide to Microsoft

https://bsky.app/profile/edzitron.com/post/3me7ibeym2c2n
2•vintagedave•52m ago•1 comments

UK infants ill after drinking contaminated baby formula of Nestle and Danone

https://www.bbc.com/news/articles/c931rxnwn3lo
1•__natty__•53m ago•0 comments

Show HN: Android-based audio player for seniors – Homer Audio Player

https://homeraudioplayer.app
3•cinusek•53m ago•2 comments

Starter Template for Ory Kratos

https://github.com/Samuelk0nrad/docker-ory
1•samuel_0xK•55m ago•0 comments

LLMs are powerful, but enterprises are deterministic by nature

2•prateekdalal•58m ago•0 comments

Make your iPad 3 a touchscreen for your computer

https://github.com/lemonjesus/ipad-touch-screen
2•0y•1h ago•1 comments

Speech and Language Processing (3rd ed. draft)

https://web.stanford.edu/~jurafsky/slp3/
64•atomicnature•2mo ago

Comments

brandonb•1mo ago
I learned speech recognition from the 2nd edition of Jurafsky's book (2008). The field has changed so much it sometimes feels unrecognizable. Instead of hidden Markov models, Gaussian mixture models, tri-phone state trees, finite state transducers, and so on, nearly the whole stack has been eaten from the inside out by neural networks.

But, there's benefit to the fact that deep learning is now the "lingua franca" across machine learning fields. In 2008, I would have struggled to usefully share ideas with, say, a researcher working on computer vision.

Now neural networks act as a shared language across ML, and ideas can much more easily flow across speech recognition, computer vision, AI in medicine, robotics, and so on. People can flow too, e.g., Dario Amodei got his start working on Baidu's DeepSpeech model and now runs Anthropic.

Makes it a very interesting time to work in applied AI.

ForceBru•1mo ago
> Gaussian mixture models

In what fields did neural networks replace Gaussian mixtures?

brandonb•1mo ago
The acoustic model of a speech recognizer used to be a GMM, which mapped a pre-processed acoustic signal vector (generally MFCCs, Mel-Frequency Cepstral Coefficients) to an HMM state.

Now those layers are neural nets, so acoustic pre-processing, GMM, and HMM are all subsumed by the neural network and trained end-to-end.

One early piece of work here was DeepSpeech2 (2015): https://arxiv.org/pdf/1512.02595
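The GMM-acoustic-model lookup described above can be sketched in a few lines. Everything here is invented for illustration (the toy dimensions, the two hypothetical states, the hand-set mixture parameters); a real recognizer would train these with EM and feed the scores into HMM decoding:

```python
import numpy as np

D = 13  # number of MFCC-like coefficients per frame (typical, but arbitrary here)

def log_gauss_diag(x, mean, var):
    """Log density of a diagonal-covariance Gaussian."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def gmm_log_likelihood(x, weights, means, variances):
    """log p(x | state) for a mixture of diagonal Gaussians (log-sum-exp for stability)."""
    comp = [np.log(w) + log_gauss_diag(x, m, v)
            for w, m, v in zip(weights, means, variances)]
    m = max(comp)
    return m + np.log(sum(np.exp(c - m) for c in comp))

def best_state(frame, state_gmms):
    """Acoustic-model lookup: which HMM state explains this frame best?"""
    scores = [gmm_log_likelihood(frame, *g) for g in state_gmms]
    return int(np.argmax(scores))

# Two hypothetical HMM states, two mixture components each.
state_gmms = []
for center in (-1.0, 1.0):
    means = [np.full(D, center), np.full(D, center + 0.5)]
    variances = [np.ones(D), np.ones(D)]
    state_gmms.append(([0.5, 0.5], means, variances))

frame = np.full(D, 1.1)          # a frame near the second state's means
print(best_state(frame, state_gmms))  # → 1
```

In the end-to-end systems that replaced this, the feature extraction, the per-state likelihoods, and the HMM itself are all absorbed into one network trained jointly.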

ForceBru•1mo ago
Interesting, thanks!

roadside_picnic•1mo ago
In addition to all this, I also feel we have been getting so much progress so fast down the NN path that we haven't really had time to take a breath and understand what's going on.

When you work closely with transformers for a while, you do start to see things reminiscent of old-school NLP pop up: decoder-only LLMs are really just fancy Markov chains with a very powerful/sophisticated state representation, "attention" looks a lot like learning kernels for various tweaks on kernel smoothing, etc.

Oddly, I almost think another AI winter (or hopefully just an AI cool-down) would give researchers and practitioners alike a chance to start exploring these models more closely. I'm a bit surprised how few people really spend their time messing with the internals of these things, and every time they do, something interesting seems to come out of it. But currently nobody I know in this space, from researchers to product folks, seems to have time to catch their breath, let alone really reflect on the state of the field.
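The "attention as kernel smoothing" analogy can be made concrete. This is a toy numpy sketch, not code from any actual model: scaled dot-product attention is written as a Nadaraya-Watson smoother, output_i = Σ_j K(q_i, k_j) v_j / Σ_j K(q_i, k_j), where the kernel is exp(q·k / √d) and the softmax is exactly the smoother's normalization:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Single-head attention, viewed as kernel smoothing over the values."""
    d = Q.shape[-1]
    # exp(q.k / sqrt(d)) is the (unnormalized) kernel weight between a
    # query and each key; softmax normalizes the weights to sum to 1.
    weights = softmax(Q @ K.T / np.sqrt(d))
    return weights @ V  # each output row is a convex combination of value rows

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 queries
K = rng.normal(size=(6, 8))   # 6 keys
V = rng.normal(size=(6, 8))   # 6 values
out = attention(Q, K, V)
print(out.shape)  # (4, 8)
```

The difference from classical kernel smoothing is that the "kernel" here is applied in a learned projection space, which is arguably the whole trick.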

bawis•1mo ago
> we haven't really had time to take a breath and understand what's going on.

The field of Explainable AI (or other equivalent names, interpretable AI, transparent AI etc) is looking for talent, both in academia and industry.

miki123211•1mo ago
There are sectors where pre-ML approaches still dominate.

Among screen reader users for example, formant-based TTS is still wildly popular, and I don't think that's going to change anytime soon. The speed, predictability and responsiveness are unmatched by any newer technology.

mfalcon•1mo ago
I was eagerly waiting for a chapter on semantic similarity as I was using Universal Sentence Encoder for paraphrase detection, then LLMs showed up before that chapter :).

MarkusQ•1mo ago
Latecomers to the field may be tempted to write this off as antiquated (though updated to cover transformers, attention, etc.) but a better framing would be that it is _grounded_. Understanding the range of related approaches is key to understanding the current dominant paradigm.

languagehacker•1mo ago
Good old Jurafsky and Martin. Got to meet Dan Jurafsky when he visited UT back in '07 or so -- cool guy.

This one and Manning and Schutze's "Dice Book" (Foundations of Statistical Natural Language Processing) were what got me into computational linguistics, and eventually web development.

ivape•1mo ago
Were NLP people able to cleanly transition? I'm assuming the field is completely dead. They may actually be patient zero of the LLM-driven unemployment outbreak.

jll29•1mo ago
One can feel for the authors; it's such a struggle to write a textbook in a time when NeurIPS gets 20,000 submissions and ACL has 6,500 registered attendees (as of August '05), and every day dozens of relevant arXiv pre-prints appear.

Controversial opinion (certainly the publisher would disagree with me): I would not take out older material, but arrange it by properties like explanatory power/transparency/interpretability, generative capacity, robustness, computational efficiency, and memory footprint. For each machine learning method, an example NLP model/application could be shown to demonstrate it.

Naive Bayes is way too useful to downgrade it to an appendix position.

It may also make sense to divide the book into timeless material (Part I: what's a morpheme? what's a word sense?) and (Part II) methods and datasets that change every decade.

This is the broadest introductory book for beginners and a must-read; like the ACL family of conferences it is (nowadays) more of an NLP book (i.e., on engineering applications) than a computational linguistics (i.e., modeling/explaining how language-based communication works) book.
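On the point above about Naive Bayes being too useful for an appendix: the whole classifier fits in a screenful. This is a toy multinomial Naive Bayes with add-one smoothing in the spirit of the book's text-classification chapter; the training data and labels are invented for illustration:

```python
from collections import Counter
import math

# Tiny invented corpus: (document, label) pairs.
train = [("buy cheap pills now", "spam"),
         ("cheap cheap offer buy", "spam"),
         ("meeting schedule for today", "ham"),
         ("lunch meeting today", "ham")]

vocab = {w for doc, _ in train for w in doc.split()}
counts = {"spam": Counter(), "ham": Counter()}   # per-class word counts
docs = Counter()                                 # per-class document counts
for doc, label in train:
    docs[label] += 1
    counts[label].update(doc.split())

def log_posterior(doc, label):
    """log P(label) + sum_w log P(w | label), with add-one smoothing."""
    total = sum(counts[label].values())
    lp = math.log(docs[label] / len(train))
    for w in doc.split():
        if w in vocab:  # ignore out-of-vocabulary words
            lp += math.log((counts[label][w] + 1) / (total + len(vocab)))
    return lp

def classify(doc):
    return max(("spam", "ham"), key=lambda c: log_posterior(doc, c))

print(classify("cheap pills offer"))       # → spam
print(classify("schedule lunch meeting"))  # → ham
```

Hard to argue something this cheap, interpretable, and often competitive as a baseline belongs in an appendix.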

aanet•1mo ago
This is the OG among the Computational Linguistics books. Very glad it exists and is being revised.

Newcomers to the field should be glad to read through this... there is gold in there. <3

I got my start in NLP back in '08 and later in '12 with an older version of this book. Recommended!