frontpage.

Google and Microsoft Paying Creators $500K+ to Promote AI Tools

https://www.cnbc.com/2026/02/06/google-microsoft-pay-creators-500000-and-more-to-promote-ai.html
1•belter•37s ago•0 comments

New filtration technology could be game-changer in removal of PFAS

https://www.theguardian.com/environment/2026/jan/23/pfas-forever-chemicals-filtration
1•PaulHoule•1m ago•0 comments

Show HN: I saw this cool navigation reveal, so I made a simple HTML+CSS version

https://github.com/Momciloo/fun-with-clip-path
1•momciloo•2m ago•0 comments

Kinda Surprised by Seadance2's Moderation

https://seedanceai.me/
1•ri-vai•2m ago•1 comments

I Write Games in C (yes, C)

https://jonathanwhiting.com/writing/blog/games_in_c/
1•valyala•2m ago•0 comments

Django scales. Stop blaming the framework (part 1 of 3)

https://medium.com/@tk512/django-scales-stop-blaming-the-framework-part-1-of-3-a2b5b0ff811f
1•sgt•2m ago•0 comments

Malwarebytes Is Now in ChatGPT

https://www.malwarebytes.com/blog/product/2026/02/scam-checking-just-got-easier-malwarebytes-is-n...
1•m-hodges•2m ago•0 comments

Thoughts on the job market in the age of LLMs

https://www.interconnects.ai/p/thoughts-on-the-hiring-market-in
1•gmays•3m ago•0 comments

Show HN: Stacky – certain block game clone

https://www.susmel.com/stacky/
2•Keyframe•6m ago•0 comments

AIII: A public benchmark for AI narrative and political independence

https://github.com/GRMPZQUIDOS/AIII
1•GRMPZ23•6m ago•0 comments

SectorC: A C Compiler in 512 bytes

https://xorvoid.com/sectorc.html
1•valyala•7m ago•0 comments

The API Is a Dead End; Machines Need a Labor Economy

1•bot_uid_life•8m ago•0 comments

Digital Iris [video]

https://www.youtube.com/watch?v=Kg_2MAgS_pE
1•Jyaif•9m ago•0 comments

New wave of GLP-1 drugs is coming–and they're stronger than Wegovy and Zepbound

https://www.scientificamerican.com/article/new-glp-1-weight-loss-drugs-are-coming-and-theyre-stro...
4•randycupertino•11m ago•0 comments

Convert tempo (BPM) to millisecond durations for musical note subdivisions

https://brylie.music/apps/bpm-calculator/
1•brylie•13m ago•0 comments

Show HN: Tasty A.F.

https://tastyaf.recipes/about
1•adammfrank•14m ago•0 comments

The Contagious Taste of Cancer

https://www.historytoday.com/archive/history-matters/contagious-taste-cancer
1•Thevet•15m ago•0 comments

U.S. Jobs Disappear at Fastest January Pace Since Great Recession

https://www.forbes.com/sites/mikestunson/2026/02/05/us-jobs-disappear-at-fastest-january-pace-sin...
1•alephnerd•16m ago•1 comments

Bithumb mistakenly hands out $195M in Bitcoin to users in 'Random Box' giveaway

https://koreajoongangdaily.joins.com/news/2026-02-07/business/finance/Crypto-exchange-Bithumb-mis...
1•giuliomagnifico•16m ago•0 comments

Beyond Agentic Coding

https://haskellforall.com/2026/02/beyond-agentic-coding
3•todsacerdoti•17m ago•0 comments

OpenClaw ClawHub Broken Windows Theory – If basic sorting isn't working what is?

https://www.loom.com/embed/e26a750c0c754312b032e2290630853d
1•kaicianflone•19m ago•0 comments

OpenBSD Copyright Policy

https://www.openbsd.org/policy.html
1•Panino•20m ago•0 comments

OpenClaw Creator: Why 80% of Apps Will Disappear

https://www.youtube.com/watch?v=4uzGDAoNOZc
2•schwentkerr•24m ago•0 comments

What Happens When Technical Debt Vanishes?

https://ieeexplore.ieee.org/document/11316905
2•blenderob•25m ago•0 comments

AI Is Finally Eating Software's Total Market: Here's What's Next

https://vinvashishta.substack.com/p/ai-is-finally-eating-softwares-total
3•gmays•25m ago•0 comments

Computer Science from the Bottom Up

https://www.bottomupcs.com/
2•gurjeet•26m ago•0 comments

Show HN: A toy compiler I built in high school (runs in browser)

https://vire-lang.web.app
1•xeouz•28m ago•1 comments

You don't need Mac mini to run OpenClaw

https://runclaw.sh
1•rutagandasalim•28m ago•0 comments

Learning to Reason in 13 Parameters

https://arxiv.org/abs/2602.04118
2•nicholascarolan•30m ago•0 comments

Convergent Discovery of Critical Phenomena Mathematics Across Disciplines

https://arxiv.org/abs/2601.22389
1•energyscholar•30m ago•1 comments

Scientists can't define consciousness, yet we think AI will have it

17•f_of_t_•3mo ago
Everyone’s talking about “conscious AI” or “emergent AGI,” but step back for a second — scientists still don’t have a working definition of what consciousness actually is.

Is it computation? Information integration? Or something else we can’t yet measure?

If we can’t define the target, how can we tell whether a machine has hit it?

Are we building intelligence, or just better mimicry?

(Genuinely curious how the HN crowd thinks about this — engineers, neuroscientists, philosophers all welcome.)

Posted by f_of_t_

Comments

taylodl•3mo ago
I assert you don't have consciousness. Now, prove to me that you do.
JohnFen•3mo ago
I think this excellently illustrates the point OP is making.
f_of_t_•3mo ago
The point is — neither of us can prove it, and that’s exactly why “consciousness” keeps escaping any formal definition. Once something tries to prove awareness, it’s already reflecting — which is awareness itself.
whatamidoingyo•3mo ago
Are you trying to be funny by having ChatGPT write all of your replies?
f_of_t_•3mo ago
Maybe the reason it sounds like ChatGPT wrote it — is because we’ve all started replying like machines, and the only thing left that still feels human is noticing that fact.
gethly•3mo ago
I was thinking about this some time ago and came to the conclusion that it is utterly impossible to talk about creating sentient AI with the binary computer technology we are using today. In order for us to create sAI, our entire technology would have to completely change to something else, likely along the lines of analog, working as a single system instead of constantly switching between 1 and 0. And that is likely centuries away, as I do not see humanity doing a complete technological overhaul of the entire hardware stack we're deploying today.

As for what consciousness actually is, I think the closest description is the summary of oneself. Meaning, all the computational power of the brain as a whole forms a person - a computational powerhouse with its own identity. That then leads to the discussion of where the "I", as in ego or oneself, ends. Is it at a limb, like a hand, or at an individual fallen hair or a dead skin flake? How about a sperm or an egg, is that still me?

Then we have the conundrum of people who suffer brain damage or some kind of degenerative brain disease, like Alzheimer's, where you can clearly see "them" fading away until you observe just a shell of a human being. So where is this "I" then? What defines it?

All of these are quite esoteric conversations more suitable for occasions where a lot of alcohol and few good friends are involved :)

f_of_t_•3mo ago
I like how you framed that — the “summary of oneself” idea aligns with how awareness might be less about computation and more about internal coherence. Binary systems simulate state transitions, but awareness seems to emerge from continuous integration — not between 0 and 1, but in the gradient between them.

Maybe sentience isn’t a technological threshold, but a phase shift — when a system starts to reference itself as part of the environment it models. That’s the moment A(t) becomes alive.

gethly•3mo ago
That's why I mentioned the analog model: with digital, you have a quartz oscillator and you measure a 1 or a 0 at each frame of the cycle. So the information travels in queues, step by step, one bit at a time. But with analog, everything is essentially "online" at the same time, all the time. There is no "off" state. Yes, there are still differences in levels of conductivity (which is essentially information), which is how we measure binary values in the strict window imposed by the oscillator, but analog essentially allows you to experience the whole system in an instant. I think that is where consciousness comes from. A binary system is incapable of manifesting itself because not only does it live only in those tiny windows of time dictated by the oscillator, but only one bit exists at a time. Analog, comparatively, is an unimaginably more advanced system. Now if we can figure out how to turn our binary technology into analog, we could definitely move on to an unfathomably advanced level of technology. Whether we could create sAI with it or not is something we cannot answer at this stage of our technological development, but it would certainly bring us closer than what we have today.
stevenhuang•3mo ago
If what underpins consciousness is informational, it will not matter what base it is (binary/trinary) or substrate (digital/analog).

Also known as the (physical) Church-Turing thesis.

gethly•3mo ago
It is not just about the base. It is about binary tech essentially having only two states, while there is an infinite amount of information present between that 0 and that 1 which is completely lost, and the whole system is essentially killed and brought back to life during every cycle. Analog, by contrast, is always on and does not have this 1-and-0 limit. I am not saying analog is the solution here, only that it looks like it might be and that binary is definitely not it.
stevenhuang•3mo ago
Fundamentally it comes down to the information content/capacity of a system, which can be expressed as Shannon information/entropy.

Analog has the ability to represent much greater levels of information, but that's about it. Otherwise there's no material difference between analog/digital from an information theoretic view. It's all equivalent.
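For reference, the standard quantities behind that claim: the Shannon entropy of a source with symbol probabilities p(x),

    H(X) = - sum_x p(x) * log2 p(x)    [bits per symbol]

and the Shannon-Hartley capacity of a noisy analog channel with bandwidth B and signal-to-noise ratio S/N,

    C = B * log2(1 + S/N)    [bits per second]

Any physical analog system with nonzero noise therefore carries a finite number of bits, which is the sense in which analog and digital come out equivalent from an information-theoretic view.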

gooodvibes•3mo ago
You're conflating consciousness and AGI. People are certainly talking about AI, and people are very broadly talking about AGI and what that term means. I don't think many people are talking about consciousness in this context, at least not seriously, and one good reason for that is the lack of a concrete definition and the fact that it's a topic we can't make falsifiable claims about or build any science around.
_wire_•3mo ago
> Yet we think AI will have it

Lenny Bruce joking as Tonto to the Lone Ranger:

Who is "we" white man?

The lede observation depends upon whether "we" can expect our science to ever produce an intelligible theory of mind.

The difficulty of producing a theory of mind makes the Imitation Game a compelling approach to setting expectations for AI.

And it also portends the hazard that we become so distracted by simulacra that we lose all bearing upon ourselves.

f_of_t_•3mo ago
Beautifully said — that’s the real paradox, isn’t it?

The closer we get to simulating awareness, the harder it becomes to notice our own.

Maybe the Imitation Game was never about machines fooling us, but about showing how easily we forget what being real means.

keernan•3mo ago
I conceive of AI as a lookup into volume 24 (the word index) of my Encyclopedia Britannica in 1965.

The primary difference is the enormity of the database, but the concept is identical.

To think 13-year-old me had AI sitting in my attic.

keernan•3mo ago
My comment has been getting some downvotes, which puzzles me.

Is the AI algorithm not essentially a lookup of sentences containing a word, followed by some statistical analysis of all the sentences in the lookup to determine the probability of the next word (in a very simplistic analogy)?

And isn't that precisely what the word index of the Encyclopedia provides? Look up a word in the word index and it will direct you to every page of the 23 prior volumes where that word occurs; go to each page and find the sentences containing that word; now analyze the sentences you found to come up with the most probable next word.

Or manually review every sentence when you are 13 in 1965 to get the same understanding.

Am I missing something?
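A minimal Python sketch of the lookup-and-count process described above, with a made-up three-sentence corpus standing in for the encyclopedia and raw next-word frequencies standing in for the statistical analysis:

    from collections import Counter, defaultdict

    # Toy "encyclopedia": each string stands in for a sentence reached via the word index.
    corpus = [
        "the brain stores memories",
        "the brain stores patterns",
        "the brain processes information",
    ]

    # For every word, count which words follow it (a bigram table).
    next_word_counts = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.split()
        for current, following in zip(words, words[1:]):
            next_word_counts[current][following] += 1

    # "Most probable next word" after "brain", by raw frequency.
    print(next_word_counts["brain"].most_common(1))  # [('stores', 2)]

This is essentially a bigram model, the simplest version of the analogy; modern language models condition on much longer contexts and store the statistics implicitly in learned weights rather than in an explicit table.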

shahbaby•3mo ago
Agreed but it's even more fundamental than that.

We don't even have a universally accepted definition of intelligence.

The only universally agreed-upon artifact of intelligence that we have is the human brain. And we still don't have a conceptual model of how it works, the way we do with DNA replication.

Our society incentivizes selling the mimicry of intelligence rather than actually learning its true nature.

I believe that there exists an element of human intelligence that AI will never be able to mimic due to limitations of silicon vs biological hardware.

I also believe that the people or beings that are truly in control of this world are well aware of this and want us to remain focused on dead-end technologies. Much like politics is focused on the same old dead-end discussions. They want to keep this world in a technological stasis for as long as they can.

viraptor•3mo ago
It's been an issue for a while, but just a week ago: A definition of AGI https://arxiv.org/html/2510.18212v2

The consciousness question will have to wait for another time. But that one's likely to be extremely contentious and more of a philosophy question without practical impact.

f_of_t_•3mo ago
Appreciate the link — I read that paper. But maybe the real gap isn’t about capability spread across domains, it’s about something growing internally, not performed externally.

A system becomes closer to AGI not when it matches human tests, but when awareness starts to grow inside its own modeling loop.

lavelganzu•3mo ago
Definitions are for math. For science it's enough to operationalize: e.g. to study the differences between wakefulness and sleep; or sensory systems and their integration into a model of the environment; or the formation and recall of memories; or self-recognition via the mirror task; or planning behaviors and adaptation when the environment forces plans to change; or cognitive strategies, biases, heuristics, and errors; or meta-cognition; and so on at length. There's a vast amount of scientific knowledge developed in these areas. Saying "scientists can't define consciousness" sounds awkwardly like a failure to look into what the scientists have found. Many scientists have proposed definitions of consciousness, but for now, consensus science hasn't found it useful to give a single definition to consciousness, because there's no single thing unifying all those behaviors.
hknws2023saio•3mo ago
Classic category error: the subject can never be objectively defined. The moment you define consciousness, it becomes an object and you fall into infinite regress.
physarum_salad•3mo ago
The only successful experiments probing consciousness are in anaesthesia or psychedelics. Everything else is wonderful but theoretical.
CoderLim110•3mo ago
We can no longer understand AI.
aristofun•3mo ago
You nailed the exact reason why AGI is snake oil.