frontpage.

Browse and buy all CS2 skins, cases, capsules, and more

https://pricempire.com/
1•salkahfi•6m ago•0 comments

Allen Lane

https://en.wikipedia.org/wiki/Allen_Lane
2•petethomas•7m ago•0 comments

Open source system profiling, app tracing and analysis for Linux and Android

https://perfetto.dev/
1•teleforce•7m ago•0 comments

Noam Chomsky Slams Žižek and Lacan: Empty 'Posturing' (2013)

https://www.openculture.com/2013/06/noam_chomsky_slams_zizek_and_lacan_empty_posturing.html
1•appreciatorBus•9m ago•0 comments

How to Cancel an Adapundi Loan

1•mananannsnnns•16m ago•0 comments

Future Architecture Technologies: POE2 and VMTE

https://community.arm.com/arm-community-blogs/b/architectures-and-processors-blog/posts/future-ar...
1•c2xlZXB5•16m ago•0 comments

Fluidity Index: Next-Generation Super-Intelligence Benchmarks

https://arxiv.org/abs/2510.20636
2•richasfuck•18m ago•0 comments

How to Cancel an EasyCash Loan

1•mananannsnnns•18m ago•0 comments

Trusted Prompts

https://zero2data.substack.com/p/trusted-prompts
1•wj•21m ago•1 comments

Five countries dominating semiconductor production in 2025

https://www.wionews.com/photos/5-countries-dominating-semiconductor-production-in-2025-1761324120797
1•teleforce•24m ago•0 comments

Acwj: A Compiler Writing Journey

https://github.com/DoctorWkt/acwj
1•pykello•26m ago•0 comments

Hardware Hedging Against Scaling Regime Shifts: Mlscaling

https://old.reddit.com/r/mlscaling/comments/1eyophn/hardware_hedging_against_scaling_regime_shifts/
1•mefengl•31m ago•0 comments

SARS-CoV-2 mRNA vaccines sensitize tumours to immune checkpoint blockade

https://www.nature.com/articles/s41586-025-09655-y
6•croemer•34m ago•0 comments

Fixing Intel Foundry Is Like Stopping Tripping Down the Stairs

https://www.nextplatform.com/2025/10/24/fixing-intel-foundry-is-like-stopping-tripping-down-the-s...
2•ashvardanian•35m ago•0 comments

Retrospective on Weaknesses in Fuzzing Research

https://addisoncrump.info/research/what-the-hell-are-we-doing/
2•todsacerdoti•36m ago•0 comments

A worker fell into a nuclear reactor pool

https://www.nrc.gov/reading-rm/doc-collections/event-status/event/2025/20251022en?brid=vscAjql9kZ...
72•nvahalik•37m ago•35 comments

Clock-keepers prepare to turn back time

https://www.bbc.com/news/articles/c1wl0219p4yo
1•1659447091•38m ago•0 comments

Show HN: Thumbnail Bench 1.0

https://tubesalt.com/thumbnail-bench
1•barefootford•43m ago•0 comments

Microsoft's Halo series heading to rival PlayStation for first time

https://www.bbc.com/news/articles/ckg14442r73o
1•1659447091•45m ago•0 comments

Starter Guide for London Founders

https://www.makeinlondon.com/
1•ashvardanian•48m ago•0 comments

A.I. slop and the epidemic of Bad writing [video]

https://www.youtube.com/watch?v=JJLoLdyJ5-g
2•andy99•48m ago•0 comments

Why the New Leisure Class Enjoys Activism and Philanthropy

https://letter.palladiummag.com/p/early-article-why-the-new-leisure
3•walterbell•52m ago•0 comments

Bay Area chief of police allegedly commutes from Idaho

https://abc7news.com/post/exclusive-millbrae-police-chief-facing-questions-allegedly-commuting-wo...
3•pastureofplenty•54m ago•1 comments

Faster, Higher, Stronger–and Full of Drugs. The Billionaire Quest to Hack Sports

https://www.wsj.com/sports/enhanced-games-swimmer-world-record-doping-c415384b
1•bookofjoe•1h ago•1 comments

Bitter taste preferences are associated with antisocial personality traits

https://www.sciencedirect.com/science/article/abs/pii/S0195666315300428
4•nreece•1h ago•3 comments

Show HN: Convert any MCP server to a Claude Skill (90% context savings)

https://gist.github.com/Felo-Sparticle/69f4b54fb3c67fa9d9d9db78dd615a1d
2•jinfeng79•1h ago•0 comments

The Problem with Farmed Seafood

https://nautil.us/the-problem-with-farmed-seafood-1243674/
1•dnetesn•1h ago•0 comments

China's 'Great Green Wall' brings hope but also hardship

https://www.japantimes.co.jp/news/2025/10/03/asia-pacific/china-great-green-wall/
1•PaulHoule•1h ago•0 comments

Magic Leap raises $1B from Saudi Arabia's Vision 2030 strategy

https://www.cryptopolitan.com/saudi-places-1-billion-bet-on-a-vr-company/
2•Olshansky•1h ago•0 comments

Show HN: Proof of Absence Solved

1•Epistria•1h ago•0 comments

Sora might have a 'pervert' problem on its hands

https://www.businessinsider.com/sora-video-openai-fetish-content-my-face-problem-2025-10
35•zdw•2h ago

Comments

panny•1h ago
Isn't generating fetish content a legitimate business model? Doesn't anyone remember the song,

>The internet is for porn

I actually think this is what's going to happen with AI once the easy money dries up. They'll quickly race to the bottom selling porn generators. AI slop porn already seems like the majority usage after homework generation.

JKCalhoun•1h ago
Don't even bother to look at r/grok. When you have a waifu anime avatar… yeah, it went exactly where you would expect it to. Apparently xAI is reining in the NSFW and people on the sub are livid.
redwood•1h ago
Well thanks for that rabbit hole... 20 mins I'll never get back :P funny tho
QuadmasterXLII•1h ago
The article is describing a feedback loop: very few women consent to have their face be usable -> the perverts vastly outnumber the women, so each woman, who has posted nothing NSFW or even suggestive, gets dedicated attention from many perverts -> other women aren't comfortable allowing their faces to be used by large numbers of perverts -> very few women consent to have their faces be usable.

This is all independent of what is and isn't a legitimate business model; it's a social dynamic. It's also a pretty familiar one: it shows up everywhere from nightclub bouncer policies to the dynamics of early-2000s IRC rooms.

panny•1h ago
You can also use Photoshop to glue a woman's head onto a porn star's body. The results are about the same as AI slop. AI is just faster.
threatofrain•19m ago
The number of perverts vastly outweighs the number of women? Um?
frumplestlatz•1h ago
> I've allowed anyone to make "cameos" using my face. (You don't have to do this …)

… and this is probably where the article should’ve ended. Or in fact, where the author should’ve realized there didn’t need to be an article at all.

People are weird and gross. They do weird things that we would often prefer they not. Sora provided a tool to avoid that weirdness. Use it.

GaryBluto•1h ago
All evidence points to the author trying to make this a moral panic, especially with the emphasis on "real women" being used to generate these (despite the feature meaning they have consented to it)
JKCalhoun•1h ago
Will you be a little outraged though when they use real women without their consent?

If we know anything about software in general (and AI specifically) getting around roadblocks is often a fairly simple thing.

panny•1h ago
>Will you be a little outraged though when they use real women without their consent?

Probably not. AI slop doesn't really "go viral" except when it is super ridiculous like shrimp Jesus. Most people generating AI slop porn are likely in the 10s of people who will see it. If someone generates porn with my face on it and I never even know, how does this harm me? Why should I care?

rafterydj•1h ago
This seems like you are not fully thinking this through, intentionally or not. Giving you the benefit of the doubt, I'll answer your question about why you should care with more questions: What if it's someone you know? What if they want to sell the generated content? What if it was your political enemies? What if it was your boss? What if it was someone who was stalking you or made you feel unsafe?
woodruffw•1h ago
> Most people generating AI slop porn are likely in the 10s of people who will see it. If someone generates porn with my face on it and I never even know, how does this harm me? Why should I care?

You would presumably care if one of those "10s of people" was a family member or peer.

Maybe not caring is enlightened of you, but it shouldn't stretch your imagination to consider why others would.

portaouflop•1h ago
There are dozens of scenarios where this will completely ruin your life. One example: someone generates porn of you with underage children and sends it to everyone you know -> life ruined. You can already do this easily with open-source models.
UncleMeat•46m ago
Have you been in a high school in the past two years? "Your classmates will generate AI porn of you and share it amongst your other classmates" is rampant.
exasperaited•1h ago
Yes. And we also have documented cases where generative AI, in the hands of people with serious psychological issues, is significantly accelerating those people's loss of control of those issues, to very negative outcomes.

The fact that the AI industry is apparently littered with incredibly immature guys who perceive themselves to be Randian superheroes does not reassure me that this tool is going to be better.

frumplestlatz•51m ago
What additional solution would you propose?
torginus•1h ago
People have always been pervy and gross in private, and society has always provided outlets for them to be so while preserving their privacy and dignity.

However, using a wannabe-AI social media platform to engage with this stuff (and said platform encouraging you to do so) crosses several uncomfortable lines for me.

frumplestlatz•58m ago
It crosses lines for me too. So I won’t be allowing people to use my face. QED.
torginus•42m ago
I mean any sort of involvement in this at all: making stuff, being used as a face model, anything. This is semi-public, and the content you make is associated with you, as is the content made about you.
Lerc•1h ago
This does appear to be her shtick. Engaging in a thing so she can report upon it.

https://www.businessinsider.com/threads-meta-engagement-rage...

userbinator•1h ago
> But the idea that someone was making a video that had some potential sexual gratification element made me feel fairly icked out.

Clearly the author has never visited 4chan... but I think seeing others make such content with your appearance should be taken as flattery.

More seriously, I hope this flood of easy video generation will cause people to more easily realise how they can be persuaded, and increase skepticism of evidence in general.

exasperaited•1h ago
> but I think seeing others make such content with your appearance should be taken as flattery.

I think maybe you're an adult male.

Having actually talked to some female friends about this, I'm pretty sure that women in general don't take so well to the idea of tools that might be used to encourage the fantasies of the men that already have a dangerous interest in harming them sexually.

Whether the Valley thinks that's their problem to solve, I doubt. But making a joke out of it is pretty fucked up, dude.

ETA: even women who have done some modelling and are a bit more aware of the way those images are used are at least somewhat concerned about content that can make them act and speak like puppets. This is at least as much about consent as it is about content.

ETA2: I am rate-limited for being an argumentative sod in the past so I will finally edit this to note that 1) I am replying only to the sentence I quoted which has very troubling connotations, and 2) I really think a lot of people here seem not to have read Julian Dibbell's crucial 1993 article "A Rape In Cyberspace" and it really shows.

aleph_minus_one•1h ago
> tools that might be used to encourage the fantasies of the men that already have a dangerous interest in harming them sexually.

This assumption sounds like it was taken from some feminist manifesto.

crooked-v•1h ago
One in five women in the US experience rape or attempted rape during their lifetime, overwhelmingly by someone they already know at the time (https://www.nsvrc.org/statistics).
exasperaited•1h ago
And the majority have experienced some stalking in real life.
afavour•1h ago
OP did use the word “might”. Not saying every man has that dangerous interest but are there men out there with sexually dangerous intent? Without a doubt.
exasperaited•1h ago
More to the point, there are men out there with unhealthy interests that haven't yet, and might never, accelerate into something worse.

Much like when that AI executive dude started talking about layers of hidden reality or whatever it was, that some LLM helped him "find": people were clear that whatever his problems, he might not have blurted that shit out loud, or even developed those thoughts as much, if it were not for the reassurance loop of whatever tool was helping him go a bit more mad.

We understand what happened in his case, right? Perhaps he was keeping that under control and then wasn't, because it was all so plausible.

Now imagine it being video of some young woman realistically depicted doing things she has not consented to do, in the hands of a man who is obsessed and is just keeping that under control. An obsessive fan, for example.

Aeolun•42m ago
You can just as easily imagine that going the other way?

I get your point, but I’ve never seen any research into whether this material makes people more or less likely to actually perpetrate crimes related to it.

A chat loop is a bit different from a static video/photo.

Aeolun•45m ago
Sure, but by that reasoning there are women out there with sexually dangerous intent too.
exasperaited•1h ago
It's a totally reasonable, evidence-backed, dare I say it plainly obvious, non-feminist assumption that it might be used that way, and you know it.
smcin•1h ago
The OP never at any point suggested "harming them sexually". Her objection was to people making fetish content of her (or other real people) without their consent, which is entirely a reasonable objection. Let us not misrepresent her.

(As commented elsewhere, the author expressly opted-in, apparently with the intent of generating ragebait to write an article about.)

anigbrowl•1h ago
> I think seeing others make such content with your appearance should be taken as flattery.

Post body or gtfo

(for those lacking context, this is a callback to a 4chan trope that is inextricable from OP's argument)

yesbut•1h ago
Might? ASCII art has a pervert problem.
appreciatorBus•1h ago
Yup. Likewise cinematography, photography, fiction writing, spoken language, and cave paintings, all have had a pervert problem since day one.
yesbut•1h ago
Humans are perverts. Who knew?
marcellus23•1h ago
> And how do you stop people from making fetish content of purely AI-generated characters that aren't cameos of real people? Does OpenAI want to stop that? Maybe OpenAI thinks it's fine for people to make belly-flation or foot-fetish videos as long as they're not of a real person.

I can't figure out the tone here. Is the author suggesting we should stop people from creating fetish content of purely AI-generated characters? OpenAI might want to for business reasons, but surely there's nothing inherently wrong with using AI for fetish content. Should we also stop people from drawing fetish content with pencil and paper?

fwip•1h ago
I don't think she's passing any judgment here - she pointed out earlier in the article that she knows people are into weird stuff, and didn't want to "yuck their yum."
ivape•1h ago
The problem is that adults contribute to turning public platforms lewd. One lewd person on Instagram leads to many, leading to a lewd platform. This becomes problematic when children take it up. It's not really different from prostitution appearing near general-purpose places; it turns them into a red-light district.

A lot of social media is a sex platform, and it got mixed up in this way because there’s no talking adults out of being lewd in public.

frumplestlatz•53m ago
I wish we could talk adults out of it, but many decades on this earth have convinced me that’s just not going to happen.
smcin•36m ago
A good-faith reading of the sort of suggestion the author never made is for the blanket opt-in consent that lets other users generate images/videos of you to be segmented into separate consents for PG, adult, fetish, etc., and also face vs. whole body. A very clear consent form that tells you upfront: "If you consent to users generating fetish content of you, here are some examples of what's allowed and forbidden."

> Is the author suggesting we should stop people from creating fetish content of purely AI-generated characters?

Presumably not, but she's farming outrage rather than suggesting any fix. In the above suggested setup, people could then generate fetish content only from the much smaller set of users who consented to have fetish content generated of them. But then of course those users might expect some royalties or revenue-sharing, or at least identification/attribution/watermarking so the depicted user could drive traffic to their social media. OpenAI is skirting around not just segmented consent but any concept of revenue-sharing (i.e. OpenAI wants to dip its toe into OnlyFans territory, but without any revenue-sharing or licensing deal with creators).
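
Purely as an illustration, a minimal sketch of what such segmented consent could look like as a settings check; the category names, flags, and function here are invented for this example and are not OpenAI's actual settings model:

    from dataclasses import dataclass, field

    CATEGORIES = ("pg", "adult", "fetish")  # hypothetical consent scopes

    @dataclass
    class CameoConsent:
        allowed_categories: set = field(default_factory=set)  # subset of CATEGORIES
        face_only: bool = True        # face cameo vs. whole-body likeness
        allow_public: bool = False    # everyone vs. friends-only

    def may_generate(consent, category, whole_body, requester_is_friend):
        """Allow a request only if it fits every scope the user opted into."""
        if category not in consent.allowed_categories:
            return False
        if whole_body and consent.face_only:
            return False
        return consent.allow_public or requester_is_friend

    # Example: a user who allows only PG face cameos from friends.
    c = CameoConsent(allowed_categories={"pg"})
    print(may_generate(c, "fetish", whole_body=False, requester_is_friend=True))  # False
    print(may_generate(c, "pg", whole_body=False, requester_is_friend=True))      # True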

aleph_minus_one•4m ago
> OpenAI is skirting around not just segmented consent but any concept of revenue-sharing (i.e. OpenAI wants to dip its toe into OnlyFans territory, but without any revenue-sharing or licensing deal with creators).

OpenAI is still doing basic experiments on which product offerings are well received by users and/or work well and which are not. If some data provided by users (e.g. photos depicting the user) turn out to be so essential to the success of the AI-created content that OpenAI would likely lose an insane amount of money if these users leave (I think this is rather unlikely, but not impossible), then OpenAI will think about some concept of revenue-sharing, but not before (why should they?).

blindriver•1h ago
This is kind of a side issue, but I think the time has come when videoing people in public and using their images without explicit consent should be outlawed. It used to be okay when the only media were TV and movies and distribution was hard, but now it has become a nuisance at best and dangerous at worst. The only exception I would make is for actual journalists with some sort of credentials. Posting videos of unwitting, unconsenting people on social media should be outlawed in an age where anyone with a phone can upload video in seconds.
afavour•1h ago
In a world where absolutely everyone has a camera in their pocket I can’t see how you’d ever be able to enforce this.
lyu07282•44m ago
That's a terrible argument. You should say: because of generative AI, where anybody can fabricate any image/video/audio of anyone in any context without their consent, we urgently need new regulation of social media platforms to accommodate this profound change in our reality. Then make your argument.

You have had no expectation of privacy in public spaces since forever; that was never the problem, because nobody could photograph you stabbing someone and upload it to social media without you actually stabbing someone. This is now different, because anyone can fabricate that photograph of you stabbing someone and post it.

That must be your argument.

And it must be on the social media side, because in X months some open model on GitHub is gonna make every watermark or cloud-based safety feature meaningless anyway.

ungreased0675•1h ago
Wait a sec, you have to specifically opt-out of this? OpenAI needs to be bankrupted for their massive copyright infringement.
_345•1h ago
No, she opted in.
lschueller•1h ago
Yep, guilty as charged
Retr0id•1h ago
Who could possibly have seen this coming.
busymom0•1h ago
> I've allowed anyone to make "cameos" using my face. (You don't have to do this: You can choose settings that make your likeness private, or open to just your friends — but I figured, why not? And left my likeness open to everyone, just like Sam Altman.)

So the author specifically allowed their face to be used for content and then is surprised people acted on it? This is silly imo.

People should just never even allow their face to be used for this.

This reminds me of the "Joan Is Awful" episode of Black Mirror. Exact same story.

portaouflop•1h ago
Tbh if you are sufficiently famous or have a bunch of pictures of yourself available online, you don't need to opt in.

So yes, this is silly, but people can still easily make deepfake porn of you from any available photo with other OSS tools.

Aeolun•1h ago
Sure, but then they’re not immediately public on Sora xD
sathackr•1h ago
Rule 34
portaouflop•1h ago
Making fetish porn has been the No. 1 use case for AI since inception.
cyberax•1h ago
Just force Sora to avoid the faces of real people and instead synthesize a "generic" face. This could be done with training: create a database of photos to avoid and train Sora against it.
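
A minimal sketch of one way such a guard could work, framed as a post-generation rejection check rather than the training-time approach suggested above; the stand-in embeddings, vector size, and threshold are assumptions for illustration, not anything Sora actually does:

    import numpy as np

    def cosine_similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def is_too_close(generated_embedding, protected_embeddings, threshold=0.85):
        """Reject a generated face if it matches any opted-out real face too closely."""
        return any(cosine_similarity(generated_embedding, p) >= threshold
                   for p in protected_embeddings)

    # Toy usage: random vectors stand in for real face-recognition embeddings.
    rng = np.random.default_rng(0)
    protected = [rng.normal(size=128) for _ in range(3)]
    candidate = protected[0] + rng.normal(scale=0.05, size=128)  # near-duplicate face
    print(is_too_close(candidate, protected))  # True -> block or regenerate

In practice the embeddings would come from a face-recognition model run on the generated frames, and the database would hold embeddings of people who have not opted in.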
itake•1h ago
The whole point of Sora is that you can generate photos of yourself and friends...

The app lets you control whether other people can generate photos of you. If the author doesn't want other people to make these photos, she can disable public video generation...

thfuran•1h ago
Is it actually easy to train the AI on all the faces so it can make pictures that look like humans while also training it to not make pictures that look like any specific human?
zb3•1h ago
Journalism might have a "bullshit content" problem on its hands.
doganugurlu•53m ago
I have no idea how I would feel if someone used my face in fetish or sexual content. It has never happened. It'd probably make me uncomfortable. But I imagine I would be OK with it if I had grown up in a culture where sex wasn't as much of a taboo. Maybe I would find it flattering.

My mental test for deciding whether something should be illegal or unacceptable is to ask whether anyone would see it the same way if religion had never existed.

During #metoo I remember reading an article where the author was uncomfortable with the "drug-fueled sex parties in Silicon Valley." They basically didn't want consenting adults to do drugs or engage in group sex. The argument against fetish content with AI-generated characters reminded me of that author's discomfort with the drug/sex freedom of the Bay Area. The article about Sora sounds like the author is uncomfortable with people generating fetish content, regardless of whether the content features real people or not.

It’s sad that the liberals now include the prudes/conservatives.

frumplestlatz•45m ago
I don’t think one is a prude or a conservative for not wanting AI generated porn of themselves to exist.

I also don’t think one is a prude or a conservative for thinking there are consent and power issues around anything that commingles sex and the workplace.

Things can be unacceptable without being illegal. Things can even be unacceptable without needing to be banned or privately controlled.

My bar for what should be unacceptable is a lot lower than my bar for what should be illegal or privately banned.

Making weird pregnancy fetish videos of real people without their permission is definitely unacceptable. I have no issue with the idea that anyone doing that should be shamed.