frontpage.

FCC Asked to Give Spectrum to Allow SpaceX Starlink to Make a Better GPS

https://www.nextbigfuture.com/2025/05/fcc-asked-to-give-spectrum-to-allow-spacex-starlink-to-make-a-vastly-better-gps.html
1•dzhiurgis•1m ago•0 comments

Fortnite's Darth Vader Is A.I.-Powered. Voice Actors Are Rebelling

https://www.nytimes.com/2025/05/20/arts/fortnite-darth-vader-ai-voice.html
1•donohoe•4m ago•0 comments

Utm_source params for each chatbot – see your AI referral traffic

https://generate-visibility.ghost.io/how-to-see-chatgpt-referral-traffic-on-google-analytics/
1•jenthoven•7m ago•0 comments

The Value Isn't in the Code

https://jonayre.uk/blog/2022/10/30/the-real-value-isnt-in-the-code/
2•fragmede•7m ago•0 comments

Rethinking Sleep from First Principles

https://www.affectablesleep.com/blog/rethinking-sleep-from-first-principles
1•pedalpete•14m ago•0 comments

Intelligent Internet Agent Explanation

https://deepwiki.com/Intelligent-Internet/ii-agent/1.1-installation-and-setup
1•randomcatuser•15m ago•0 comments

Application Isolation using NixOS Containers (2021)

https://msucharski.eu/posts/application-isolation-nixos-containers/
1•kugurerdem•15m ago•0 comments

How Peter Thiel's Relationship with Eliezer Yudkowsky Launched the AI Revolution

https://www.wired.com/story/book-excerpt-the-optimist-open-ai-sam-altman/
1•jandrewrogers•22m ago•0 comments

Creating Art with Code (2020)

https://medium.com/@mauralian/coding-art-312efa2020fd
1•airstrike•23m ago•0 comments

The Laser Revolution Part I: Megawatt beams to the skies

http://toughsf.blogspot.com/2025/05/the-laser-revolution-part-i-megawatt.html
1•EA-3167•23m ago•0 comments

Opus Dei: The Handmaid's School

https://buenosairesherald.com/society/opus-dei-the-handmaids-school
2•Anon84•25m ago•0 comments

Google Acquired GalileoAI

https://twitter.com/arnaudai/status/1924942577545195982
1•nallatamby•25m ago•0 comments

Show HN: I made a pretty cheap marketing breakthrough

https://smarketly.lema-lema.com/
2•abilafredkb•26m ago•0 comments

Ask HN: Frustrated with Health-Data Silos?

2•iaftb•28m ago•0 comments

Google Unveils A.I. Chatbot, Signaling a New Era for Search

https://www.nytimes.com/2025/05/20/technology/personaltech/google-ai-mode-search.html
2•breadwinner•29m ago•0 comments

Login to any user account using other Facebook app access token

https://hackerone.com/reports/101977
2•gilsonconte•30m ago•1 comments

The Napkin Project

https://web.evanchen.cc/napkin.html
3•luu•33m ago•0 comments

Violence on TV: what happens to children who watch?

https://nouvelles.umontreal.ca/en/article/2025/01/20/violence-on-tv-what-happens-to-children-who-watch/
2•gnabgib•34m ago•0 comments

You Are So Not Ready for This ChatGPT Prompt but You Need It

https://medium.com/readers-club/you-are-so-not-ready-for-this-prompt-but-you-need-it-c22391606c2b
2•stevenjgarner•35m ago•1 comments

Biotech companies I wish existed

https://blog.eladgil.com/p/biotech-companies-i-wish-existed
1•todsacerdoti•37m ago•0 comments

Microsoft's Attempted Merger with Intuit

https://dfarq.homeip.net/microsofts-attempted-merger-with-intuit/
2•rbanffy•38m ago•0 comments

Adguard Mail

https://adguard-mail.com/en/welcome.html
1•dotmanish•38m ago•0 comments

"Safe" YAML Monster

https://gist.github.com/taramtrampam/fca4e599992909b48a3ba1ce69e215a2
2•birdculture•41m ago•0 comments

Google Announces Smart Glasses Partnerships [video]

https://www.youtube.com/watch?v=RvMWLYRCj6s
1•handfuloflight•43m ago•0 comments

The Golden Age of computer user groups

https://arstechnica.com/information-technology/2020/08/the-golden-age-of-computer-user-groups/
6•rbanffy•43m ago•0 comments

Lex Fridman twisted my arm into using Cursor and Lovable – welcome Åndra

https://www.aandra.it.com/
2•paaloeye•48m ago•0 comments

Computex 2025: Intel Arc Pro B-Series – By George Cozma

https://chipsandcheese.com/p/computex-2025-intel-arc-pro-b-series
2•rbanffy•48m ago•0 comments

Multiple systems to estimate the number of unattributed paintings by Modigliani

https://link.springer.com/article/10.1007/s10260-024-00774-w
1•bookofjoe•51m ago•0 comments

Subprocess.run (Domain)

https://docs.python.org/3/library/subprocess.html
1•uonr•52m ago•0 comments

You can try Imagen 4 at Krea

https://www.krea.ai/image
2•dvrp•56m ago•1 comments

Google AI Ultra

https://blog.google/products/google-one/google-ai-ultra/
213•mfiguiere•5h ago

Comments

codydkdc•4h ago
I really want Google to launch a Claude Code/OpenAI Codex CLI alternative. Also, if they included a small VM in one of these subscriptions I'd seriously consider it!
boole1854•4h ago
They are working on it: https://jules.google/
unshavedyak•3h ago
I got the feeling Jules was targeted at web (a la GitHub) PR workflows. Is it not?

The Claude Code UX is nice imo, but I didn't get the impression Jules is that.

kridsdale1•3h ago
At Google, our PR flow and editing is all done in web based tools. Except for the nerds who like vi.
codydkdc•3h ago
People don't use local editors? It's weird to lock people into workflows like that.
johnisgood•1h ago
Damn... you guys don't use proper text editors?
incognito124•3h ago
https://firebase.studio/
bn-l•37m ago
It’s absolutely garbage. I was annoyed because there’s a lot of hype on Reddit.
johnmlussier•4h ago
I’m not interested in about 75% of this offering. Really wish they had pieced this out or done a credit system for it all. I want higher coding and LLM limits, Deep Think, and would like to try agent stuff, but don’t care at all about image or video generation, Notebook limits, YouTube Premium, or more storage.

$250 is the highest-cost AI sub now. Not loving this direction.

camillomiller•4h ago
What other direction would you expect to be possible? Even at these rates most AI companies are still bleeding money.
piskov•4h ago
If anyone could be very, very inference-efficient, it's Google (given their custom TPUs and what have you).

However, if all this power is wasted on video generation, then even they will probably choke.

Then again, I guess your average Joe/Jane will looove to generate some 10 seconds of their daily WhatsApp stuff to share.

karmakurtisaani•4h ago
I wonder how long this free access to LLMs can continue. It's like the early days of Facebook, before the ads and political interference started to happen. The question is, when will we see the enshittification of LLMs?
camillomiller•3h ago
Exactly. When Masayoshi Son's money runs out, I guess.
Workaccount2•4h ago
I'm assuming there will be an à la carte API offering too.
causal•4h ago
Yeah, it says "designed for coding" but it's missing the one thing programmers need, which is just higher Gemini API usage limits.
_heimdall•4h ago
Building and running these models is expensive, there's no way around that in the near future.

LLM companies have just been eating the cost, hoping that people find them useful enough while drastically subsidized that they'll stay on the hook when prices actually have to cover the expense.

linsomniac•4h ago
At $125/mo for 3 months I'm tempted to try it, but I don't understand how it would interfere with my existing youtube/2TB storage family plan, and that's a big barrier for me.
paxys•2h ago
What's wrong with just using the API for that?
logicchains•2h ago
Deep Think isn't available on the API.
paxys•1h ago
Deep Think is only available on the API. It's just restricted to "trusted testers" right now before a wide launch.
lallysingh•4h ago
$250 a month. Oh my.
akomtu•4h ago
"Rent a brain for only $250/mo"
SoftTalker•3h ago
About an hour of a senior developer's fully-loaded cost.
paxys•2h ago
And less than an hour of an external consultant's time.
piskov•4h ago
Given the lack of comments an hour in, we have a strong case of maybe five Google AI Ultra subscribers worldwide.

I, personally, try to stay as far as possible from Google: Kagi for search, Brave for browsing (Firefox previously), Pro on OpenAI, etc.

We'll see how fair OpenAI will be with tracking and what have you (given the "improve the model for everyone" toggle can be set to off), but Google? Nah.

rohansood15•4h ago
"I think there is a world market for maybe five computers." -- Thomas Watson, chairman of IBM, 1943.
sunaookami•3h ago
This is an urban legend btw, Thomas Watson never said that.
__natty__•4h ago
> YouTube Premium: An individual YouTube Premium plan lets you watch YouTube and listen to YouTube Music ad-free, offline and in the background.

It seems weird to me that they included an entertainment service in a "work"-related plan.

jihadjihad•4h ago
They don't even have the decency to make it a family plan, either.
dewey•4h ago
It's not though, it's just the highest tier of the regular "Google One" account that also has Google Photos etc. included.
kumarm•4h ago
So everyone who wants YouTube Premium can explain to their boss why they need Gemini AI Ultra for work?
Keyframe•4h ago
Hmm, interesting. There's basically no information on what makes Ultra worth that much money in concrete terms, except "more quota". One interesting tidbit I've noticed is that it seems Google One (or whatever it is called now) also carries a sub for YouTube. So far, I'm still on the "old" Google One for my family's and my own storage and have a separate YouTube subscription for the same. I still haven't seen a clear upgrade path, or even a discount based on how much I have left from the old subscription, if I ever choose to do so (why?).

edit: also, the Google AI Ultra link leads to AI Pro and there's no Ultra to choose from. GG Google, as always with their "launches".

flakiness•4h ago
I believe Imagen 4 and Veo 3 (the newest image/video models) and the "deep think" variant are for Ultra only. (Is it worth it? It's a different question.)
skybrian•3h ago
I just tried it and Whisk seems to be using Imagen 4 and Veo 2 when used without a subscription.
ComplexSystems•4h ago
The problem with all of these is that SOTA models keep changing. I thought about getting OpenAI's Pro subscription, and then Gemini flew ahead and was free. If I get this then sooner or later OpenAI or Anthropic will be back on top.
SirensOfTitan•4h ago
This is even the case with Gemini:

The Gemini 2.5 Pro 05/06 release, by Google's own reported benchmarks, was worse in 10/12 cases than the 3/25 version. Google rerouted all traffic for the 3/25 checkpoint to the 05/06 version in the API.

I’m also unsure who needs all of these expanded quotas because the old Gemini subscription had higher quotas than I could ever anticipate using.

magicalist•3h ago
> I’m also unsure who needs all of these expanded quotas because the old Gemini subscription had higher quotas than I could ever anticipate using.

"Google AI Ultra" is a consumer offering though, there's no API to have quotas for?

airstrike•4h ago
This 100%. Unless you are building a product around the latest models and absolutely must squeeze the latest available oomph, it's more advantageous to just wait a little bit.
pc86•3h ago
I am willing to pay for up to 2 models at a time but I am constantly swapping subscriptions around. I think I've started and cancelled GPT and Claude subscriptions at least 3-4 times each.
xnx•3h ago
> If I get this then sooner or later OpenAI or Anthropic will be back on top.

The Gemini subscription is monthly, so not too much lock-in if you want to change later.

Ancapistani•3h ago
I wonder if there's an opportunity here to abstract away these subscription costs and offer a consistent interface and experience?

For example - what if someone were to start a company around a fork of LiteLLM? https://litellm.ai/

LiteLLM, out of the box, lets you create a number of virtual API keys. Each key can be assigned to a user or a team, and can be granted access to one or more models (and their associated keys). Models are configured globally, but can have an arbitrary number of "real" and "virtual" keys.

Then you could sell access to a host of primary providers - OpenAI, Google, Anthropic, Groq, Grok, etc. - through a single API endpoint and key. Users could switch between them by changing a line in a config file or choosing a model from a dropdown, depending on their interface.

Assuming you're able to build a reasonable userbase, presumably you could then contract directly with providers for wholesale API usage. Pricing would be tricky, as part of your value prop would be abstracting away marginal costs, but I strongly suspect that very few people are actually consuming the full API quotas on these $200+ plans. Those that are are likely to be working directly with the providers to reduce both cost and latency, too.

The other value you could offer is consistency. Your engineering team's core mission would be providing a consistent wrapper for all of these models - translating between OpenAI-compatible, Llama-style, and Claude-style APIs on the fly.

Is there already a company doing this? If not, do you think this is a good or bad idea?
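
To make the idea concrete, here's a minimal sketch of the routing layer I have in mind. Everything in it (the provider map, key names, the Google endpoint) is hypothetical and is not LiteLLM's actual API; the real engineering work is the request-shape translation elided here.

```python
# Hypothetical multi-provider router: virtual keys map users/teams to the
# models they may use, and requests go out in an OpenAI-compatible shape.
# Provider details beyond OpenAI's documented /chat/completions are assumptions.
import requests

PROVIDERS = {
    "openai": {"base_url": "https://api.openai.com/v1", "api_key": "sk-..."},
    "google": {"base_url": "https://example.invalid/gemini-openai-compat", "api_key": "..."},
}

MODEL_TO_PROVIDER = {"gpt-4o": "openai", "gemini-2.5-pro": "google"}

# Virtual keys issued by the wrapper service, each scoped to certain models.
VIRTUAL_KEYS = {"vk-team-frontend": {"allowed_models": {"gpt-4o", "gemini-2.5-pro"}}}

def complete(virtual_key: str, model: str, messages: list) -> str:
    """Route one chat request to whichever upstream provider hosts `model`."""
    if model not in VIRTUAL_KEYS[virtual_key]["allowed_models"]:
        raise PermissionError(f"{virtual_key} may not use {model}")
    provider = PROVIDERS[MODEL_TO_PROVIDER[model]]
    resp = requests.post(
        f"{provider['base_url']}/chat/completions",
        headers={"Authorization": f"Bearer {provider['api_key']}"},
        json={"model": model, "messages": messages},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

# Switching providers is then a one-word change for the caller:
# complete("vk-team-frontend", "gemini-2.5-pro", [{"role": "user", "content": "hi"}])
```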

planetpluta•3h ago
I think the biggest hurdle would be complying with the TOS. I imagine that OpenAI etc. would not be fans of sharing quotas across individuals in this way.
Ancapistani•2h ago
How does it differ from pretty much every SaaS app that's using OpenAI today?
wild_egg•3h ago
Isn't that https://openrouter.ai? Or do you have something different in mind?
Ancapistani•2h ago
I haven't seen this, but it looks like it solves at least half of what I was thinking.

I'll investigate. Thanks!

mrnzc•2h ago
I think what Langdock (YC-backed, https://www.langdock.com) offers might match your proposal?!
Ancapistani•2h ago
Looks like this is at least the unified provider. I'll dig in - thanks :)
UncleOxidant•2h ago
You can just surf between Gemini, DeepSeek, Qwen, etc. using them for free. I can't see paying for any AI subscription at this point as the free models out there are quite good and are updated every few months (at least).
devjab•1h ago
I wonder why anyone would pay these days, unless it's for features outside of the chatbot. Between Claude, ChatGPT, Mistral, Gemini, Perplexity, Grok, DeepSeek and so on, how do you ever really run out of free "wannabe pro"?
smeeth•4h ago
I hate hate hate putting Deep Think behind a paywall. It's an ease-of-use tax. I fully expect to be able to get it over API through Poe or similar for way cheaper.

Just have usage limit tiers!

lvl155•4h ago
Google is a bit tone deaf with these offerings. Are they interested in competing?
vFunct•4h ago
Does this include Gemini 2.5 Pro API access? What are the API limits?

I blew through my $40 monthly fee in Github Copilot Pro+ in a few hours. =^/

Workaccount2•4h ago
I suspect Google sees the writing on the wall, and needs to move to a more subscription based business model. I don't think the ad model of the internet is dead, but I also don't think it was particularly successful. People block ads rather than forgo those services, ads conditioned people to think everything on the internet is free, and the actual monetization of ad views makes people pretty uncomfortable.

So here we are, with Google now wading into the waters of subscriptions. It's a good sign for those who are worried about AI manipulating them to buy things, and a bad sign for those who like the ad model.

Is the future going to be everyone has an AI plan, just like a phone plan, or internet plan, that they shell out $30-$300/mo to use?

I honestly would greatly prefer it if it meant privacy, but many people seem to greatly prefer the ad-model or ad-subsidized model.

ETA: Subscription with ads is ad-subsidized. You pay less but watch more ads.

aceazzameen•4h ago
It will eventually be subscriptions PLUS ads combined.
continuational•4h ago
YouTube is probably the most expensive streaming app, and there are still ads (sponsored sections) in nearly every video.
tintor•3h ago
Sponsored sections are baked into the video and very easy to skip.

Unlike platform ads, which disable video controls while the ad is playing.

jeffbee•3h ago
Cannot find any factual basis for the claim that YTP is the most expensive streaming app. Is this the case in some non-US market?
jerjerjer•3h ago
> ads (sponsored sections) in nearly every video.

SponsorBlock for YouTube resolves the issue.

conductr•3h ago
No idea if this is right, but based on AI-summary Google results, annual operating costs for YouTube are ~$3B-5B and Netflix's are ~$25B-30B. While YT probably spends most on CDN/bandwidth, its content is mostly free, whereas content is by far Netflix's largest expense.
iamdelirium•3h ago
By that metric, every streaming platform has ads since they serve movies with product placement.
ninininino•4h ago
They already do subs for YouTube w/o ads and for storage (email attachments + Google Photos + Google Drive), and did for Stadia while it was around.
jeffbee•4h ago
This doesn't seem like new territory ("wading in"). This is an extension of the existing Google One plans to reach people with extreme demands.
Etheryte•4h ago
I think this is a bit too rose-tinted. Being a paying customer doesn't necessarily mean you won't get ads; look at Netflix for a start. Their cheapest paid tier still gets ads. The subscription model will be an addition to the ad revenue, not a replacement.
ljm•3h ago
It should mean that though.

Ads are well and truly the cancer on the service industry.

It’s an outright abuse to force ads and then make you pay for the bandwidth of those ads on your own plan to render them.

pc86•3h ago
Anyone can say A should mean B, that doesn't mean it's obviously true.

Very few services still commercially viable today actually force ads - meaning there is no paid tier available that removes them entirely.

I don't particularly like ads but this idea that any advertisement at any point for any good or service is by definition a cancer is a fringe idea, and a pretty silly one at that.

myko•2h ago
Google used to let you pay a flat rate to avoid (most) of their ads. It was nice. This program was, of course, canceled.
MichaelZuo•3h ago
How does this relate to the parent?

I don't think there was a claim that nobody would ever offer a partially subscription, partially ad-funded service.

skarz•4h ago
I wonder how viable the referral link/referrer code method is? Based on my own YouTube viewing habits it seems like a lot of prominent channels have gone that route. Seems like it could work for the web overall. Ads would no longer have to target via cookies or browsing history because you could just serve links or offer codes related to your site's content.
abtinf•3h ago
> I also don't think [the ad model of google] was particularly successful.

Only on HN.

narrator•3h ago
Yeah, only a few trillion in revenue over the last decades including Facebook and others. Not particularly successful.
jonluca•3h ago
Actually hilarious, the distribution of comments on HN is truly bimodal
Workaccount2•3h ago
I mean it in the sense that I don't think it lived up to what Google envisioned. People have extremely hostile views towards ads but fully expect that everything is just an account creation away, if not outright given away.

30% of people who use Google don't view their ads. It's hard to call a business where 30% of people don't pay successful. The news agencies picked up on this years ago, and now it's all paywalls.

This doesn't even get into the downstream effects of needing to coax people into spending more time on the platform in order to view more ads.

tekla•1h ago
Maybe if you ignore objective reality.

Google ads revenue AND income has consistently risen basically forever. It's ~75% of Alphabet's total revenue and corresponds to over ~50% of all ad revenue in the world.

netsharc•3h ago
Heh, although I'm a cheapskate, the ad-based world is a fucked up one. We now have an attention-economy, trying to keep you hooked on the content so "the platform" can serve you ads and earn money off you. And they do that by serving content that engages you, and apparently it's content that stirs up a lot of emotions.

"Worried about refugees? Here's some videos about refugees being terrible". Replace "refugee" with "people celebrating Genocide", etc, etc...

add-sub-mul-div•3h ago
> Is the future going to be everyone has an AI plan, just like a phone plan, or internet plan, that they shell out $30-$300/mo to use?

Not the people who haven't been trained to require the crutch.

kleiba•4h ago
These prices are nuts, in my opinion. It basically means that only companies can afford access to the latest offerings - this used to be the case for specialist software in the past (e.g., in the medical sector), but AI has the potential to be useful for anyone.

Not a good development.

esafak•4h ago
And I think it is a good thing. If there are buyers, it means they are getting that much value out of it. That there is a market for it. Competition will bring prices down.
mschuster91•4h ago
> Competition will do its thing and bring prices down.

It won't. For now the AI "market" is artificially distorted by billionaires and trillion-dollar companies dumping insane amounts of cash into NVDA, but when the money spigot dries up (which it inevitably will) prices are going to skyrocket and stay there for a loooong time.

esafak•4h ago
How will prices skyrocket when there is a flood of open models? Or are you talking about GPU prices? They're already high.
jsheard•3h ago
Who do you think is paying to train those open models? The notable ones are all released by VC-funded startups or gigacorps which are losing staggering amounts of money to make each new release happen. If nobody is making a profit from closed models then what hope do the companies releasing open models have when the money spigot runs dry?

The open models which have already been released can't be taken back now of course, but it would be foolish to assume that SOTA freebies will keep coming forever.

conductr•3h ago
It won't be the end of the world if the 'progress' were to slow down a little, I have trouble keeping up with what's available as it is - much less tinkering with it all
delusional•2h ago
It will because "keeping up" is the sleight of hand. By constantly tweaking the model you don't ever notice anything it's consistently wrong about. If they "slowed progress" you'd notice.

Current AI is Fast Fashion for computer people.

johnisgood•3h ago
I do not think I will ever be able to afford hardware that is capable of running local LLMs. :(

What I can afford right now is literally the ~20 EUR / month claude.ai pro subscription, and it works quite well for me.

mschuster91•2h ago
> How will prices skyrocket when there is a flood of open models?

Easy: once the money spigot runs out and/or a proprietary model has a quality/feature set that open-weight models can't match, it's game over. The open-weight models probably cost dozens of millions of dollars to train; this is not sustainable.

And that's just training cost - inference costs are also massively subsidized by the money spigot, so the price for end users will go up from that alone as well.

msikora•3h ago
ChatGPT is insanely subsidized. The $20/month sub is such a great value. Just the image gen is about $0.25 a pop through the API. That's 80 image generations for $20.
jeffbee•3h ago
> used to be the case for specialist software

I think that's a great example of how a competitive market drives these costs to zero. When solid modeling software was new Pro/ENGINEER cost ~$100k/year. Today the much more capable PTC Creo costs $3-$30k depending on the features you want and SOLIDWORKS has full features down to $220/month or $10/month for non-professionals.

gigaflop•2h ago
Off-topic, but I work 'around' PTC software, and am surprised to see them mentioned. Got much knowledge in the area?

On-topic, yeah. PTC sells "Please Call Us" software that, in Windchill's example, is big and chunky enough to where people keep service contracts in place for the stuff. But, the cost is justifiable to companies when the Windchill software can "Just Do PLM", and make their job of designing real, physical products so much more effective, relative to not having PLM.

Aurornis•3h ago
> It basically means that only companies can afford access to the latest offerings

The $20/month plan provides similar access. They hint that in the future the most intense reasoning models will be in the Ultra plan (at least at first). Paying more for the most intense models shouldn't be surprising.

There's plenty of affordable LLM access out there.

Calwestjobs•4h ago
Magic!

I don't know what the hate about $250 is; Flow alone is worth more.

leoh•3h ago
I would agree, were I to use Flow frequently; but I would guess it's the most operationally expensive API for Google, and they may be subsidizing it (and profit in general) via users who don't use it (i.e. software developers).
AJRF•4h ago
Would love to know how many people end up on this plan.

If I had to guess from the features, I would have said 80 bucks. Absurdly high, but the little doodads and prototypes would make it understandable at that price.

250?!

I actually find that price worrying because it points to a degree of unsustainability in the economics of the products we've gotten used to.

jeffbee•4h ago
The long-standing Google One plan with 30TB of storage was already $150/mo, so your estimate was a bit low.
blagie•4h ago
I'm holding out for "Ultra Max Pro."

(Comment is on the horrible naming; good naming schemes plan ahead for next month's offerings)

kylehotchkiss•3h ago
Somehow, someway you’re gonna need a Dell or Apple Ultra Edition to use it.
ivape•4h ago
Is this the only price point to get Google Flow at? Any alternatives? That seems like the killer app here.
adverbly•3h ago
Price point here is a bit too high... They have bundled so many things together into this that the sticker shock on the price is too much.

I get what they're trying to do but if they were serious about this they would include some other small subscriptions as well... I should get some number of free movies on YouTube per month, I should be able to cancel a bunch of my other subscriptions... I should get free data with this or a free phone or something... I could see some value if I could actually just have one subscription but I'm not going to spend $250 a month on just another subscription to add to the pile...

ehsankia•3h ago
They put in everything that makes sense. I don't know if including random movies makes sense.

They've got YouTube Premium, which is like $15. 30TB of storage is a bit excessive and has no direct equivalent, but 20TB is around $100 a month.

highwaylights•3h ago
I’m not seeing the relevance of YouTube and the One services to this at all.

I get that Big Tech loves to try to pull you into their orbit whenever you use one of their services, but this risks alienating customers who won’t use those unrelated services and may begrudge Google making them pay for them.

j_maffe•3h ago
Idk if anyone will see these offerings as more than just an added bonus, especially when you compare to OAI, which asks for more for only the AI models.
OJFord•1h ago
It's trying to normalise it, make it just another part of your Google experience, alongside (and integrated with) your other Google tools. (Though that's weakened a bit by calling it 'AI Pro/Ultra' imo.)
bezier-curve•3h ago
For $250/mo I would hope it includes API access to Gemini 2.5 pro, but it's nice to want things.
highwaylights•3h ago
I can’t see a way that anyone would be able to give uncapped access to these models for a fixed price (unless you mean it’s scoped to your own use and not for company use? Even then, that’s still a risk to the provider.)
bezier-curve•2h ago
I use Msty a lot for personal use. I like its ability to fork conversations. Seems like a simple feature but even ChatGPT's UI, which everyone has tried to copy, is fairly limited by comparison.
pc86•3h ago
As a consumer it seems to me the low hanging fruit for these super-premium offerings is some substantial amount of API credits included every month. Back when API credits were a relatively new thing I used LLMs infrequently enough I just paid $5-10/mo for API credits and used a desktop UI to talk to ChatGPT.

Now they want $200, $250/mo which is borderline offensive, and you have to pay for any API use on top of that?

Aurornis•3h ago
Putting API use into the monthly plans doesn't make a lot of business sense. The only people who would sign up specifically to use API requests on a monthly plan would be looking to have a lower overall bill, then they'd pay-per-request after that. It would be a net loss.
mrbluecoat•3h ago
Amusing they picked a deadly jellyfish attacking earth as their hero image
sharpshadow•3h ago
Technically, more than one person could use this subscription if it's used through the same device. That could also make it available to users in not-yet-supported countries.
aylmao•3h ago
I can't decide how I feel about Google's design for this looking so Apple-y.

Didn't they just release Material Design Expressive a few days ago [1]? Instead of bold shapes, bold fonts and solid colors, it's gradients, simple lines, frosted glass, and a single, clean, sans-serif font here. The bento-box slides look quite Apple-y too [2]. Switch the Google Sans for SF Pro, pull back on the border radius a bit, and you've essentially got the Apple look. It does look great though.

[1]: https://news.ycombinator.com/item?id=43975352

[2]: https://blog.google/products/gemini/gemini-app-updates-io-20...

GuinansEyebrows•3h ago
it makes sense if you believe that Google has zero business interest in UI/UX.

they've learned that they can shovel out pretty much anything and as long as they don't directly charge the end-user and they're able to put ads on it (or otherwise monetize it against the interest of the end user), they just don't care.

they've been criticized for years and years over their lack of standardization and relatively poorly-informed design choices especially when compared with Apple's HIG.

solomatov•3h ago
Is there a premium option to control which of your data will be used for training? Or is it implemented the same way as Gemini Pro?
rudedogg•3h ago
If anyone at Google cares: I'd pay for Gemini Pro (not this new $200+ ridiculousness) if, in that case, they didn't train on my chats. I actually would like to subscribe.
j_maffe•3h ago
There's already an option for that. The downside is you can't access your chat history.
buildfocus•2h ago
And lots of other features don't work, particularly external integrations. Gemini on Android refuses to do basic things like set a timer unless chat history is enabled. It is the one key feature I really want to pay extra to get, and that preference goes x2 when the AI provider is Google.
submeta•3h ago
Google Ultra: USD 250. Claude Pro: 218 EUR. ChatGPT Pro: 220 EUR.

Not included: Perplexity, Openrouter, Cursor, etc

Wow. You gotta have lots of disposable income.

thenaturalist•3h ago
There are enough people who do. ;)

And from a business perspective, this is enabling everyone from solo freelancers to mid-level managers and others to get work done for a fraction of the time and cost required to outsource it to humans.

Not that I am personally in favor of this, but I can very much see the economics in these offerings.

loudmax•3h ago
The target market for these offerings is corporations, or self-employed developers. If these tools really do make your developers more productive, $250 a month is easily justifiable. That's only $3,000 per year, a fraction of the cost of a full-time developer.

Obviously, the benefit is contingent on whether or not the models actually make your developers more productive.

catigula•2h ago
Do any of these companies actually sell their products as developer replacements?
mwigdahl•2h ago
No, but they do sell them as developer augmentations.
catigula•33m ago
Interesting because that post was comparing the cost directly to a developer salary.
6510•2h ago
I hear it doesn't have to be productive right now, if you have deep pockets it is worth being familiar with the tools even if it is just in case.
mattfrommars•2h ago
$250 a month is still a lot of money in India to spend on a digital product.
kkarakk•2h ago
AI has completely eliminated salary-adjusted pricing in software. No discounts for the third world in anything.
ivm•2h ago
Yup, I was paying $225/mo for three Unity3D subscriptions (basic+iOS+Android) ten years ago, while earning less than $4k/mo – just considered it part of my self-employed expenses.
eastbound•3h ago
The goal is to capture all your disposable AI income, so they can starve the competitors. The goal is, as long as you subscribe to several, to increase the price.

And you haven't hit the price that stings yet.

ZeroTalent•3h ago
That $250/month can make you $20k/month if you do some automation and subjectively unethical things.
lazharichir•3h ago
Like taking on a gazillion contract jobs?
mirkodrummer•2h ago
Apart from the fact that clients usually are not stupid, you still need to understand requirements to guide the AI in a plausibly right direction. I don't usually understand my boss's tickets at first look; very often we need to discuss them. I doubt an AI could, despite the hype.
add-sub-mul-div•3h ago
And this is still early days, the pre-enshittification era.
paxys•2h ago
This isn't a luxury purchase. If you aren't able to increase your income by $250/mo using these tools or otherwise get $250/mo worth of value out of them then you shouldn't sign up.
i_love_retros•3h ago
This is all a bit silly
qweiopqweiop•3h ago
Am I the only one getting the AI fatigue?
hammock•3h ago
Don’t worry, everyone is. That is why they are switching to quantum soon
charles_f•3h ago
Quantum AI on the blockchain, now in Rust.
spookie•3h ago
With data analytics visualized in AR
tsujamin•3h ago
Obviously not speaking for others experience, but it all makes me feel pretty fatigued, and as if this growing expectation of "AI-enhanced productivity" is coming at the expense of a craft and process (writing software) that I enjoy.
zhivota•3h ago
Ok so I have Google One AI or whatever the previous version of this is called, and what's wild to me is that in Google Sheets, if I ask it to do anything, literally anything, it says it can't do it. The only thing it can do is read a sheet's data and talk about it. It can't help you form formulas, add data to the sheet, organize things, anything, as far as I've seen.

How does Google have the best models according to benchmarks but it can't do anything useful with them? Sheets with AI assist on things like pivot tables would be absolutely incredible.

noosphr•3h ago
>How does Google have the best models according to benchmarks but it can't do anything useful with them?

KPI driven development with no interest in killing their cash cow.

These are the people who sat on transformers for 5 years because they were too afraid they would eat their core business, e.g. BERT.

One need only look at what Bell Labs did to magnetic storage to realize that a monopoly isn't good for research. In short: we could have had mass magnetic storage in the 1920s/30s instead of the 50s/60s.

A pop sci article about it: https://gizmodo.com/how-ma-bell-shelved-the-future-for-60-ye...

catigula•3h ago
I mean there's an article in Fortune magazine about the people pushing transformer "research" building doomsday bunkers.

Making Google look like the mature person in the room is a tall order but it seems to have been filled.

wrs•2h ago
The AI assistant in Sheets doesn’t understand how even the basic features work. When it’s not saying it lacks basic knowledge, it hallucinates controls that don’t exist. Why even bother having it there?
zb3•3h ago
Good good, please buy it so I can continue using Gemini for free.
retep_kram•3h ago
Given that the AI scene is not stable at all at the moment (every day brings a new release that makes last month's obsolete), any offer that tries to lock you in with a model or model provider is a bad idea.

Pay-per-use for the moment, until market consolidation and/or commoditization.

Aurornis•3h ago
> any offer that tries to lock you with a model or model provider is a bad idea.

It's a monthly plan that you can cancel at any time. Not really locking in.

sandspar•2h ago
30TB of Google storage is a soft lock-in. If you fill it up you're kinda stuck.
charles_f•3h ago
This is the kind of pricing that I expect most AI companies are gonna try to push for, and it might get even more expensive with time. When you see the delta between what's currently being burnt by OpenAI and what they bring home, the sweet spot is going to be hard to find.

Whether you find that you get $250 worth out of that subscription is going to be the big question

Ancapistani•3h ago
I agree, and the problem is that "value" != "utilization".

It costs the provider the same whether the user is asking for advice on changing a recipe or building a comprehensive project plan for a major software product - but the latter provides much more value than the former.

How can you extract an optimal price from the high-value use cases without making it prohibitively expensive for the low-value ones?

Worse, the "low-value" use cases likely influence public perception a great deal. If you drive the general public off your platform in an attempt to extract value from the professionals, your platform may never grow to the point that the professionals hear about it in the first place.

garrickvanburen•2h ago
This is the problem Google search originally had.

They successfully solved it with advertising... and they also had the ability to cache results.

mysterydip•47m ago
Do LLMs cache results now? I assume a lot of the same questions get asked, although the answer could depend on previous conversational context.
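
As far as I know, the big providers cache prompt prefixes (reusing the KV cache) to cut cost rather than caching whole answers, precisely because the answer depends on the full conversation. A toy sketch of what exact-match response caching at a proxy layer would look like (`call_model` is a hypothetical stand-in for the real API call):

```python
import hashlib
import json

def call_model(model: str, messages: list) -> str:
    # Hypothetical stand-in for the real (expensive) provider call.
    return f"[answer from {model}]"

_CACHE: dict[str, str] = {}

def fingerprint(model: str, messages: list) -> str:
    # Key over the model and the entire conversation so far, since the answer
    # depends on previous conversational context, not just the last question.
    payload = json.dumps([model, messages], sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def cached_completion(model: str, messages: list) -> str:
    key = fingerprint(model, messages)
    if key not in _CACHE:
        _CACHE[key] = call_model(model, messages)
    return _CACHE[key]
```

Exact-match hits would only cover identical conversations, which is why prefix caching is the more useful mechanism in practice.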
jsheard•2h ago
I wonder who will be the first to bite the bullet and try charging different rates for LLM inference depending on whether it's for commercial purposes. Enforcement would be a nightmare but they'd probably try to throw AI at that as well, successfully or not.
chis•1h ago
I think there are always creative ways to differentiate the two tiers for those who care.

“Free tier users relinquish all rights to their (anonymized) queries, which may be used for training purposes. Enterprise tier, for $200/mo, guarantees queries can only be seen by the user”

emzo•1h ago
This would be great for open source projects
jfrbfbreudh•1h ago
This is what Google currently does for access to their top models.

AI Studio (web UI, free, will train on your data) vs API (won’t train on your data).

typewithrhythm•1h ago
Value-capture pricing is a fantasy often spouted by salesmen; current-era AI systems have limited differentiation, so the final price will trend towards the cost to run the system.

So far I have not been convinced that any particular platform is more than 3 months ahead of the competition.

bryanlarsen•15m ago
OpenAI claims their $200/month plan is not profitable. So this is cost level pricing, not value capture level pricing.
morkalork•2h ago
Costs more than seats for Office 365, Salesforce and many productivity tools. I don't see management gleefully running to give access to whole departments. But then again, if you could drop headcount by just 1 on a team by giving it to the rest, you probably come out ahead.
EasyMark•2h ago
I feel prices will come down a lot for "viable" AI, not everyone needs the latest and greatest at rock-bottom prices. Assuming AGI is just a pipe-dream with LLMs as I suspect.
Wowfunhappy•2h ago
> When you see the delta between what's currently being burnt by OpenAI and what they bring home, the sweet point is going to be hard to find.

Moore's law should help as well, shouldn't it? GPUs will keep getting cheaper.

Unless the models also get more GPU hungry, but 2025-level performance, at least, shouldn't get more expensive.

dvt•2h ago
> Moore's law should help as well, shouldn't it? GPUs will keep getting cheaper.

Maybe I'm misremembering, but I thought Moore's law doesn't apply to GPUs?

godelski•2h ago
Not necessarily. The prevailing paradigm is that performance scales with size (of data and compute power).

Of course, this is observably false as we have a long list of smaller models that require fewer resources to train and/or deploy with equal or better performance than larger ones. That's without using distillation, reduced precision/quantization, pruning, or similar techniques[0].

The real thing we need is more investment into reducing computational resources to train and deploy models and to do model optimization (best example being Llama CPP). I can tell you from personal experience that there is much lower interest in this type of research and I've seen plenty of works rejected because "why train a small model when you can just tune a large one?" or "does this scale?"[1] I'd also argue that this is important because there's not infinite data nor compute.

[0] https://arxiv.org/abs/2407.05694

[1] Those works will out perform the larger models. The question is good, but this creates a barrier to funding. Costs a lot to test at scale, you can't get funding if you don't have good evidence, and it often won't be considered evidence if it isn't published. There's always more questions, every work is limited, but smaller compute works have higher bars than big compute works.

jorvi•1h ago
Small models will get really hot once they start hitting good accuracy & speed on 16GB phones and laptops.
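
Rough weight-only arithmetic for that 16GB threshold (my assumptions: weights dominate memory; KV cache, activations, and OS overhead are ignored):

```python
def model_memory_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weight-only memory footprint in GB."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for bits in (16, 8, 4):
    print(f"8B model @ {bits}-bit: ~{model_memory_gb(8, bits):.0f} GB")
# ~16 GB at fp16, ~8 GB at int8, ~4 GB at int4 -- only the quantized
# versions leave real headroom on a 16GB phone or laptop.
```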
godelski•37m ago
Much of this already exists. But if you're expecting identical performance as the giant models, well that's a moving goalpost.

The paper I linked explicitly mentions how Falcon 180B is outperformed by Llama-3 8B. You can find plenty of similar cases all over the lmarena leaderboard. This year's small model is better than last year's big model. But the Overton window shifts. GPT-3 was going to replace everyone. Then 3.5 came out and GPT-3 was shit. Then o1 came out and 3.5 is garbage.

What is "good accuracy" is not a fixed metric. If you want to move this to the domain of classification, detection, and segmentation, the same applies. I've had multiple papers rejected where our model with <10% of the parameters of a large model matches performance (obviously this is much faster too).

But yeah, there are diminishing returns with scale. And I suspect you're right that these small models will become more popular when those limits hit harder. But I think one of the critical things that prevents us from progressing faster is that we evaluate research as if they are products. Methods that work for classification very likely work for detection, segmentation, and even generation. But this won't always be tested because frankly, the people usually working on model efficiency have far fewer computational resources themselves. Necessitating that they run fewer experiments. This is fine if you're not evaluating a product, but you end up reinventing techniques when you are.

kllrnohj•37m ago
> GPUs will keep getting cheaper. [...] but 2025-level performance, at least, shouldn't get more expensive.

This generation of GPUs has worse performance for more $$$ than the previous generation. At best, $/perf has been a flat line for the past few generations. Given what fab realities are nowadays, along with what works best for GPUs (the bigger the die the better), it doesn't seem likely that there will be any price scaling in the near future. Not unless there's some drastic change in fabrication prices from somewhere.

fellowniusmonk•3h ago
As someone who grew up very poor and first got access to email via Juno's free ad-based email client:

I've seen so many people over the years just absolutely shit on ad based models.

But ad based models are probably the least regressive approach to commercial offerings that we've seen work in the wild.

I love ads. If you are smart you don't have to see them. If you are poor and smart you get free services without ads so you don't fall behind.

I notice that there are no free open-source providers of LLM services at this point; it's almost as if services that have high compute costs have to be paid for SOMEHOW.

Hopefully we get a Juno for LLM soon so that whole cycle can start again.

leoh•3h ago
Ads have really harmed our society imo, despite having some advantages as you mention
danesparza•3h ago
I don't mean to be snarky, but is this announcement timed just so they take press away from the Microsoft Copilot announcement?
jonas21•3h ago
Today was the Google I/O keynote. The date was set months in advance.
Analemma_•3h ago
It’s exactly the opposite: the date for I/O was fixed months ago, Microsoft made their announcement to try and take press away from Google.
Ancapistani•3h ago
I've toyed with Gemini 2.5 briefly and was impressed... but I just can't bring myself to see Google as an option as an inference provider. I don't trust them.

Actually, that's not true. I do trust them - I trust them to collect as much data as possible and to exploit those data to the greatest extent they can.

I'm deep enough into AI that what I really want is a personal RAG service that exposes itself to an arbitrary model at runtime. I'd prefer to run inference locally, but that's not yet practical for what I want it to do, so I use privacy-oriented services like Venice.ai where I can. When there's no other reasonable alternative I'll use Anthropic or OpenAI.

I don't trust any of the big providers, but I'm realizing that I have baseline hostility toward Google in 2025.
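
A toy sketch of the "personal RAG in front of an arbitrary model" idea: the index stays local, and only the retrieved snippets plus the question get handed to whichever chat backend I decide to trust at runtime. The hashing "embedding" below is just a placeholder so the sketch runs; a real setup would use a local embedding model.

```python
import numpy as np

def toy_embed(text: str, dim: int = 256) -> np.ndarray:
    """Placeholder hashing embedding; swap in a local embedding model for real use."""
    v = np.zeros(dim)
    for token in text.lower().split():
        v[hash(token) % dim] += 1.0
    norm = np.linalg.norm(v)
    return v / norm if norm else v

# Personal corpus, embedded once and kept entirely on local disk/memory.
DOCUMENTS = ["notes from the contract negotiation", "draft blog post on RAG", "home lab inventory"]
INDEX = [(doc, toy_embed(doc)) for doc in DOCUMENTS]

def retrieve(query: str, k: int = 2) -> list[str]:
    qv = toy_embed(query)
    ranked = sorted(INDEX, key=lambda pair: float(pair[1] @ qv), reverse=True)
    return [doc for doc, _ in ranked[:k]]

def ask(query: str, chat_fn) -> str:
    """chat_fn is whichever backend is trusted at runtime: local model, Venice, Anthropic, ..."""
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this personal context:\n{context}\n\nQuestion: {query}"
    return chat_fn(prompt)
```

The point is the trust boundary: the corpus and the retrieval step never leave the machine, and the remote model only ever sees what `retrieve` surfaces.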

nowittyusername•3h ago
Understanding that no outside provider is going to care about your privacy and will always choose to try and sell you their crappy advertisements and push their agenda on you is the first step in building a solution. In my opinion that solution will come in the form of a personalized local ai agent which is the gatekeeper of all information the user receives and sends to the outside world. A fully context aware agent that has the users interests in mind and so only provides user agreed context to other ai systems and also filters all information coming to the user from spam, agenda manipulation, etc... Basically a very advanced spam blocker of the future that is 100% local and fully user controlled and calibrated. I think we should all be either working on something like this if we want to keep our sanity in this brave new world.
Ancapistani•2h ago
Exactly.

To be clear, I don't trust Venice either. It just seems less likely to me that they would both lie about their collection practices and be able to deeply exploit the data.

I definitely want locally-managed data at the very least.

CSMastermind•3h ago
I pay for OpenAI Pro but this is a clear no for me. I just don't get enough value out of Gemini to justify a bump from $20 / month to $250.

If they really want to win they should undercut OpenAI and convince people to switch. For $100 / month I'd downgrade my OpenAI Pro subscription and switch to Gemini Ultra.

radicality•2h ago
It does look like it comes with a few other perks that would normally cost a bunch too, specifically, 30TB of Google drive storage
airstrike•1h ago
Yeah, no, thanks for the cross-sell but I'm not interested.
J_Shelby_J•2h ago
If they really want to win, they should make a competitor for o1-pro. It's worth $200 to reduce LLM babysitting needs by 10%.
mvdtnz•1h ago
Perhaps they're not interested in beating OpenAI in the business of selling $1 for $0.50.
CSMastermind•49m ago
Sure but failure to capture marketshare now could easily snowball into larger failure of the business later.
hackrmn•3h ago
Ok, this reminds me of that Black Mirror episode,

*spoilers ahead*

where the lady had a fatal tumor cut out in an emergency procedure, only for it to be replaced by a synthetic neural network used by a cloud service with a multi-tier subscription model where even the basic features are "conveniently" shoved into a paying tier, up until the point she's on life support after being unable to afford even the basic subscription.

Life imitates art.

j_maffe•3h ago
wdym life imitates art? This is exactly what the episode was about lol
johnisgood•3h ago
Art imitates life!
OtherShrezzing•3h ago
The global average salary is somewhere in the region of $1500.

There are lots of people and companies out there with $250 to spend on these subscriptions per seat, but on a global scale (where Google operates), these are pretty niche markets being targeted. That doesn't align well with the multiple trillions of dollars in increased market cap we've seen over the last few years at Google, Nvidia, MS, etc.

paxys•2h ago
New technology always starts off available to the elite and then slowly makes its way down to everyone. AI is no different.
dimitrios1•2h ago
This is one of those assumed truisms that turns out to be false upon close scrutiny, and there's a bit of survivorship bias in the sense that we tend to look at the technologies that had mass appeal and the market forces to make them cheaper and available to all. But there's tons of new tech that's effectively unobtainable to the vast majority of populations, heck, even nation states. With the current prohibitive costs (in terms of processing power, energy costs, data center costs) to train these next-generation models, and the walled gardens that have been erected, there's no reason to believe the good stuff is going to get cheaper anytime soon, in my opinion.
paxys•2h ago
> turns out to be false upon close scrutiny

Care to share that scrutiny?

Computers, internet, cell phones, smartphones, cameras, long distance communication, GPS, televisions, radios, refrigerators, cars, air travel, light bulbs, guns, books. Go back as far as you want and this still holds true. You think the the majority of the planet could afford any of these on day 1?

kkarakk•2h ago
The point is not that AI services will be affordable "eventually"; it's that the advantage is so crazy that people who don't have access to them will NEVER be able to catch up. First AI wrappers disrupt industries -> developing nations can't compete because the services are priced prohibitively -> AI wrappers take over even more -> automation disrupts the need for anyone -> developing nations never develop further. This seems more and more likely, not less. Cutting-edge GPUs, for example, are already going into the stratosphere pricing-wise and are additionally being sanctioned off.
tekla•2h ago
How is this different from literally all of human history
hombre_fatal•1h ago
It seems you're suggesting that once you start this process of building tech on top of tech, then you get far ahead of everyone because they all have to independently figure out all the intermediate steps. But in reality, don't they get to skip to the end?

e.g. Nations who developed internet infrastructure later got to skip copper cables and go straight to optical tech while US is still left with old first-mover infrastructure.

AI doesn't seem unique.

sxg•1h ago
I disagree. There are massive fixed costs to developing LLMs that are best amortized over a massive number of users. So there's an incentive to make the cost as cheap as possible and LLMs more accessible to recoup those fixed costs.

Yes, there are also high variable costs involved, so there’s also a floor to how cheap they can get today. However, hardware will continue to get cheaper and more powerful while users can still massively benefit from the current generation of LLMs. So it is possible for these products to become overall cheaper and more accessible using low-end future hardware with current generation LLMs. I think Llama 4 running on a future RTX 7060 in 2029 could be served at a pretty low cost while still providing a ton of value for most users.

TulliusCicero•1h ago
Yeah, GP is overextending by saying it's always true.

The more basic assertion would be: something being expensive doesn't mean it can't be cheap later, as many popular and affordable consumer products today started out very expensive.

pier25•2h ago
Do you have a source for the $1500 number? Seems pretty high.
bradleybuda•2h ago
It's 6x that. The median is 2x: https://chatgpt.com/share/682ceb2a-b56c-800b-b49d-1a24c48709...
Aurornis•27m ago
> The global average salary is somewhere in the region of $1500.

The global average salary earner isn't doing a computer job that benefits from AI.

I don't understand the point of this comparison.

julianpye•2h ago
Why do people keep on saying that corporations will pay these price-tags? Most corporations really keep a very tight lid on their software license costs. A $250 license will be only provided for individuals with very high justification barriers and the resulting envy effects will be a horror for HR. I think it will be rather individuals who will be paying out of their pocket and boosting their internal results. And outside of those areas in California where apples cost $5 in the supermarket I don't see many individuals capable of paying these rates.
troupo•2h ago
Corps will likely negotiate bulk pricing and discounts, with extra layers of guarantees like "don't use and share our data" on top
bryanlarsen•2h ago
"AI will make us X% more productive. 100%-X% of you are fired, the rest get a $250/month license".
kulahan•1h ago
I don’t see any benefit to removing humans in order to achieve the exact same level of efficiency… wouldn’t that just straight-up guarantee a worse product unless your employees were absolutely all horrendous to begin with?
bryanlarsen•17m ago
It'll improve profit margins for a brief moment, long enough for the execs making the decision to cash out.
verdverm•2h ago
We just signed up to spend $60+/month for every dev to have access to Copilot because the ROI is there. If $250/month save several hours per month for a person, it makes financial sense
delusional•2h ago
We signed up for that too. Two quarters later, the will to pay is significantly lower.
tacker2000•2h ago
How are you measuring this? How do you know it is paying off?
afroboy•1h ago
And why hasn't the AI hype train worked on the gaming industry? Why hasn't it saved game devs hundreds of hours and gotten us the latest GTA any sooner?

I'm not sure it's correct to measure the benefits of AI by the lines of code we write rather than by how much faster we ship quality features.

Aurornis•28m ago
$60/month pays off if it saves even an hour of developer time over a month.

It's really not hard to save several hours of time over a month using AI tools. Even the Copilot autocomplete saves me several seconds here and there multiple times per hour.

julianpye•2h ago
Okay, but you're in a S/W team in a corp, where everyone's main task is to code. A coding agent has clear benefits here.

This is not the use case of AI Ultra.

Aurornis•30m ago
This isn't really out of line with many other SaaS licenses that companies pay for.

This also includes things like video and image generation, where certain departments might previously have been paying thousands of dollars for images or custom video. I can think of dozens of instances where a single Veo2/3 video clip would have been more than good enough to replace something we had to pay a lot of money and waste a lot of time acquiring previously.

You might be comparing this to one-off developer tool purchases, which come out of different budgets. This is something that might come out of the Marketing Team's budget, where $250/month is peanuts relative to all of the services they were previously outsourcing.

I think people are also missing the $20/month plan right next to it. That's where most people will end up. The $250/month plan is only for people who are bumping into usage limits constantly or who need access to something very specific to do their job.

ir77•2h ago
People here keep saying that this is targeted at big companies/corporations. The big company that I work for explicitly blocks uploads of data to these services, and we're forbidden to put anything company-related in there for many reasons, even if you use your own account; we don't have "company accounts".

So no, I can't see companies getting all excited about buying $250/mo per-user licenses for their employees for Google or ChatGPT to suck in their proprietary data.

verdverm•2h ago
These subscriptions explicitly do not suck in your proprietary data, it's all laid out in their ToS.
quantumHazer•1h ago
Yeah, and who will hold them accountable? How can you verify that they're not stealing your data anyway? These companies don't give a shit about copyright or privacy.
ndriscoll•1h ago
I don't think I've ever worked somewhere where you wouldn't get fired for sending company data to a party that doesn't have an NDA signed with the company, regardless of whatever ToS they have.
sigmaisaletter•1h ago
The same companies who stole... sorry... fair-used all the world's artworks and all the text on the internet to train their models are now promising you they won't steal... sorry... fair-use your uploaded data?

In unrelated matters, I have a bridge to sell you, if you are interested.

spaceman_2020•1h ago
Honestly, at this point, nation states will have to figure out an AI strategy. Poorer countries where the locals can't afford cutting edge AI tools will find themselves outproduced by wealthier workers.
dbspin•1h ago
Not just can't afford, but can't access. Most of these new AI tools aren't available outside the US.
Papazsazsa•1h ago
I just tried it.

8 out of 10 attempts failed to produce audio, and of those only 1 didn't suck.

I suppose that's normal(?) but I won't be paying this much monthly if the results aren't better, or at least I'd expect some sort of refund mechanism.

johnisgood•1h ago
They are running out of ideas for names. What next, Google AI Ultra Max Pro?
loloquwowndueo•1h ago
You joke but don’t forget there’s an actual product called “product name” pro max (an iPhone lol)
johnisgood•17m ago
Ffs... I actually had no idea lmao.

Well, that does make sense then.

Fergusonb•1h ago
That's a car payment...
backendEngineer•1h ago
All to put even more AI generated bs down our throats to sell us trash that we don't need and can't afford. God the roaring twenties can't catch a break.
MOARDONGZPLZ•1h ago
I use it to make meeting agendas.
icelancer•52m ago
I paid for it and Google Flow was upgraded to Ultra but Gemini still shows Pro, and asks for me to upgrade. When I go to "upgrade," it says I am already at Google Ultra.

Average Google product launch.

siliconc0w•29m ago
As long as these companies have an API, I imagine it's going to be cheaper to pay à la carte than to pay monthly. $250 is a lot of API calls, especially as competition drives the cost lower. For stuff not behind an API, you're kinda shooting yourself in the foot, because other providers might offer one, and at the very least it means developers aren't going to adopt it.
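
Back-of-the-envelope math on that, with made-up but plausible per-token prices (actual rates vary by model and change often):

```python
# Assumed prices, purely illustrative -- check the provider's current price sheet.
INPUT_PRICE_PER_M = 1.25    # $ per 1M input tokens
OUTPUT_PRICE_PER_M = 10.00  # $ per 1M output tokens

def monthly_api_cost(requests_per_day: int, in_tokens: int, out_tokens: int, days: int = 30) -> float:
    per_request = (in_tokens / 1e6) * INPUT_PRICE_PER_M + (out_tokens / 1e6) * OUTPUT_PRICE_PER_M
    return requests_per_day * days * per_request

cost = monthly_api_cost(requests_per_day=100, in_tokens=4_000, out_tokens=1_000)
print(f"~${cost:.0f}/month")  # ~$45/month under these assumptions, well under $250
```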
spondyl•21m ago
Can you... talk to a human for support? Perhaps I'm just used to SSO-tax billing pages, where I'd expect the rightmost column to mention that. I was partly expecting it because it'd be ironic to see people complaining that a model hallucinated, only for some engineer at Google to shrug and be like "Nothing we can do about it".
paxys•13m ago
Unless that human is an AI researcher at Google what support are you expecting?
bn-l•16m ago
Deep Think, the only interesting thing from the whole I/O day, is not accessible via API.

Also, I was hoping for a ChatGPT test-time search thing. That would be absolutely killer.