While the subject matter is interesting, obviously synthetic content falls into the “that which was not worth writing is not worth reading” trap.
If the author's tone is extremely ChatGPT-esque, I apologize in advance.
I keep getting a paragraph or two into something, hitting one of those terrible "It's not just words - it's massive hyperbole!" sentences, seeing that there are several more in the subsequent paragraphs, and giving up.
However bad the author's original writing that generated this output was, it can't have been as awful as this.
… and so I'll continue to stick with AVC, thanks! :-)
Rather than this inflated slop that looks like someone trying to hit a word count in a paper, where one sentence becomes 15 useless ones.
Edit: This is not so much commentary on AI as it is that the core of your post is a few tables. Just post the tables and one or two sentences of conclusion, and that's all! It is so tedious to read through dozens of paragraphs of autogenerated, unnecessary nonsense that contributes nothing of value to the data.
I think what stands out to me is this cartoonishly punchy, faux-dramatic framing.
That, and specialist terms that seem to be thrown in there in an empty way, just to signal subject-matter expertise that’s not even expected of a DIYer’s experiment report:
> It’s a multi-decade, billion-dollar street fight over bytes and pixels, waged in the esoteric battlegrounds of DCT blocks and entropy coding
I like using real idioms that have percolated through culture ('birds of a feather', 'white elephant', 'nip in the bud', etc.), not stupid contrivances.
As someone who sweated through hours and hours of English essay-writing in school, reading LLM output that is misrepresented as genuine human writing is annoying and highly disrespectful of the reader's time and effort. The moment I saw the stupid, contrived headers and dozens of emojis, I closed the tab.
I refuse to waste my time reading the output of a matrix multiplication done in some server farm when I could do the latter myself.
> uses slopbot 9000 to explode his point into ten times the "prose"
> mfw
One of the quite expensive paid plans, as the free one has to have "Created with Datawrapper" attribution at the bottom. I would guess they've vibe-coded their way to a premium version without paying, as the alternative is definitely outside individual people's budgets (>$500/month).
By now it should be in most devices that aren't outdated even by average standards. And it's worth mentioning that for devices that don't have hardware decoding, dav1d does an excellent job of decoding it on the CPU.
The problem is more with hardware encoding. That's only present in the most recent generation or two of hardware, and even then, AMD (for example) has an aspect-ratio limitation bug in its AV1 hardware encoder (which requires adding black bars to work around) that's only fixed in RDNA 4. RDNA 4 isn't available in their APUs, so the bug won't be fixed in APUs until UDNA is used for them (they didn't fix it in RDNA 3.5 chips).
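For reference, the usual workaround for an encoder alignment limitation like this is to pad the frame up to the nearest supported size before encoding. A minimal sketch, assuming a 64-pixel alignment constraint (the exact alignment AMD's encoder requires is an assumption here; check your encoder's documentation):

```python
# Sketch: round a frame size up to the nearest multiple of the encoder's
# assumed alignment requirement. The extra rows/columns become black bars.
def padded_size(width: int, height: int, align: int = 64) -> tuple[int, int]:
    # -(-x // align) is integer ceiling division
    pad = lambda x: -(-x // align) * align
    return pad(width), pad(height)

# A 1920x1080 frame would be padded to 1920x1088 (8 rows of black bar):
w, h = padded_size(1920, 1080)
```

In practice this padding could be applied with something like ffmpeg's `pad` video filter (e.g. `-vf "pad=1920:1088"`) before handing the frames to the hardware encoder; the specific filter invocation will depend on your pipeline.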
Presumably nothing jumped out at the author as being worse, but come on: how can you have a whole section on why AV1's regression on Bojack is actually a good thing because the quality is way higher, and then not show any quality comparisons?
Also, if anyone was wondering where AV1 stands in comparison to VP8 and VP9... I just looked it up after a few years of not paying attention: Google donated VP8 and VP9 to the Alliance for Open Media (AOMedia) in 2015, and AOMedia created AV1 and released it in 2018.
CharlesW•1h ago
It'd be great to hear from someone at Netflix about the unexpected Bojack Horseman results. I'd bet that Netflix just isn't yet taking advantage of AV1 features designed especially for this kind of animation and synthetic content.
isatty•7m ago
TVs have horrible UIs and are generally ad-ridden garbage. I'm not using anything Android-based for the same reason, plus it's slow.
zamadatix•32m ago
Also 6 GHz Wi-Fi would be nice. I had to run a cable to my 2 because the 5 GHz airspace where I am is too crowded to stream high-quality movies via Infuse without occasional hitching. Same with the seek speed. Meanwhile my iPhone gets 2.9 Gbps of goodput with solid jitter on 6 GHz Wi-Fi.
There are probably some updates to the HDR standards too. For me at least, the current one already supports what my TV does.
Also, apps sometimes seem to assume "hardware decode isn't available, so don't serve AV1." As silly as that is given the CPU power in the Apple TV, at least that problem would go away with hardware support, and they'd stop trying to serve a "compatible" SDR H.264 stream. Despite internet pessimism, sometimes more efficient codecs raise the quality rather than just delivering the same quality at less bandwidth.
CharlesW•17m ago
This isn't a completely unreasonable decision, since the current 2022 model's software AV1 decode apparently sustains 4K playback for as little as 45 minutes before thermal throttling kicks in (although it handled 1080p content fine in my test).
mdasen•18m ago
In terms of AV1 support, YouTube often serves 4K only in AV1, so that's an issue for some people.
Personally, I'd love to see an Apple TV that was great for gaming. New Apple processors have hardware ray tracing and decent gaming performance.
I think it's also likely that Apple will try to make an Apple TV that supports next-gen Siri and on-device AI. Yes, you can complain about Apple's AI delays, but Apple is probably looking toward an Apple TV that can run its AI models.
In some ways, "what benefit would a new <insert-thing> have?" Sometimes we don't know until we have it and people start using it.
pnw•8m ago
I don't think new graphics hardware solves the problem. Beyond the friction of the unit not shipping with a controller, tvOS lacks good discovery for games and there is no ad infrastructure comparable to mobile. Most game developers aren't looking to invest in small, closed platforms either. It's hard enough to make money on Apple's mobile platforms.
adzm•45m ago
While the percentages look scary, it's only a slight difference (60 kbps!) and still around 1 Mbps on average, but with a significant quality boost (very crisp lines and near-perfect quality). I bet Netflix could encode at nearly half that bitrate and stay similar to HEVC in quality, but I'm pleased they seem to have made a good tradeoff here.
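To put those numbers in perspective, a quick back-of-the-envelope check (the ~1 Mbps average and 60 kbps delta are the figures quoted above; treat them as rough values):

```python
# Rough check: a 60 kbps increase on an assumed ~1 Mbps average stream.
avg_kbps = 1000    # assumed ~1 Mbps HEVC average bitrate
delta_kbps = 60    # the difference quoted in the comment above
pct_increase = 100 * delta_kbps / avg_kbps  # ~6% bitrate increase
```

A ~6% bitrate bump is the kind of change that looks large when reported as a relative percentage of a regression, while being nearly negligible in absolute terms.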
It's actually quite amazing what quality AV1 delivers at such low bitrates across the board. I've said it before, but AV1 is almost magical. I think that's behind the lack of enthusiasm for VVC/H.266; is anyone even using it? I've yet to see it in the wild.
CharlesW•29m ago
https://aomedia.org/press%20releases/AOMedia-Announces-Year-...