they film for screens, regardless of where those might be.
That's a lost cause. You never know what sort of random crap and filters a clueless consumer may inflict on the final picture. You cannot possibly make it look good on every possible config.
What you can do is make sure your movie looks decent on most panels out there, assuming they're somewhat standard and aren't configured to go out of their way to nullify most of your work.
The average consumer either never knew these settings existed, or played around with them once when they set up their TV and promptly forgot. As someone who often gets to set up/fix setups for aforementioned people, I'd say this is a good reminder.
It is perfectly understandable that people who really care about how their work was colour-graded would suggest you turn off all the features that shit all over that work. Similarly for the other settings he mentions.
Don't get me wrong, I haven't seen the first season, so won't watch this, but creators / artists do and should care about this stuff.
Of course, people can watch things in whatever dreaded settings they want, but lots of TVs default to bad settings, so awareness is good.
Since consumers are not trained to critically discern image and video quality, the "Wow!" often wins the sale. This easily explains the existence of local dimming solutions (now called miniLED or some other thing). In a super bright Best Buy or Walmart viewing environment they can look fantastic (although, if you know what to look for you can see the issues). When you get that same TV home and watch a movie in the dark...oh man, the halos jump off the screen. Now they are starting to push "RGB miniLED" as if that is going to fix basic optics/physics issues.
And don't get me started on horrible implementations of HDR.
This is clearly a case of the average consumer not knowing enough (they should not have to be experts, BTW) and effectively getting duped by marketing.
These settings are the television equivalent of clickbait. They are there to get people to say "Oh, wow!" at the store and buy it. And, just like clickbait, once they have what they clicked on, the experience ranges from lackluster and distorted to being scammed.
Most people have absolutely no idea what goes into making the pixels on their screens flicker with quality content.
This way, you should not need to change them unless you want nonstandard settings for whatever reason.
Went to the in-laws over the holidays and the motion smoothing on the otherwise very nice LG TV was absolutely atrocious.
My sister had her Nintendo Switch connected to it and the worst thing was not the low resolution game on the 4k display - it was the motion smoothing. Absolutely unbearable. Sister was complaining about input lag and it was most definitely caused by the motion smoothing.
I keep my own TV on game mode regardless of the content because otherwise all the extra “features” - which includes more than just motion smoothing - pretty much destroys picture quality universally no matter what I’m watching.
Not sure why Netflix is destroying the experience themselves here.
For example, Kate Bush's 1985 "Running Up That Hill" became a huge worldwide hit after appearing in season 4.
I was also gradually switching to treating this season as background noise, as it fails to be better than that. It is insultingly bad in places even when consumed this way.
But then getting into recommendations like "turn off vivid mode" is pretty freaking pretentious, in my opinion, like a restaurant where the chef freaks out if you ask for salt. Yes, maybe the entree is perfectly salted, but I prefer more, and I'm the one paying the bill, so calm yourself as I season it to my tastes. Yes, vivid modes do look different than the filmmaker intended, but that also presumes that the viewer's eyes are precisely as sensitive as the director's. What if I need higher contrast to make out what's happening on the screen? Is it OK if I calibrate my TV to my own personal viewing conditions? What if it's not perfectly dark in my house, or I want to watch during the day without closing all the blinds?
I tried watching the ending of Game of Thrones without tweaking my TV. I could not physically see what was happening on the screen, other than that a navy blue blob was doing something against a darker grey background, and parts of it seemed to be moving fast if I squinted. I cranked the brightness and contrast for those episodes so that I could actually tell what was going on. It might not have aligned with the director's idea of how I should experience their spectacle, but I can live with that.
Note that I’d also roll my eyes at a musician who told me how to set my equalizer. I’ll set it as I see fit for me, in my living room’s own requirements, thanks.
The point you make isn't incorrect at all. I would say that TVs should ship without any such enhancements enabled. The user should then be able to configure it as they wish.
Plenty of parallel examples of this: Microsoft should ship a "clean" version of Windows. Users can then opt into whatever they might want to add.
Social media sites should default to the most private non-public sharing settings. Users can open it up to the world if they wish. Their choice.
Going back to TVs: They should not ship with spyware, log-ware, behavioral tracking and advertising crap. Users can opt into that stuff if the value proposition being offered appeals to them.
Etc.
I strongly agree with that. The default settings should be… well, “calibrated” is the wrong word here, but that. They shouldn’t be in “stand out among others on the showroom floor” mode, but set up to show an accurate picture in the average person’s typical viewing environment. Let the owner tweak as they see fit from there. If they want soap opera mode for some bizarre reason, fine, they can enable it once it’s installed. Don’t make the rest of us chase down whatever this particular brand calls it.
The equalizer analogy is perfect.
Having said that, there are a lot of bad HDR masters.
> Whatever you do, do not switch anything on ‘vivid’ because it’s gonna turn on all the worst offenders. It’s gonna destroy the color, and it’s not the filmmaker’s intent.
I’m interested in trying the filmmaker’s intent, like I’ll try the chef’s dinner before adding salt because it’ll probably be wonderful. But if I think the meal still needs salt, or my TV needs more brightness or contrast, I’ll add it. And even if the filmmaker or chef thinks I’m ruining their masterpiece, if I like it better that way, that’s how I’ll enjoy it.
And I’m very serious about the accessibility bit. My vision is great, but I need more contrast now than I did when I was 20. Maybe me turning up the brightness and contrast, or adding salt, lets me perceive the vision or taste the meal the same way as the director or chef does.
Martin has claimed he flew to HBO to convince them to do 10 seasons of 10 episodes instead of the 8 seasons with just 8 episodes in the final one [1]. It was straight up just D.B. Weiss and David Benioff's call how the series ended.
[1]: https://variety.com/2022/tv/news/george-rr-martin-shut-out-g...
You know there are two more episodes, right? That seems like an obvious finale reveal.
They just invent stuff that they have no idea how to explain later. Just like Lost.
There are also stupid leaps of faith like Holly's mom hobbling out of bed and sticking an oxygen tank in a clothes dryer (as if that would even do anything)...
From that point on, everyone gets 10 inch thick plot armour, and then the last two episodes skip a whole season or two of character development to try and box the show off quickly.
It's the way stuff is done, the characters' changed behavior, incomprehensible logic, stupid explanations, etc.
Television writers pussying out in their finales is its own meme at this point. Makes me respect David Chase and how The Sopranos ended all that much more.
And no, I'm not talking about the gay thing. The writing is simply atrocious. Numerous plot holes, leaps of reasoning, and terrible character interactions.
Basically I think the main problem with the show is the character of Eleven. She’s boring. She isn’t even really a character, as she has no thoughts or desires or personality. She is a set piece for the other characters to manipulate. That works in the first season. But by season 3 it’s very tiring. She just points her hands at things and psychic powers go. This is why it might feel weird when Will tells everyone he is gay and all the original boys are like "dude, we are totally cool with you being gay" and give him a group hug, and then Eleven joins in too. To use the show’s language, she isn’t really part of the party.
Season 3 is a great example of how the show pays too much attention to Eleven, giving her lots of screen time without developing her character. Billy is a very interesting character; you could spend a lot of time understanding why Billy is the way he is, but instead you get one dream sequence, because Eleven sees his dreams, and oh, his dad sucks. Except you knew that already from season 2. And most of Eleven’s screen time, when she’s not shopping at the mall, is spent pointing her hands at things to make psychic powers go boom.
But this is basically the problem with the show. The writers like Eleven too much. And she is incredibly boring as a character after season 1.
I think the show succeeded greatly in the first season at creating actions for the characters to do that developed both their characters and the narrative. And those happen in this 1980s nostalgic world. But I think the show’s attachment to Eleven has ultimately harmed the narrative.
That being said, I do think that the general narrative of the show, going from the Demogorgon to the Mind Flayer to Vecna and the abyss, is very Dungeons & Dragons. Haha. That would be a fun campaign to play.
1. Millie Bobby Brown, the actor who plays Eleven, is unfathomably stupid (watch any out-of-character interview with her), and the role has simply outgrown her acting abilities. They can't make Eleven do anything interesting because Millie can't act it.
2. Writers have introduced so many supporting characters and separate story lines that it is impossible to give any of them enough screen time for proper character development.
There are other major writing problems with the show, like the overreliance on cheap 80s culture references, but I think the main problem is with the characters. The writers simply don't understand what made the first season so good.
I hunted around on YouTube for a bit, but in nothing I landed on did she come across as “unfathomably stupid.” Young and green, maybe.
It’s honestly not the worst AI content out there! Lots of movies I wouldn’t consider watching, but that I’m curious enough to see summarized (e.g. a movie where only the first title was good but two more were still published).
You'd think television production would be calibrated for the median watcher's TV settings by now.
I think TV filters (vivid, dynamic brightness, speech lifting, etc) are actually a pretty decent solution to less-than-ideal (bright and noisy environment, subpar screen and audio) viewing conditions.
On the other hand, things that are objective, like color calibration, can be hard to “push down” to each TV because they might vary from set to set. Apple TV has a cool feature where you can calibrate the output using your phone camera, it’s really nifty. Lots of people comment on how good the picture on my TV looks, it’s just because it’s calibrated. It makes a big difference.
Anyways, while I am on my soap box, one reason I don’t have a Netflix account any more is because you need the highest tier to get 4k/hdr content. Other services like Apple TV and Prime give everyone 4k. I feel like that should be the standard now. It’s funny to see this thread of suggestions for people to get better picture, when many viewers probably can’t even get 4k/hdr.
No.
(You ever think about how many fantastic riffs have been wasted with cringe lyrics?)
Ever look at the lyrics to Toto's Africa? We can start there, someone send a poet please
Here's how bad it was in 2017. One of the earliest things I watched on that TV was "Guardians of the Galaxy" on some expanded basic cable channel. The fight between Peter and Gamora over the orb looked very jerky, like it was only at about 6 fps. I found some reviews of the movie on YouTube that included clips of that fight and it looked great on them, so I know that this wasn't some artistic choice of the director that I just didn't like. Some Googling told me about the motion enhancement settings of the TV, and how they often suck. I had DVRed the movie, and with those settings off the scene looked great when I watched it again.
To preempt replies: ask yourself why 24 frames per second is optimal for cinema instead of just being an ancient spec that everyone got used to.
I dunno if it's just a me thing, but I wonder if a subconscious part of my brain is pegging the motion smoothed content as unnatural movement and dislikes it as a result.
Also imagine the hand of a clock rotating at 5 minutes’ worth of angle per frame, and 1 frame per second. If you watched that series of pictures, your brain might still fill in that the hand is moving in a circle every 12 seconds.
Now imagine smoothing synthesizing an extra 59 frames per second. If it only considers the change between 2 frames, it might show a bright spot moving in a straight line between the 12 and 1 position, then 1 and 2, and so on. Instead of a circle, the tip of the hand would be tracing a dodecagon. That’s fine, but it’s not how your brain knows clocks are supposed to move.
Motion smoothing tries to do its best to generate extra detail that doesn’t exist and we’re a long way from the tech existing for a TV to be able to do that well in realtime. Until then, it’s going to be weird and unnatural.
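To make that concrete, here's a toy Python sketch (purely illustrative - not any TV vendor's actual algorithm, just naive two-frame linear interpolation of the clock example above):

```python
import math

def true_tip(t_seconds, radius=1.0):
    """Actual tip position: the hand sweeps a circular arc (full circle every 12 s)."""
    angle = 2 * math.pi * (t_seconds / 12.0)
    return (radius * math.sin(angle), radius * math.cos(angle))

def interpolated_tip(t_seconds, radius=1.0):
    """Synthesized tip position: a straight blend between the two nearest source frames."""
    t0 = math.floor(t_seconds)
    x0, y0 = true_tip(t0, radius)
    x1, y1 = true_tip(t0 + 1, radius)
    f = t_seconds - t0                      # blend factor between the two source frames
    return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))

# Halfway between two source frames the interpolated tip sits on the chord,
# inside the clock face, instead of on the arc:
print(true_tip(0.5))          # (0.259, 0.966): on the circle, radius 1.0
print(interpolated_tip(0.5))  # (0.25, 0.933): radius ~0.966, the midpoint of a dodecagon edge
```

Halfway between source frames the synthesized tip sits about 3.4% inside the circle - on a dodecagon edge rather than on the arc - which is exactly the kind of subtly wrong motion the brain flags as unnatural.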
Film shot at 60FPS? Sure. Shot at 24 and slopped up to 60? Nah, I’ll pass.
But synthesizing these frames ends up with a higher frame rate but with the same shutter angle / motion blur of the original frame rate, which looks off to me. Same reason the shutter angle is adjusted for footage that is intended to be slow motion.
"Everyone" includes the filmmakers. And in those cases where the best filmmakers already found all kinds of artistic workarounds for the lower framerate in the places that mattered, adding interpolation will fuck up their films.
For example, golden age animators did their own interpolation by hand. In Falling Hare, Bugs' utter despair after looking out the window of a nosediving airplane is animated by a violent turn of his head that moves farther than what could be smoothly animated at 24fps. To avoid the jumpcut, there is a tween of an elongated bunny head with four ears, seven empty black eye sockets, four noses, and eight teeth. It's absolutely terrifying if you pause on that frame[1], but it does a perfect job of connecting the other cels and evoking snappier motion than what 24fps could otherwise show.
Claiming that motion interpolation makes for a better Falling Hare is like claiming that keeping the piano's damper pedal down through the entirety of Bach's Prelude in C produces better Bach than on a harpsichord. In both cases, you're using objectively better technology poorly, in order to produce worse results.
movies above 24fps won't become a thing, it looks terrible and should be left for documentaries and sports
Higher frame rates are superior for shooting reality. But for something that is fictional, the lower frame rate helps the audience suspend their disbelief.
If it did, horror films would be filmed at higher frame rates for extra scares.
Humans have a long history of suspending disbelief in both oral and written lore. I think that 'fps' may be functionally equivalent to the Santa Claus stories, fun for kids but the adults need to pick up the bill.
With one caveat: in some games that use animation-inspired aesthetics, the animation itself is not smoothed out but basically runs at the slower framerate (see the Guilty Gear games), while everything else (camera movement, some effects) is silky smooth and you still get quick reaction time to your inputs.
People are “used to” high FPS content: Live TV, scripted TV shot on video (not limited to only soap operas), video games, most YouTube content, etc are all at 30-60FPS. It’d be worth asking yourself why so many people continue to prefer the aesthetic of lower framerates when the “objectively better” higher FPS has been available and moderately prevalent for quite some time.
(However, modern TV sets are often filled with enough other junk that maybe you will not want all of these things anyways)
Despite being a subscriber I pirate their shows to get some pixels.
Many years ago, I had a couple drinks with a guy from Netflix who worked on their video compression processes, and he fully convinced me they're squeezing every last drop out of every bit they send down the pipes. The quality is not great compared to some other streaming services, but it's actually kind of amazing how they're able to get away with serving such tiny files.
Anyway, I think we can expect these companies to mostly max out the resultant video quality of their bitstreams, and showing the average bitrate of their pricing tiers would be a great yardstick for consumers.
for us nerds there is a hidden "stats for nerds" option.
No other service does this.
And for some reason, the HDR versions of their 1080p content are even more bitstarved than SDR.
Could barely tell what was going on, everything was so dark, and black crush killed it completely, making it look blocky and janky.
I watched it again a few years later, on max brightness, sitting in the dark, and I got more of what was going on, but it still looked terrible. One day I'll watch the 'UHD' 4k HDR version and maybe I'll be able to see what it was supposed to look like.
When I last rewatched it (early pandemic), as far as I could tell at the time there was no HDR version available, which I assume would fix it by being able to represent more variation in the darker colours.
I might hunt one down at some point as it does exist now. Though it still wouldn’t make season 8 ‘good’ !!
Most other movies/series instead are so dark that make my mid-range TV look like crap. And no, it's not an HW fault, as 500 nits should be enough to watch a movie.
Except Primer did that without this crutch.
Oppenheimer didn’t suffer from either of those issues, but I’ve only watched it once on a good TV.
But vivid mode (et al) literally loses information. When the TV tries to make everything look vibrant, it’s effectively squishing all of the colors into a smaller color space. You may not be able to even tell two distinct objects apart because everything is similarly bright and vibrant.
Same with audio. The famous “smile” EQ can cause some instruments to disappear, such as woodwinds.
At the end of the day, media is for enjoyment and much of it is subjective, so fine do what you need to do to be happy. But few people would deliberately choose lower resolution (except maybe for nostalgia), which is what a lot of the fancy settings end up doing.
Get a calibration if you can, or use Filmmaker Mode. The latter will make the TV relatively dark, but there’s usually a way to adjust it or copy its settings and then boost the brightness in a Custom mode, which is still a big improvement over default settings from the default mode.
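To picture the information loss, here's a deliberately crude Python caricature of a "vivid"-style boost (not any manufacturer's actual processing, just a contrast/saturation push on 8-bit values followed by clipping, which is the mechanism being described):

```python
def vivid(rgb, contrast=1.6, saturation=1.5):
    """Crude 'vivid' boost on 8-bit RGB: contrast stretch, saturation push, then clip."""
    luma = sum(rgb) / 3
    out = []
    for c in rgb:
        c = (c - 128) * contrast + 128          # contrast stretch around mid-grey
        c = luma + (c - luma) * saturation      # push channels away from the average
        out.append(max(0, min(255, round(c))))  # clipping is where the detail dies
    return tuple(out)

bright_red   = (235, 40, 30)
brighter_red = (250, 55, 20)

print(vivid(bright_red))      # (255, 0, 0)
print(vivid(brighter_red))    # (255, 0, 0) - two distinct objects, now identical
```

Two reds that were clearly distinct on input both land on (255, 0, 0); once the channels clip, the difference between the two objects is gone and no downstream setting can bring it back.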
Are you talking about the center channel on an X.1 setup or something else? My Denon AVR certainly doesn't have a dedicated setting for dialog, but I can turn up the center channel which yields variable results for improved audio clarity. Note that DVDs and Blurays from 10+ years ago are easily intelligible without any of this futzing.
Voice boost makes the dialogue louder.
Everyone in the house loves these two settings and can tell when they are off.
The sound is mud, we've just become accustomed.
https://www.audiology.org/consumers-and-patients/hearing-and...
Flatscreen TV's have shitty speakers.
Then I noticed that native speakers also complain.
Then I started to watch YouTube channels, live TV and old movies, and I found out I could understand almost everything! (depending on the dialect)
When even native speakers can't properly enjoy modern movies and TV shows, you know that something is very wrong...
But also, people in old movies often enunciated very clearly as a stylistic choice. The Transatlantic accent—sounds a bit unnatural but you can follow the plot.
It only works if you're watching in a room that's acoustically quiet, like a professional recording studio. Once your heater / air conditioner or other appliance turns on, it drowns out everything but the loudest parts of the mix.
Otherwise, the problem is that you probably don't want to listen to ear-splitting gunshots and explosions, then turn it down to a normal volume, only to make the dialog and whispers unintelligible. I hit this problem a lot watching TV after the kids go to bed.
As much as I enjoy deafeningly bright explosions in the movie theater, it's almost never appropriate in the casual living room.
I recently bought a new TV, a Bravia 8ii, which was supposedly not bright enough according to reviewers. In its professional setting, it's way too bright at night, and being an OLED watching HDR content, the difference between the brightest and darkest is simply too much, and there seems to be no way to turn it down without compromising the whole brightness curve.
In Poland our original productions have such badly mixed sound that there is almost no series in my native language I can understand without captions.
But the upside of it is - with English being my second language - I understand most of the movies/series I watch.
It's very funny how, when watching a movie on my MacBook Pro, it's better for me to just use HDMI for the video to my TV but keep using my MBP speakers for the audio, since those speakers are just much better.
I can't find the source anymore, but I think I saw that it was even a kind of small conspiracy in TV streaming: you set your speakers louder, and then when the advertisements arrive you hear them louder than your movie.
Officially it is just that they switch to a better encoding for ads (like MPEG-2 to MPEG-4 for DVB), but unofficially it's for the money, as always...
Film makers want to preserve dynamic range so they can render sounds both subtle and with a lot of punch, preserving detail, whereas ads just want to be heard as much as possible.
Ads will compress sound so it sounds uniform, colorless and as clear and loud as possible for a given volume.
It's not just that. It's the obsession with "cinematic" mixing, where dialogue is not only quieter than it could be, so that explosions and other effects are much louder than it, but also not far enough above background effects.
This all works in a cinema, where you have good quality speakers playing much louder than most people have at home.
But at home you just end up with muddled dialogue that's too quiet.
Doesn't matter if it makes vocals part of the background at all times.
Just anecdotally, I can tell speaker tech has progressed slowly. Stepping into a car from 20 years ago, the sound is... pretty good, actually.
IMO, half the issue with audio is that stereo systems used to be a kind of status symbol, and you used to see more tower speakers or big cabinets at friends' houses. We had good speakers 20 years ago and good speakers today, but sound bars aren't good.
They do sound pretty much ok for very discreet objects compared to tower speakers. I only occasionally rant when the sound skips a beat because of WiFi or other smart-assery. (Nb: of course I never ever activated the smart assistant, I use them purely as speakers).
Lower spec speakers have become good enough, and DSP has improved to the point that tiny speakers can now output mediocre/acceptable sound. The effect of this is that the midrange market is kind of gone, replaced with neat but still worse products such as soundbars (for AV use) or even portable speakers instead of hi-fi systems.
On the high end, I think amplified multi-way speakers with active crossovers are much more common now thanks to advances in Class-D amplifiers.
The time and money cost of going further than that is not going to provide a sufficient return on investment except to a very small proportion of people.
They are notorious for bad vocal range audio.
I have a decent surround sound and had no issues at all.
I want to get a 3.0 setup with minimal changes to the equipment.
My brother has 2 of the apple speakers in stereo mode and they sound pretty good imo.
Nothing beats true surround sound though.
Regarding internal speakers, I have listened to several cheap to medium TVs on internal speakers, and yes, on some models the sound was bad. But it doesn't matter, because the most mangled frequencies are high and low, and those aren't the voice ones. When I listen on a TV with meh internal speakers I can clearly understand, without any distortion, voices in normal TV programming, in sports TV, in old TV shows and old movies. The only offenders, again, are some of the new content.
So no, it's not the internal speakers that are at fault, at all.
You can buy Bluetooth receivers that plug into the line input of your amplifier.
Phone/tablet/laptop etc. in my top comment was not a technological limitation, like "oh no, we don't have a port or protocol to connect to speakers and so we can't use them". It was a logistical limitation. Like being physically in a place without speakers or the possibility to even buy them. Traveling, renting, having a big family and only one set of speakers, and so on. Situations where you can't just pluck a Dolby setup out of thin air but do still watch movies.
Here is a datapoint - in the whole world around 1-2 *billion* headphones are sold, every single year. I would bet that at least a double digit percentage of those have been used to watch a movie at least once. Proposing that all those people in all those situations buy themselves a surround speaker setup just to understand the voice track in movies is an inane take.
You are trying to turn this into some ideological debate about the equity of audio mixes.
If you care this much about audio just buy some better speakers ffs.
New content especially comes with more channels, more channels that will get muddled into your two for output.
Do you spend the effort of specifically selecting stereo tracks (or adjusting how it gets downmixed)?
It's pretty annoying.
Umm, isn't that literally a job description of a sound engineer, who on a big production probably makes more in a year than I will do in my whole lifetime?
Is spending a few hours one time to adjust levels on a track, which will run for likely millions of hours across the world such a big ask? I think no, because not every modern movie is illegible, some producers clearly spend a bit of effort to do just that what you wrote. But some just don't care.
Well, if your setup is stereo then either selecting a stereo track is your job, or your job is to adjust the downmix that is done by your computer because you didn't select the stereo track.
I agree that providing a good stereo mix is the sound engineer's job, but nothing beyond that.
That's the whole point of this whole thread, no one asks for anything more or out of ordinary. Stereo tracks sometimes have unreasonably bad quality. Nolan even admitted he does this on purpose.
Oh, and the youtube videos don't have the infamous mixing issues of "voices too low, explosions too high".
It's the source material, not the device. Stop accusing TV speakers, they are ok-tier.
When you have a good setup those same movies sound incredible, Nolan films are a perfect example.
Yet I do live in a flat, in Paris, with neighbors on the same floor, on the floor above, and on the floor below. Thus I tune the volume to something that is acceptable in this context.
Or I should say, I spend the whole movie with the remote in my hand, tuning the volume up and down between voices and explosions.
Theatre mix is a bad home mix. It is valid for home cinema. Not for everyday living room.
Yes I could buy a receiver and manually EQ all channels and yadda yadda yadda. I live in an apartment. My 65" LG C2 TV is already ginormous by parisian flat standards. Ain't nobody got space for a dedicated receiver and speakers and whatnot. I tuned the audio, and some properly mixed movies actually sound great!
As an added bonus, I had trouble with "House of Guinness" recently both on my TV and with good headphones, where I also did the volume dance.
IMHO there's no care spent on the stereo mixes of current movies and TV shows. And to keep your example, Nolan shows are some of the most understandable and legible on my current setup :)
Another fact is, I have no trouble with YouTube videos in many languages and style, or with video games. You know, stuff that care about legibility in the home.
For current movies, some of the most legible are "children" oriented movies: I watched the Dragons set and it was trouble-free.
Which is just another drama that should not be on consumers' shoulders.
Every time I visit friends with a newer TV than mine I am floored by how bad their speakers are. Even the same brand and price range. Plus the "AI sound" settings (often on by default) are really bad.
I'd love to swap my old TV as it shows its age, but spending a lot of money on a new one that can't play a show correctly is ridiculous.
No thank you. We should make the default work well, and if people want a sound optimized experience that requires 6x the pieces of equipment let those who want to do the extra work do what they need to for the small change in audio quality.
Without that change in defaults more and more people will switch to alternatives, like TikTok and YouTube, that bother to get understandability as the default rather than as something requiring hours of work and shopping choices.
Most AVRs come with an automatic calibration option. Though there are cheap 5.1 options on the market that will get results multiple times better than your flatscreen can produce.
> We should make the default work well
Yep, movies should have properly mastered stereo mixes not just dumb downmixes from surround that will be muddy, muffled and with awful variations in loudness.
However getting a better sound system is a current solution to the problem that doesn't require some broad systemic change that may or may not ever happen.
I have spent about half an hour investigating sound bars as a result of these discussions, and that's a loss of life that I can never get back, and I regret spending that much time on the problem.
I believe one could do some fun stuff with waveguides and beam steering behind the screen if we had 2 inch thick screens. Unfortunately decent audio is harder to market and showcase in a bestbuy than a "vivid" screen.
If someone buys a TV (y'know, a device that's supposed to reproduce sound and moving pictures), it should at least be decent at both. But if people want a high-end 5.1/7.1/whatever.1 sound then by all means they should be able to upgrade.
My mum? She doesn't want or need that, nor does she realistically have the space to have a high-end home-cinema entertainment setup (much less a dedicated room for it).
It's just a TV in her living room surrounded by cat toys and some furniture.
So, if she buys a nearly €1000 TV (she called it a "stupid star trek TV") it should at least be decent—although at that price tag you'd reasonably expect more than just decent—at everything it's meant to do out of the box. She shouldn't need to constantly adjust sound volume or settings, or spend another thousand on equipment and refurbishment to access decent sound.
In contrast, they say the old TV that's now at nan's house has much better sound (even if the screen is smaller) and are thinking of swapping the TVs since nan moved back in with my mum.
Honestly I think high-end TVs should just not include speakers at all, similar to how high-end speakers don't contain built-in amplifiers. Then you could spend the money saved on whatever speakers you want.
> She shouldn't need to constantly adjust sound volume or settings, or spend another thousand on equipment and refurbishment to access to decent sound.
How about €100 on a soundbar?
And I do own two proper dedicated speakers + amps setups. I also know how to use REW and Sigma Studio. So I guess I qualify regarding "cares".
Sadly I lack time to build a third set of cabinets to the constraints of our living room.
I'm surprised given you care about audio that you can even tolerate internal speakers. I'd just not use that TV and watch wherever you have better audio.
Also - this isn’t a speaker problem, this is a content problem. I watched The Princess Bride last week on the TV and didn’t require captions, but I’m watching Pluribus on Netflix and I’m finding it borderline impossible to keep up without them.
When you listen to that content on a good system you don't have these issues.
Nolan films are a perfect example.
You can still watch these movies, its just sounds bad on low quality sound systems.
on AppleTV/TV+
That said, I’m Irish and live in the UK. You’ve never heard people say “I’ll hoover that”, or “you can google that”? Kleenex and band aid are definitely American ones but given the audience I thought it was apt
I turned it off though and use an external Atmos receiver and speakers.
The crux of the issue IMHO is the theatrical mixes. Yes I can tune the TV volume way up and hear the dialogue pretty well. In exchange, any music or sfx is guaranteed to wake the neighbors (I live in a flat, so neighbors are on the other side of the wall/floor/ceiling).
In a movie the characters may be far away (so it needs to sound like that, not like a podcast), running, exhausted, with a plethora of background noises and so on.
If we can't do the same in the movie, the sound is just badly mixed. It is not the story setup and it is not "realistic".
Because in real life you don't listen through an internal TV speaker, duh.
That being said, people listening to TV through TV speakers is 100% predictable. If we can't understand, the mix is not "realistic", it is "badly done".
Somehow youtube videos don't have this issue. Go figure /s
The problem, as you say, is that if you don't want to have loud parts, you lower the volume so that loud is not loud anymore, and then the quiet but audible parts become inaudibly quiet.
I consider this to be a separate issue to the lack of clarity of internal speakers, and a bit harder to solve because it stems from the paper thin walls common in the US and other places.
You can usually use audio compression to fix this if you can't play the movie at the volume level it's meant to be played.
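For anyone curious what that compression actually does to the levels, here's a minimal NumPy sketch (illustrative only; real "night mode" dynamic range compression works per-band with attack/release smoothing, this is just a static per-sample gain curve with made-up threshold/ratio numbers):

```python
import numpy as np

def compress(samples, threshold_db=-30.0, ratio=4.0, makeup_db=10.0):
    """Attenuate everything above the threshold, then apply make-up gain."""
    eps = 1e-12
    level_db = 20 * np.log10(np.abs(samples) + eps)     # instantaneous level of each sample
    over = np.maximum(level_db - threshold_db, 0.0)     # how far above the threshold
    gain_db = -over * (1.0 - 1.0 / ratio) + makeup_db   # knock down loud parts, lift the rest
    return samples * 10 ** (gain_db / 20.0)

dialog    = np.full(1000, 10 ** (-35 / 20))   # quiet dialog around -35 dBFS
explosion = np.full(1000, 1.0)                # full-scale explosion at 0 dBFS

print(20 * np.log10(compress(dialog).max()))     # ~ -25 dBFS: dialog lifted by the make-up gain
print(20 * np.log10(compress(explosion).max()))  # ~ -12.5 dBFS: explosion pulled way down
```

The 35 dB gap between the dialog and the explosion shrinks to roughly 12.5 dB, so you can pick one volume after the kids go to bed and leave the remote alone.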
Intentionally making audio uncomfortable is not a sign of art or skill, it's a sign of delivering a bad product.
75 dB in real life is your typical restaurant, office, etc.
The most common experience on the poorly mixed content that several in this thread are complaining about are: the volume setting necessary for intelligible audio results in uncomfortably loud audio in other parts.
This is a defect of the content, not of the system it's playing on.
Top-down processing
(or more specifically, top-down auditory perception)
This refers to perception being driven by prior knowledge, expectations, and context rather than purely by sensory input. When you already know the dialog, your brain projects that knowledge onto the sound and experiences it as “clear.”
I like the long movie format, lots of good shows to watch. Movies feel too short to properly tell a story. It's just like a few highlights hastily shown and then it's over.
Still, sometimes it feels like the writers weren't granted enough time to write a shorter script. Brevity isn't exactly incentivized by the business model.
In a show like Stranger Things, almost none of the episodes are individually memorable or watchable on their own. They depend too much on the surrounding episodes.
Compare to e.g. Strange New Worlds, which tells large stories over the course of a season, but each episode is also a self-contained story. Which in turn allows for more variety and an overall richer experience, since you can have individual episodes experiment with wacky deviations from the norm of the show. Not all of those experiments will land for everybody (musical episodes tend to be quite divisive, for example), but there is a density to the experience that a lot of modern TV lacks.
It's so obvious in hindsight. Shows like The Big Bang Theory, House and Scrubs I very rarely caught two episodes of consecutively (and when I did, they were on some release schedule where you'd forgotten half of the plot by the next week). But they are all practically self-contained, with only the thread of a longer-term narrative being woven between them.
It's doubtful that you could catch one random episode of any of these Netflix series and feel comfortable that you understand what's going on. Perhaps worse is the recent trend for mini-series, which are almost exactly how you describe - just a film without half of it being left on the cutting room floor.
Buffy is a great example: plenty of monster of the week episodes, but also season long arcs and character progression that rewarded continuity. The X-Files deliberately ran two tracks in parallel: standalone cases plus the mythology episodes. Lost was essentially built around long arcs and cliffhangers, it just had to make that work on a weekly broadcast cadence.
What’s changed is the delivery mechanism, not the existence of serialisation. When your audience gets one episode a week, with mid-season breaks, schedule slips, and multi-year gaps between seasons, writers have to fight a constant battle to re-establish context and keep casual viewers from falling off. That’s why even heavily serialised shows from that era often kept an episodic spine. It’s a retention strategy as much as a creative choice.
Streaming and especially season drops flip that constraint. When episodes are on demand and many viewers watch them close together, the time between chapters shrinks from weeks to minutes. That makes it much easier to sustain dense long-form narrative, assume recent recall, and let the story behave more like a novel than a syndicated procedural.
So the pattern isn’t new. On demand distribution just finally makes the serialised approach work as reliably at scale as it always wanted to.
How does completely dropping a season flip that? Some shows with complicated licensing and rights have caused entire seasons to be dropped from a given streaming service and it’s very confusing when you finish season N and go right into season N+2.
As I explained, that model can permit a binge of content which grants heavy context carryover.
As a binge watcher, this irks me to no end; I usually end up delaying watching episode 1 until everything is released, and in the process forget about the show for half a year or something, at which point there's hardly any conversation happening about it anymore.
Multi-year gaps between seasons is a modern thing, not from the era you're talking about. Back then there would reliably be a new season every year, often with only a couple of months between the end of one and the beginning of the next.
If John dumped Jane at the beginning of the episode, they had to get back together at the end, otherwise the viewer who had to go to her son's wedding that week wouldn't know what was going on. There was no streaming, recaps were few and far between, and not everybody had access to timeshifting, so you couldn't just rely on everybody watching the episode later and catching up.
Sometimes you'd get a two-episode sequence; Jane cheated on John in episode 1 but they got back together in episode 2. Sometimes the season finale would permanently change some detail (making John and Jane go from being engaged to being married). Nevertheless, episodes were still mostly independent.
AFAIK, this changed with timeshifting, DVRs, online catchup services and then streaming. If viewers have the ability to catch up on a show, even when they can't watch it during first broadcast, you can tell a long, complex, book-sized story instead of many independent short-stories that just happen to share a universe.
Personally, I much prefer the newer format, just as I prefer books to short stories.
This doesn't make sense; no show I know from that time followed that principle - and for good reason, because they'd get boring the moment the viewer realizes that nothing ever happens on them, because everything gets immediately undone or rendered meaningless. Major structural changes get restored at the end (with exceptions), but characters and the world are gradually changing.
> If John dumped Jane at the beginning of the episode, they had to get back together at the end, otherwise the viewer who had to go to her son's wedding that week wouldn't know what was going on.
This got solved with "Last time on ${series name}" recaps at the beginning of the episode.
Go back and watch any two episodes (maybe not the season finale) from the same season of Star Trek TOS or TNG, or Cheers, or MASH, or Friends, or any other prime time show at all prior to 1990. You won't be able to tell which came first, certainly not in any obvious way. (Networks didn't really even have the concept of specific episode orders in that era. Again looking back to Babylon 5 which was a pioneer in the "ongoing plot arc" space, the network deliberately shuffled around the order of a number of first-season episodes because they wanted to put stronger stories earlier to hook viewers, even though doing so left some character development a bit nonsensical. You can find websites today where fans debate whether it's best to watch the show in release order or production order or something else.)
By and large, we all just understood that "nothing ever happens" with long-term impact on a show, except maybe from season to season. (I think I even remember the standard "end of episode reset" being referenced in a comedy show as a breaking-the-fourth-wall joke.) Yes, you'd get character development in a particular episode, but it was more about the audience understanding the character better than about immediate, noticeable changes to their life and behavior. At best, the character beats from one season would add up to a meaningful change in the next season. At least that's my memory of how it tended to go. Maybe there were exceptions! But this really was the norm.
And when older format "nothing ever happens" shows like The Simpsons did try to go story-arc ("Who Shot Mr Burns?"), likewise: outrage.
Heh I was going to reply "B5 is better than TNG", but thought "better check all the replies first". Wherever there's discussion of extended plots there's one of us nerds. (If anyone hasn't seen it... yes half the first season is rough, but you get a season's worth of "The Inner Light"-quality episodes by the end and for all the major characters; TNG, while lovely, has just a few because there's so little character development besides Picard)
This is the point. The persistent changes in these shows tended to be very minor. Nothing big ever happened that wasn’t fully resolved by the time the credits rolled unless it was a 2-part episode, and then it was reset by the end of the second episode.
I think the slow changes in the 2000s and early 2010s were the sweet spot - a lot of room for episodic world and character building that would build to interspersed major episodes for the big changes.
This is not true as a generality. e.g. soap operas had long-running stories long before DVRs. Many prime-time dramas and comedies had major event episodes that changed things dramatically (character deaths, weddings, break-ups, etc.), e.g. the whole "Who shot J.R." event on *Dallas*. Almost all shows that I watched as a kid in the 80s had gradual shifts in character relationships over time (e.g. the on-again/off-again relationship between Sam and Diane on Cheers). Child actors on long-running shows would grow up and the situations on the show changed to account for that as they move from grade school, to high school, to college or jobs.
Sitcoms are - and I know this is a little condescending to point out - comedies contrived to exist in a particular situation: situation comedy → sitcom.
In the old day, the "situation" needed to be relatively relatable and static to allow drop-in viewers channel surfing, or the casual viewer the parent described.
Soap operas and other long-running drama series are built differently: they are meant to have long story arcs that keep people engaged in content over many weeks, months or years. There are throwbacks to old storylines, there are twists and turns to keep you watching, and if you miss an episode you get lost, so you don't ever miss an episode - or the soap adverts within them, their reason for being for which they are named - in case you are now behind with everything.
You'll find sports networks try to build the story arc around games too - create a sense of "missing out" if you don't watch the big game live.
I think the general point is that in the stream subscription era, everything has become like this "don't miss out" form, by doubling down on the need to see everything from the beginning and become a completist.
You can't easily have a comedy show like Cheers or Married... With Children, in 2026, because there's nothing to keep you in the "next episode" loop in the structure, so you end up with comedies with long-running arcs like Schitt's Creek.
The last set of sitcoms that were immune to this were probably of the Brooklyn 99, Cougartown and Modern Family era - there were in-jokes for the devotees, but you could pick up an episode easily mid-series and just dive in and not be totally lost.
Interesting exception: Tim Allen has managed to get recommissioned with an old style format a couple of times, but he's had to make sure he's skewing to an older audience (read: it's a story of Republican guys who love hunting), for any of it to make sense to TV execs.
That is why: slow, gradual changes.
Neither of these could afford a serious, nuanced arc spanning multiple episodes, played out the way current series can.
It's basically daytime TV, to be watched at work, often as background, and without looking at the actual screen very often.
For example, most of the new Star Trek stuff: none of the episodes stand by themselves. They don’t have their own stories.
I think Strange New Worlds walks that balancing act particularly well though. A lot of episodes are their own adventure but you do have character development and an overarching story happening.
TNG: You get e.g. changes in political relationships between major powers in the Alpha/Beta quadrant, several recurring themes (e.g. Ferengi, Q, Borg), and continuous character development. However, this show does a much better job at exploring the Star Trek universe breadth-first, rather than over time.
DS9: Had one of the most epic story arcs in all sci-fi television, one that spanned multiple seasons. In a way, this is IMO the gold standard for how to do this: most episodes were still relatively independent of each other, but the long story arcs were also visible and pushed forward.
VOY: Different to DS9, with one overarching plot (coming home) that got pushed forward most episodes, despite individual episodes being mostly watchable in random order. They've figured out a way to have things have accumulating impact without strong serialization.
> Last season of TNG they introduced the fact that warp was damaging subspace. That fact was forgotten just a few episodes later.
True, plenty of dropped arcs in TNG in particular. But often for the better, like in the "damaging subspace" aspect - that one was easy to explain away (fixing warp engines) and was a bad ecological metaphor anyway; conceptually interesting, but it would hinder subsequent stories more than help.
I wouldn't say they had any noticeable accumulating impact.
Kim was always an ensign, system damage never accumulated without a possibility of repair, they fired 123 of their non-replaceable supply of 38 photon torpedoes, the limited power reserves were quickly forgotten, …
Unless you mean they had a few call-back episodes, pretty much the only long-term changes were the doctor's portable holo-emitter, the Delta Flyer, Seven replacing Kes, and Janeway's various haircuts.
> True, plenty of dropped arcs in TNG in particular. But often for the better, like in the "damaging subspace" aspect - that one was easy to explain away (fixing warp engines) and was a bad metaphor for ecological anyway; conceptually interesting, but would hinder subsequent stories more than help.
That, and per beta-canon this engine fix is why Voyager's warp engines moved.
The Doylist reason is of course "moving bits look cool".
Lots of novel adaptations fall into this category. Most decently dense and serious novels cannot be done justice in 2 hours. The new TV formats have enabled substantial stories to be told well.
The Godfather parts I and II is just one story cut in half in a convenient place. Why not cut it into 4 50 minute eps and an 80 minute finale? (Edit: this substantially underestimates the running time of the first two Godfather movies!)
People are going to pause your thing to go to the toilet anyway. You might as well indicate to them when's a good time to do so.
Obviously there are also quite a few movies where 90 minutes is plenty. Both formats seem needed.
The alternative is the 1980s version of Dune, which tried to fit a massive novel into a single mass-market film runtime. It was fantastic, but people who hadn’t read the novel were left very short on story. The newer movies I’ve heard are much better in this regard, and it’s understandable because the runtime of the combined films is longer. The Dune 2000 (AKA SciFi Presents Frank Herbert’s Dune) miniseries was even better in some ways than the original film, largely for the same reasons.
Ender’s Game deserved to be at least two parts, because even the main character got no real character development. You barely learn Val exists, there’s really no Peter, and you barely meet Bean or Petra. There’s no Alai, Achilles, Fly, or Crazy Tom. There’s no zero-G battles at Battle School. The computer game is never even mentioned but is integral to the book. I don’t think it’s even mentioned in the film that Ender is a third child and why that’s important. It could have been a much better film in two or three parts.
Wake up dead man? I feel like 30-45m could be cut and it'd be good. Why is One Battle after another almost 3 hours?
Is there a competition to try to beat the notoriously long Lord of the Rings Extended edition in runtime?
Movies with a runtime over 3 hours really stood out.
I propose Apathy Opera
That part of it, while I know the reasons given for it, is becoming increasingly annoying/frustrating.
Could be mis-remembering though, when I think about early anthologies like Twilight Zone or Freddy’s Nightmares.
Though I have switched to mostly using Plex, so maybe I could look into doing something there.
Doesn't solve for single units but could help with people who use soundbars or amps
Abandoned though - was basically just multiband compression and couldn't find a way to make it adaptable enough (some media always ended up sucking)
Would be super interested to hear what you tried!
doesn't seem like anyone outside the audience thinks it's a serious problem (?)
(This may also apply to the "everything's too dark" issue which gets attributed to HDR vs. SDR)
Up until fairly recently both of these professions were pretty small, tight-knit, and learnt (at least partially) from previous generations in a kind of apprentice capacity
Now we have vocational schools - which likely do a great job surfacing a bunch of stuff which was obscure, but miss some of the historical learning and "tricks of the trade"
You come out with a bunch of skills but less experience, and then are thrust into the machine and have to churn out work (often with no senior mentorship)
So you get the meme version of the craft: hone the skills of maximising loudness, impact, ear candy.. flashy stuff without substance
...and a massive overuse of the Wilhelm Scream :) [^1]
[^1]: once an in joke for sound people, and kind of a game to obscure its presence. Now it's common knowledge and used everywhere, a wink to the audience rather than a secret wink to other engineers.
https://en.wikipedia.org/wiki/Wilhelm_scream
EDIT: egads, typing on a phone makes it far too easy to accidentally write a wall of text - sorry!
You reminded me of so many TV shows and movies that force me to lower all the roller shutters in my living room - and I've got a very good TV - otherwise I just don't see anything on the screen.
And this is really age-of-content dependent, with recent stuff set in dark environments being borderline impossible to enjoy without being in a very dark room.
The darkness of shows has more to do with the mastering monitors having gotten so good that colorists don’t even notice if the dynamic range is just the bottom half or less. Their eyes adjust and they don’t see the posterisation because there isn’t any… until the signal is compressed and streamed. Not to mention that most viewers aren’t watching the content in a pitch black room on a $40K OLED that’s “special order” from Sony.
I could probably fix over half of the problems I have with TV audio with a decent sound bar, and a good one is a decent percentage of the cost of a brand new TV.
Our need to turn up the volume in dialog scenes and turn it back down again in action scenes (for both new and old content) got a lot less when we added a mid-range soundbar and sub to our mid-range TV (previously was using just the TV speakers). I’m not sure whether it’s sound separation - now we have a ‘more proper’ center channel - or that the ends of the spectrum - both bass and treble - are less muddy. Probably a combination of the two.
Now, we have "satellite" speakers that are smaller than the tweeter and are touted as being all that's necessary. Sound bars are also using speakers the size of an old tweeter, just in an array, with separation between left/right sides smaller than the width of the TV. Somehow, we let the marketing people from places like Bose convince us that you can make the same sound from tiny speakers.
Multichannel mixes used to also include a dedicated stereo mix for those mere mortals without dedicated surround setups. These were created in the same studio with mixing decisions made based on the content. Now, we just get downmixes made by some math equation on a chip that has no concept of what the content is and just applies rules.
I also think the focus on surround means people don’t consider stereo speakers. Good bookshelf speakers are better than surround kits and easier to install. I also wonder if normal speakers are no longer cool.
Finally, I wonder if people don’t like the big receivers. There are small amplifiers, but I can’t find one that works for home theater with an HDMI port.
I think the main problem is that producers and audio people are stupid, pompous wankers. And I guess it doesn't help that some people go to cinema for vibrations and don't care about the content.
Voice comes through the center channel. Music tends to come out of multiple speakers, and so do a lot of explosions and sound effects.
Most people don't have multi-channel setups at home. So you get everything coming out of 2 speakers or a sound bar.
What that means, is that you get 4+ channels of music and sound effects mixed into 2 channels. So it ends up drowning out the single voice channel.
When I play movies through Kodi, I generally go into the audio settings and turn up the center channel. This fixes the issue every time for me.
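For reference, here's roughly why that works, as a small Python sketch of an ITU-style 5.1-to-stereo downmix with the usual -3 dB coefficients (illustrative only; actual players and receivers vary, and the 0.3 signal levels are just made-up numbers):

```python
import numpy as np

def downmix_left(front_left, center, surround_left, center_gain=0.707, surround_gain=0.707):
    """One side of a typical 5.1 -> stereo downmix (LFE is commonly dropped or attenuated)."""
    return front_left + center_gain * center + surround_gain * surround_left

def db(ratio):
    return 20 * np.log10(ratio)

dialog_level, effects_level = 0.3, 0.3   # equally loud in their own channels

dialog_in_mix  = downmix_left(0, dialog_level, 0)               # dialog only rides the centre
effects_in_mix = downmix_left(effects_level, 0, effects_level)  # music/effects ride front + surround
print(round(db(dialog_in_mix / effects_in_mix), 1))             # ~ -7.7 dB: dialog buried

dialog_boosted = downmix_left(0, dialog_level, 0, center_gain=1.414)  # ~ +6 dB centre boost
print(round(db(dialog_boosted / effects_in_mix), 1))                  # ~ -1.6 dB: intelligible again
```

With equal per-channel levels, the lone dialog channel ends up nearly 8 dB under the combined front-plus-surround bed in each ear, and a roughly 6 dB centre boost claws most of that back, which matches the "turn up the center channel" experience.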
I've certainly had the experience of hard to hear dialog but I think (could be wrong) that that's only really happened with listening through the TV speakers. Since I live in an apartment, 99% of the time I'm listening with headphones and haven't noticed that issue in a long time.
I had a 720p Sony Bravia from around 2006 and it was chunky. It had nice large drivers and a big resonance chamber, it absolutely did not need a sound bar and was very capable of filling a room on its own.
Engineering tradeoffs--when you make speakers smaller, you have to sacrifice something else. This applies to both soundbars and the built-in speakers.
Depending on what you're using there could be settings like stereo downmix or voice boost that can help. Or see if the media you're watching lets you pick a stereo track instead of 5.1
Also, in consumer setups with a center channel speaker it is rather common for it to have a botched speaker design and be of a much poorer quality than the front speakers and actually have a deleterious effect to dialog clarity.
Yes, we know how to mix for stereo. But do we still pay attention to how we do it?
Try listening to some mono pink noise on a stereo loudspeaker setup, first hard-panned to a single speaker, and then centered. The effect is especially obvious when you move your head.
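If you want to try that at home, here's a rough Python sketch (pink noise approximated by 1/f-shaping white noise; filenames are just placeholders):

    import numpy as np
    from scipy.io import wavfile

    rate, secs = 48000, 5
    n = rate * secs
    # Approximate pink noise: shape white noise toward 1/f in the frequency domain.
    spectrum = np.fft.rfft(np.random.randn(n))
    freqs = np.fft.rfftfreq(n, 1 / rate)
    pink = np.fft.irfft(spectrum / np.maximum(np.sqrt(freqs), 1.0))
    pink = (0.3 * pink / np.max(np.abs(pink))).astype(np.float32)

    silence = np.zeros_like(pink)
    hard_panned = np.column_stack([pink, silence])          # one speaker only
    phantom_center = 0.707 * np.column_stack([pink, pink])  # both speakers, -3 dB
    wavfile.write("panned.wav", rate, hard_panned)
    wavfile.write("centered.wav", rate, phantom_center)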
It mostly boils down to filmmaker choices:
1. Conscious and purposeful. Like choosing "immersion" instead of "clarity". Yeah, nothing says "immersion" like being forced to put subtitles on...
2. Not purposeful. Don't attribute to malice what can be explained by incompetence... Bad downmixing (from Atmos to lesser formats like 2.0). Even when they do that, they are not using the technology ordinary consumers have. The most glaring example is the way the size of text/titles/credits on screen has been shrinking to the point where they are hard to read. Heck, I often have trouble with text size on my Full HD TV, just because the editing was done on some fancy 4K+ display standing 1 m from the editor. Imagine how garbage it looks on 720p or ordinary 480p!
For a recent example, check the size (and the font) of the movie title in the Alien Isolation movie and compare it to movies made in the '80s and '90s. It's ridiculous!
There are many good YouTube videos that explain the problem in more detail.
It's the obsession with accents, mixed with the native speakers' conviction that vowels are the most important part.
Older movies tended to use a kind of unplaceable ("Mid-Atlantic") accent that could be easily understood.
But modern actors try to imitate accents and almost always focus on the vowels. Most native speakers seem to be convinced that vowels are the most important part of English, but I think it isn't true. Sure, English has a huge number of vowels, but they are almost completely redundant. It's hard to find cases where vowels really matter for comprehension, which is why they may vary so much across accents without impeding communication. So what the actors do is that they focus on the vowels, but slur the consonants, and you are pretty much completely lost without the consonants.
As a native English speaker studying Spanish, my impression is that English cares about the consonants and Spanish is way more about the vowels. YMMV
Second, I'm 55. There ARE programs I turn on the captioning for, but it's not universal at all. Generally, it's things with accents.
We absolutely do not need the captions at our house for STRANGER THINGS.
To be a bit more helpful, what are you using to listen to the show? There are dozens of ways to hear the audio. Are you listening through the TV speakers, a properly set up center channel speaker, a Kindle Fire tablet, or something else? Providing those details would assist us in actually helping you.
I turn on closed captions for most American films, but I find that I rarely need them for British ones.
The US is only at most 30% of Hollywood revenue.
Hope you find some peace with it!
Though I'm not sure why you're personally getting pride out of global corporations making money.
I suspect downmixes to stereo and poor builtin speakers might be heavily contributing to the issue you describe. Anecdotally, I have not encountered this issue after I added a center channel.
Nor do I have any issues with the loudness being inconsistent between scenes. I suspect that might be another thing introduced by downmixing: all the surround channels are "squished" into stereo, making the result louder than it would otherwise have been.
If you're on a PC, there are numerous websites with various VLC "movie" settings to combat this issue. I've tried several with mixed results; I always end up reverting to the defaults at some point, because they work for some movies but not others, and it's horribly annoying to constantly tweak VLC's advanced settings (too many clicks, IMO). The idea is that with VLC you can adjust per-frequency volume to raise the frequencies typical of voices and in turn lower the frequencies typical of action scenes, e.g. explosions.
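Not VLC's actual settings, but a hedged sketch of the underlying idea in Python: gently boost the speech band and shave a little off the low end where explosions live.

    import numpy as np
    from scipy import signal

    def dialog_boost(audio, rate, boost_db=6.0, cut_db=4.0):
        # Boost roughly 1-4 kHz, where most speech intelligibility lives,
        # and attenuate the sub-200 Hz rumble of action scenes a bit.
        speech = signal.butter(2, [1000, 4000], btype="bandpass", fs=rate, output="sos")
        lows   = signal.butter(2, 200, btype="lowpass", fs=rate, output="sos")
        out = audio + (10 ** (boost_db / 20) - 1) * signal.sosfilt(speech, audio)
        out -= (1 - 10 ** (-cut_db / 20)) * signal.sosfilt(lows, audio)
        return out / max(np.max(np.abs(out)), 1.0)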
Also, as a digital video expert I will allow you to leave motion smoothing on.
That's what's so good about it. They say turning it off respects the artists or something, but when I read that I think "so I'm supposed to be respecting Harvey Weinstein and John Lasseter?" and it makes me want to leave it on.
> black frame insertion is to lower even more the pixel persistence which really does nothing for 24fps content which already has a smooth blur built in to the image
That's not necessarily true unless you know to set it to the right mode for different content each time. There are also some movies without proper motion blur, eg animation.
Or, uh, The Hobbit, which I only saw in theaters so maybe they added it for home release.
> the best is setting your tv to 120hz so that your 24fps fits evenly and you don't get 3:2 pulldown judder
That's not really a TV mode, it's more about the thing on the other side of the TV I think, but yes you do want that or VFR.
I don't care about the "filmmaker's intent", because it is my TV. I will enable whatever settings look best to me.
Yes, I usually run ad blockers, Pi-hole, etc.; I'm away from home and temporarily without my filters.
> Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director’s vision.
Turning it off was shocking. So much better. And it was buried several levels deep in a weirdly named setting.
I like how it looks because it has that "high-quality video game" effect for me. 60 Hz, 120 Hz, 144 Hz: you only get this on a good video game setup.
I'm an avid video game player, and while FPS and sports-adjacent games demand high framerates, I'm perfectly happy turning my render rates down to 40Hz or 30Hz on many games simply to conserve power. I generally prefer my own brain's antialiasing, I guess.
Edit: to clarify, I'm suggesting that some people might prefer to let their brains "fill in the missing frames" rather than see the extra frames shown explicitly. For example, you might be more likely to notice visual tearing at 60Hz than you are to take note of visual tearing at 24Hz when you're already accustomed to filling in the missing pieces, or to a greater extreme, across two panels of a comic strip portraying motion.
...that being said, motion interpolation is an abomination
Real artists understand the limits of the medium they're working in and shape their creations to exist within it. So even if there was no artistic or technical merit in the choice to standardize on 24 FPS, the choice to standardize on 24 FPS shaped the media that came after it. Which means it's gained merit it didn't have when it was initially standardized upon.
>before Scorsese made The Godfather
Can you let me in on the joke?
The director was of course Francis Ford Coppola.
He had been told to slow down because 24hz simply could not capture his fast movements.
At 144hz, we would be able to better appreciate his abilities.
If you watch at a higher frame rate, the mistakes become obvious rather than melting into the frames. Humans look plastic and fake.
The people that are masters of light and photography make intentional choices for a reason.
You can cook your steak well done if you like, but that's not how you're supposed to eat it.
A steak is not a burger. A movie is not a sports event or video game.
Did you read an interview with the cow’s creator?
What next, gonna complain resolution is too high and you can see costume seams ?
The film IS the burger, you said it yourself: it shows off where the movie cheaped out on things. If you want a steak, you need a steak framerate.
I'm a filmmaker. Yes, it was.
> What next, gonna complain resolution is too high and you can see costume seams ?
Try playing an SNES game on CRT versus with pixel upscaling.
The art direction was chosen for the technology.
https://www.youtube.com/shorts/jh2ssirC1oQ
> The film IS the burger, you said it yourself: it shows off where the movie cheaped out on things. If you want a steak, you need a steak framerate.
You don't need 48fps to make a good film. You don't need a big budget either.
If you want to take a piece of art and have it look garish, you do you.
You can see cheap set decoration at 48 fps. It disappears at 24 fps.
>I'm a filmmaker. Yes, it was.
What you are is dishonest. Quote my entire sentence; don't cut it in half and change its entire meaning.
> The choice wasn't intentional, it was forced by technology and in turn, methods were molded by technological limitation.
There was no choice, unless you think "just make it look bad by ignoring tech limitations" is a realistic choice for someone actually being paid for their job.
>> What next, gonna complain resolution is too high and you can see costume seams ?
>Try playing an SNES game on CRT versus with pixel upscaling.
>The art direction was chosen for the technology.
There was no choice involved. You had to do it because that was what tech required from you for it to look good.
The technology changed, so art direction changed with it. Why can't the movie industry keep up, when the gaming industry has had a dozen revolutions like this?
> You don't need 48fps to make a good film. You don't need a big budget either.
But you can take it and make it better.
> If you want to take a piece of art and have it look garish, you do you.
"Don't have budget to double the framerate" is fair argument. Why you don't use that instead of assuming anything made in better tech will be "garish" ?
Your argument is essentially saying "I don't have enough skill to use new tech and still make it look great"
I was being civil, but you're taking this too far. I was wary of engaging with your first comment given the bombastic tone, but I thought you might appreciate my domain experience. I disagree with everything you're saying, but I am not going to engage with you further.
And it's time for the art direction of films to take advantage of modern technology, just like we have games made for HD resolutions today - including ones that are made to evoke the feel of older systems while smoothing off the rough edges.
> You don't need 48fps to make a good film. You don't need a big budget either.
And you don't need HD resolutions either, but they do make it look even better - and so do high frame rates when the production is up to it.
They literally had to invent new types of makeup because HD provided more skin detail than was previously available.
It’s why you’ll find a lot of foundation marketed as “HD cream”.
The technical limitations of the past century should not define what constitutes a film.
Sorry for being snarky. It's just that I have large difficulties enjoying 24 fps pan shots and action scenes. It's like watching a slide show to me. I'm rather annoyed that the tech hasn't made any progress in this regard, because viewers and makers want to cling on to the magic/dream-like experiences they had in their formative years.
I haven't thought about or noticed in nearly two decades
My eyes 100% adjusted, I like higher frame and refresh rates now
I can't believe the industry just repeated a line about how magical 24 fps feels for ages and nobody questioned it, until they magically had enough storage and equipment to abandon it. What a coincidence.
I would vastly prefer original material at high frame rates instead of interpolation.
But I remember the backlash against “The Hobbit: An Unexpected Journey” because it was filmed at 48 fps, and that makes me think that people dislike high frame rate content no matter the source, so my comment also covers those cases.
Also, because of that public response, we don't have more content actually filmed at high frame rates =)
Every PC gamer knows you need high frame rates for camera movement. It's ridiculous the movie industry is stuck at 24 like it's the stone age, only because of some boomers screaming of some "soap opera" effect they invented in their brains. I'd imagine most Gen Z people don't even know what a "soap opera" is supposed to be, I had to look it up the first time I saw someone say it.
My LG OLED G5 literally provides a better experience than going to the cinema, due to this.
I'm so glad 4k60 is being established as the standard on YouTube, where I watch most of my content now... it's just movies that are inexplicably stuck in the past...
Obviously not, because generations of people saw "movement" at 24 fps. You're railing against other people's preferences, but presenting your personal preferences as fact.
Also, there are technical limitations in cameras that aren't present in video games. The higher the frame rate, the less light that hits it. To compensate, not only do you need better sensors, but you probably need to change the entire way that sets, costumes, and lighting are handled.
The shift to higher frame rates will happen, but it's gonna require massive investment to shift an entire industry and time to learn what looks good. Cinematographers have higher standards than random Youtubers.
It is a fact that motion is smoother at 120 fps than 24, and therefore easier to follow on screen. There are no preferences involved.
> Also, there are technical limitations in cameras that aren't present in video games.
Cameras capable of recording high quality footage at this refresh rate already exist and their cost is not meaningful compared to the full budget of a movie (and you can use it more than one time of course).
Yes, but that's not what you wrote. "unwatchable judder that I can't even interpret as motion sometimes" is false, unless you have some super-rare motion processing disorder in area MT of your brain.
> Cameras capable of recording high quality footage at this refresh rate already exist and their cost is not meaningful compared to the full budget of a movie
Yes, but that's not what I wrote. The cost to handle it is not concentrated in the camera itself. Reread my comment.
For those unfamiliar with the term you should watch Vincent Teoh @ HDTVTest:
https://www.youtube.com/hdtvtest
Creative intent refers to the goal of displaying content on a TV precisely as the original director or colorist intended it to be seen in the studio or cinema.
A lot of work is put into this and the fact that many TVs nowadays come with terrible default settings doesn't help.
We have a whole generation who actually prefer the colors all maxed out with motion smoothing etc. turned to 11 but that's like handing the Mona Lisa to some rando down the street to improve it with crayons.
At the end of the day it's disrespectful to the creator and the artwork itself.
Then the release of the next three were just so much worse. More of the same bad stuff, but now they're rewriting the bad guys, good guys, and world setting. Major characters have fallen to the wayside, other side characters are now main. Tons of stupid, "I'm angry at you" followed by splitting up while being hunted by monsters, or hashing out some grievance while being chased by monsters in an end of the world scenario.
The cast is not awkwardly cute anymore. They are full grown adults playing children. I can forgive this as I know the difficulties in getting a production together. But it does make the petty squabbles -- which are constant -- more unbearable.
I'll watch the last episodes as they come out this week, but I have low hopes. It would be nice to see something actually wrapped up nicely, even if the show is stumbling to the finish. So help me though, if they win through the power of friendship and love...
I was going to write a lot more but I just want to not think about this anymore, I'm so disappointed. It seems like the best we can wish for from Netflix shows is that they get cancelled before they turn into whatever this became...
It was 10 stars before it was even released... Are humans still needed at all? Just have LLMs generate crappy content and bots upvote it.
The "soap opera" feel is NOT from bad interpolation that can somehow be done right. It's inherent from the high frame rate. It has nothing to do with "video cameras", and a lot to do with being simply too real, like watching a scene through a window. There's no magic in it.
Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by technical constraints of the time when films added sound, was one of happiest accidents in the history of Arts.
That can be a factor, but I think this effect can be so jarring that many would realize that there's a technical problem behind it.
For me 24 fps is usually just fine, but then if I find myself tracking something with my eyes that wasn't intended to be tracked, then it can look jumpy/snappy. Like watching fast flowing end credits but instead of following the text, keeping the eyes fixed at some point.
> Films are more like dreams than like real life. That frame rate is essential to them, and its choice, driven by technical constraints of the time when films added sound, was one of happiest accidents in the history of Arts.
I wonder though, had the industry started with 60 fps, would people now applaud the 24/30 fps as a nice dream-like effect everyone should incorporate into movies and series alike?
Real is good, it’s ergonomic and accessible. Until filmmakers understand that, I’ll have to keep interpolation on at the lowest setting.
Yes! The other happy accident of movies that contribute to the dream-like quality, besides the lower frame rate, is the edit. As Walter Murch says in "In the Blink of an Eye", we don't object to jumps in time or location when we watch a film. As humans we understand what has happened, despite such a thing being impossible in reality. The only time we ever experience jumps in time and location is when we dream.
I would go further and say that a really good film, well edited, induces a dreamlike state in the viewer.
And going even further than that, a popular film being viewed by thousands of people at once is as though those people are dreaming the same dream.
I remember when I was very little that it was actually somewhat “confusing”, or at least quite taxing mentally, and I’m pretty sure I see this in my own very little children.
As we grow and “practice” watching plays, TV, movies, read books, our brains adapts and we become completely used to it.
https://en.wikipedia.org/wiki/Kuleshov_effect#:~:text=The%20...
You've just learned to associate good films with this shitty framerate. Also, most established film makers have yet to learn (but probably never will) how to make stuff look good on high frames. It's less forgiving.
It'll probably take the next generation of viewers and directors..
I think the criticisms of The Hobbit when it came out in 48fps showed that it's not just that.
I don’t see how it’s the case for frame rate, except perhaps for CGI (which has also improved).
I think just like with games, there’s an initial surprised reaction; so many console-only gamers insisted they can’t see the point of 60 fps. And just like with games, it only takes a little exposure to get over that and begin preferring it.
> Films are more like dreams than like real life. That frame rate is essential to them
Complete bullshit.
"Duffer’s advice highlights a conflict between technological advances and creators' goals. Features like the ones he mentioned are designed to appeal to casual viewers by making images appear sharper or more colorful, but they alter the original look of the content. By asking fans to turn these features off, he is stressing the importance of preserving the director’s vision."
To be fair, "vivid" mode on my old Panasonic plasma was actually an impressive option compared to how an LCD would typically implement it. It didn't change the color profile. It mostly changed how much wattage the panel was allowed to consume. Upward of 800w IIRC. I called it "light cannon" mode. In a dark room, it makes black levels look like they have their own gravitational field despite being fairly bright in absolute terms.
The interesting thing when turning on filmmaker mode is the feeling of too-warm and dark colors. That goes away once your eyes get used to it. But it then lets the image pop when it’s meant to pop, etc. I also turned off the auto brightness [2] feature that is supposed to guard the panel from burn-in but just fails in prolonged dark scenes like in Netflix's Ozark.
[1] https://youtu.be/uGFt746TJu0?si=iCOVk3_3FCUAX-ye [2] https://youtu.be/E5qXj-vpX5Q?si=HkGXFQPyo6aN7T72
Settings that make the image look less like how the material is supposed to look are not "advances".
Q: So why do manufacturers create them?
A: They sell TVs.
Assume that every manufacturer out there is equally capable of creating a screen that faithfully reproduces the content to the best ability of current technology. If every manufacturer does just that, then their screens will all look extremely similar.
If your TV looks like everybody else's, how do you get people strolling through an electronics store to say, "Wow! I want that one!"? You add gimmicky settings that make the image look super-saturated, bizarrely smooth, free of grain, etc. - settings that make the image look less like the source, but which grab eyes in a store. You make those settings the default too, so that people don't feel ripped off when they take the TV out of the store.
If you take a typical TV set home and don't change the settings from default, you're typically not going to see a very faithful reproduction of the director's vision. You're seeing what somebody thought would make that screen sell well in a store. If you go to the trouble of setting your screen up properly, your initial reaction may be that it looks worse. However, once you get used to it, you'll probably find the resulting image to be more natural, more enjoyable, and easier on the eyes.
That basically isn’t true. Or rather, there are real engineering tradeoffs required to make an actual consumer good that has to be manufactured and sold at some price. And, especially considering that TVs exist at different price points, there are going to be different tradeoffs made.
I know I'm pretty unsophisticated when it comes to stuff like art, but I've never been able to appreciate takes like this. If I'm watching something on my own time from the comfort of my home, I don't really care what the filmmaker thinks if it's different from what I want to see. Maybe he's just trying to speak to the people who do care about seeing his exact vision, but his phrasing is so exaggeratedly negative about these settings that it seems like he genuinely thinks what he's saying applies universally. Honestly, I'd have a pretty similar opinion even for art outside of my home. If someone told me I was looking at the Mona Lisa wrong because it's "not what the artist intended", I'd probably laugh at them. It doesn't really seem like you're doing a good job as an artist if you have to give people instructions on how to look at it.
That's arguably a thing, due to centuries of aged and yellowed varnish.
You can watch whatever you want however you want, but it's entirely reasonable for the creator of art to give tips on how to view it the way it was intended. If you'd prefer that it look like a hybrid-cartoon Teletubby episode, then I say go for it.
When walking past a high-end TV I've honestly confused a billion-dollar movie for a teen weekend project because of this. It's only when I think "hang on, why is Famous Actor in this?" that I realize, oh, this is a Marvel movie.
To me it's as if people who don't see it are saying "oh, I didn't even realise I'd set the TV to black and white".
This is not high art. It's... well... the soap opera effect.
Similar is the case for sound and (to a much lesser extent) contrast.
Viewers need to be able to see and hear in comfort.
This is more comparable to color being turned off. Sure, if you're completely colorblind, then it's not an issue. But non-colorblind people are not "snobs".
Or if dialog is completely unintelligible. That's not a problem for people who don't speak the language anyway, and would need subtitles either way. But people who speak English are not "snobs" for wanting to be able to understand dialog spoken in English.
I've not seen a movie filmed and played back in high frame rate. It may be perfectly fine (for me). In that case it's not about the framerate, but about the botched interpolation.
Like I said in my previous comment, it's not about "art".
People like you insisting on 24 fps causes people like me to unnecessarily have to choose between not seeing films, seeing them with headaches or seeing them with some interpolation.
I will generally choose the latter until everything is at a decent frame rate.
What has been asserted without evidence can be dismissed without evidence.
I'll take the Pepsi challenge on this any day. It looks horrible.
> Good quality sets and makeup and cameras look good at 24 or 48 or 120 fps.
Can you give an example of ANY movie that survives TV motion interpolation settings? Billion dollar movies by this definition don't have good quality sets and makeup.
E.g. MCU movies are unwatchable in this mode.
> People like you insisting on 24 fps
I don't. Maybe it'll look good if filmed at 120fps. But I have seen no TV that does this interpolation where it doesn't look like complete shit. No movie on no TV.
Edit: I feel like you're being dishonest by claiming that I insist on 24 fps. My previous comment said exactly that I don't, already, and yet you misrepresent me in your very reply.
> causes people like me to unnecessarily [… or …] seeing them with some interpolation
So you DO agree that the interpolation looks absolutely awful? Exactly this is the soap opera effect.
I know that some people can't see it. Lucky you. I don't know what's wrong with your perception, but you cannot simply claim that "there's no such thing" when it's a well known phenomenon that is easily reproducible.
I've come to friends houses and as soon as the TV comes on I go "eeew! Why have you not turned off motion interpolation?". I have not once been wrong.
"There's no such thing"… really… who am I going to believe? You, or my own eyes? I feel like a color blind person just told me "there's no such thing as green".
The “soap opera effect” is what people call video at higher than 24 fps in general, it has nothing to do with interpolation. The term has been used for decades before interpolation even existed. You seem to be confused on that point.
Source video at 120 looks no worse than at 24, that’s all I’m saying.
Please stop repeatedly misrepresenting what I said. This is not reddit.
I have repeatedly said that this is about the interpolation, and that I'm NOT judging things actually filmed at higher framerates, as I don't have experience with that.
> Source video at 120 looks no worse than at 24, that’s all I’m saying.
Again, give me an example. An example that is not video games, because that is not "filmed".
You are asserting that there's no such thing as something that's trivially and consistently repeatable, so forgive me for not taking you at your word that a 120fps filmed movie is free of soap opera effect. Especially with your other lying.
So actually, please take your misrepresentations and ad hominems to reddit.
Edit: one thing that looks much better with motion interpolation is panning shots. But it's still not worth it.
I don’t see what I lied about or what Reddit has to do with anything. I will definitely stop replying to someone so needlessly aggressive.
Earlier video cameras exposed the pixels differently, sampling the image field in the same linear fashion in which it was scanned on a CRT during broadcast. In the US this was also an interlaced scanning format. This changes the way motion is reproduced. Film will tend to have a global motion blur for everything moving rapidly in the frame, whereas video could have sharper borders on moving objects but other distortions depending on the direction of motion, since different parts of the object were sampled at different times.
Modern digital sensors are somewhere in between, with enough dynamic range to allow more film-like or video-like response via post-processing. Some are still rolling shutters that are a bit like traditional video scanning, while others are full-field sensors and use a global shutter more like film.
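A toy simulation of that sampling difference (illustrative numbers only, not any real camera's readout timing), showing how a rolling shutter skews a fast-moving object while a global shutter keeps it upright:

    import numpy as np

    HEIGHT, WIDTH = 480, 640

    def scene(t, speed_px_per_s=2000):
        # A vertical bar sweeping horizontally across the frame.
        img = np.zeros((HEIGHT, WIDTH))
        x = int(100 + speed_px_per_s * t) % WIDTH
        img[:, x:x + 20] = 1.0
        return img

    def rolling_shutter(frame_at, readout_time=1 / 60):
        # Each scanline is exposed slightly later than the one above it,
        # so the bar comes out slanted; a global shutter samples all lines
        # at the same instant and just motion-blurs instead.
        img = np.zeros((HEIGHT, WIDTH))
        for row in range(HEIGHT):
            img[row] = frame_at(row / HEIGHT * readout_time)[row]
        return img

    skewed   = rolling_shutter(scene)  # slanted bar
    upright  = scene(0.0)              # what a global shutter would capture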
As I understand it, modern digital sensors also allow more freedom to play with aperture and exposure compared to film. You can get surprising combinations of lighting, motion blur, and depth of field that were just never feasible with film due to the limited sensitivity and dynamic range.
There are also culturally associated production differences. E.g. different script, set, costume, makeup, and lighting standards for the typical high-throughput TV productions versus the more elaborate movie production. Whether using video or film, a production could exhibit more "cinematic" vs "sitcom" vs "soapy" values.
For some, the 24 fps rate of cinema provides a kind of dreamy abstraction. I think of it almost like a vague transition area between real motion and a visual storyboard. The mind is able to interpolate a richer world in the imagination. But the mature techniques also rely on this. I wonder whether future artists will figure out how to get the same range of expression out of high frame rate video or whether it really depends on the viewer getting this decimated input to their eyes...
But preferring high frame rate is common, as evidenced by games and the many people who use TV interpolation features.
I hear you, artists (and fans) are frequently overly dogmatic on how their work should be consumed but, well, that strikes me as part-and-parcel of the instinct that drives them to sink hundreds or thousands of hours into developing a niche skill that lets them express an idea by creating something beautiful for the rest of us to enjoy. If they didn't care so much about getting it right, the work would probably be less polished and less compelling, so I'm happy to let them be a bit irritating since they dedicated their life to making something nice for me and the rest of us, even if it was for themselves.
Up to you whether or not this applies to this or any other particular creator, but it feels appropriate to me for artists to be annoying about how their work should be enjoyed in the same way it's appropriate for programmers to be annoying about how software should be developed and used: everyone's necessarily more passionate and opinionated about their domain and their work, that's why they're better at it than me even if individual opinions aren't universally strictly right!
I wouldn't call it a "technological advance" to make even the biggest blockbuster look like it was filmed with a 90s camcorder with cardboard sets.
Truemotion and friends are indeed garbage, and I don't understand how people can leave it on.
It seems they want to make these settings usable without specialist knowledge, but the end result of their opaque naming and vague descriptions is that anybody who actually cares about what they see and thinks they might benefit from some of the features has to either systematically try every possible combination of options or teach themselves video engineering and try to figure out for themselves what each one actually does.
This isn't unique to TVs. It's amazing really how much effort a company will put into adding a feature to a product only to completely negate any value it might have by assuming any attempt at clearly documenting it, even if buried deep in a manual, will cause their customers' brains to explode.
I’m with you personally, but the companies that sell TVs are not.
It wasn't that long ago that the manual spelled everything out in enough detail that a kid could understand it, absorb it, decide to dive in on his own, and end up in the industry. I wouldn't have broken or created nearly as much without it.
But, a few things challenged the norm. For many, many reasons, manuals became less about the specification and more about the functionality. Then they became even more simplified, because of the need to translate it into thirty different languages automatically. And even smaller, to discourage people from blaming the company rather than themselves, by never admitting anything in the manual.
What I would do for a return to fault repair guides [0].
[0] https://archive.org/details/olivetti-linea-98-service-manual...
Updates are a mixed bag.
A punch card machine certainly requires specs, and would not be confused with a toy.
A server rack, same, but the manuals are pieced out and specific, with details being lost.
You’ll notice anything with dangerous implications naturally wards off tampering near natively.
Desktop and laptop computers sit somewhere in between, depending on sharp edges and design language, and whether they use a touch screen. Almost kids' toys; the manual is now collective common sense for most.
Tablet, colorful case, basically a toy. Ask how many people using one can write bit transition diagrams for or/and, let alone xor.
We’ve drifted far away from where we started. Part of me feels like the youth are losing their childhoods earlier and earlier as our technology becomes easier to use. Being cynical of course.
It would also help if there was a common, universal, perfect "reference TV" to aim for (or multiple such references for different use cases), with the job of the TV being to approximate this reference as closely as possible.
Alas, much like documenting the features, this would turn TVs into commodities, which is what consumers want, but TV vendors very much don't.
(Still cheaper than a Netflix subscription though.)
Walmart might be able to resell a damaged/open box $2k TV at a discount, but I don’t think that’s so easy for speciality calibrated equipment.
https://news.ycombinator.com/item?id=37218711
It was a real eye(ear?)-opener to watch Seinfeld on Netflix and suddenly have no problem understanding what they're saying. They solved the problem before, they just ... unsolved it.
Watch An American Werewolf in London, Strange Days, True Lies, Blade Runner, or any other movie from the film era all the way up to the start of digital, and you can see that the sets are incredibly well lit. On film they couldn't afford to reshoot and didn't have an immediate view of how everything in the frame turned out, so they had to be conservative. They didn't have per-pixel brightness manipulation (dodging and burning were film techniques that could technically have been applied per frame, but good luck doing that at any reasonable expense or in any reasonable amount of time). They didn't have hyper-fast color film stock they could use (ISO 800 was about the fastest you could get, and it was a clear downgrade from anything slower).
The advent of digital film-making, when sensors reached ISO 1600/3200 with reasonable image quality, is when the allure of the time/cost savings of not lighting heavily for every scene reared its ugly head, and by the 2020s you get the "Netflix look" from studios optimizing for "the cheapest possible thing we can get out the door" (the most expensive thing in any production is filming on location; a producer will want to squeeze every minute of that away, with the smallest crew they can get away with).
This is all out there -- but consumers DO NOT want it, because in a back-to-back comparison, they believe they want (as you'll see in other messages in this thread) displays that are over-bright, over-blue, over-saturated, and over-contrasty. And so that's what they get.
But if you want a perfect reference TV, that's what Filmmaker Mode is for, if you've got a TV maker that's even trying.
Sony specifically targets the reference with their final calibration on their top TVs, assuming you are in Cinema or Dolby Vision mode, or whatever they call it this year.
On my TCL TV I can turn off "smart" image and a bunch of other crap, and there's a "standard" image mode. But I'm not convinced that's actually "as close to reference as the panel can get". One reason is that there is noticeable input lag when connected to a pc, whereas if I switch it to "pc", the lag is basically gone, but the image looks different. So I have no idea which is the "standard" one.
Ironically, when I first turned it on, all the "smart" things were off.
Good to know there seems to be an effort to keep some consistency.
(The trickiest thing is actually brightness. LG originally used to set brightness to 100 nits in Filmmaker Mode for SDR, which is correct dark room behavior -- but a lot of people aren't in dark rooms and want brighter screens, so they changed it to be significantly brighter. Defensible, but it now means that if you are in a dark room, you have to look up which brightness level is close to 100 nits.)
It makes me wish that there was something like an industry standard 'calibrated' mode that everyone could target - let all the other garbage features be a divergence from that. Hell, there probably is, but they'd never suggest a consumer use that and not all of their value-add tackey DSP.
“Filmmaker Mode” on LG OLED was horrible. Yes, all of the “extra” features were off, but it was overly warm and unbalanced as hell. I either don’t understand “Filmmakers” or that mode is intended to be so bad that you will need to fix it yourself.
TV manufacturers always ship their sets with the color temperature set way higher (bluer) by default, because blue tones show off colors better.
As a result of both that familiarity and the better saturation, most people don't like filmmaker when they try to use it at first. After a few weeks, though, you'll be wondering why you ever liked the oversaturated neons and severely off brightness curve of other modes.
Or not, do whatever you want, it's your TV!
When you say that "HDR is static" you probably mean that "Dynamic tone-mapping" was turned off. This is also correct behavior. Dynamic tone-mapping isn't about using content settings to do per-scene tone-mapping (that's HDR10+ or Dolby Vision, though Samsung doesn't support the latter), it's about just yoloing the image to be brighter and more vivid than it should be rather than sticking to the accurate rendering.
What you're discovering here is that the reason TV makers put these "garbage features" in is that a lot of people like a TV picture that's too vivid, too blue, too bright. If you set it to the true standard settings, people's first impression is that it looks bad, as yours was. (But if you live with it for a while, it'll quickly start to look good, and then when you look at a blown-out picture, it'll look gross.)
I'd suggest living with it for a while; if you do, you'll quickly get used to it, and then going to the "standard" (sic) setting will look too blue.
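For the curious, a rough sketch (my own toy curves, not any vendor's actual processing) of the difference between a static tone map that tracks the master and a "dynamic"/vivid-style boost:

    import numpy as np

    def static_tone_map(nits, display_peak=800.0, knee=0.75):
        # Track the mastered luminance 1:1 up to a knee, then roll off
        # smoothly so highlights above the panel's peak aren't clipped.
        nits = np.asarray(nits, dtype=float)
        k = knee * display_peak
        rolloff = k + (display_peak - k) * (1 - np.exp(-(nits - k) / (display_peak - k)))
        return np.where(nits <= k, nits, rolloff)

    def vivid_boost(nits, gain=1.4, display_peak=800.0):
        # The "looks great in the store" approach: push everything brighter
        # and clip at the panel's peak. Punchy, but no longer accurate.
        return np.minimum(np.asarray(nits, dtype=float) * gain, display_peak)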
What is the correct colour?
Well, for starters, you're viewing the real sky in 3D and your TV is a 2D medium. That alone immediately and drastically changes your perception. TV looks like TV no matter what.
It is not as clear cut as you think and is very much a gradient. I could send 10 different color gradings of the sky and grass to 10 different people and they could all say it looks “natural” to them, or a few would say it looks “off,” because our expectations of “natural” looks are not informed by any sort of objective rubric. Naturally if everyone says it’s off the common denominator is likely the colorist, but aside from that, the above generally holds. It’s why color grading with proper scopes and such is so important. You’re doing your best to meet the expectation for as many people as possible knowing that they will be looking on different devices, have different ideas of what a proper color is, are in different environments, etc. and ultimately you will still disappoint some folks. There are so many hardware factors at play stacked on top of an individual’s own expectations.
Even the color of the room you’re in or the color/intensity of the light in your peripheral vision will heavily influence how you perceive a color that is directly in front of you. Even if you walk around with a proper color reference chart checking everything it’s just always going to have a subjective element because you have your own opinion of what constitutes green grass.
There is no such thing as the “correct” or “most natural” image. There is essentially no “true” image.
Game mode will indeed likely turn off any expensive latency-introducing processing but it's unlikely to provide the best color accuracy.
I also auditioned the LG G5.
I calibrated both of them. It is not that much effort after some research on avsforum.com. I think this task would be fairly trivial for the hackernews crowd.
They want a spy device in your house, recording and sending screenshots and audio clips to their servers, providing hooks into every piece of media you consume, allowing them a detailed profile of you and your household. By purchasing the device, you're agreeing to waiving any and all expectations of privacy.
Your best bet is to get a projector, or spend thousands to get an actual dumb display. TVs are a lost cause - they've discovered how to exploit users and there's no going back.
Unfortunately settings won't help Season 5 be any better, it verges on being garbage itself, a profound drop in quality compared to previous seasons.
"After battling Skynet her whole life, Sarah Connor has vowed to even the playing field... no matter what the cost. Coming soon in Terminator: Hawkins!"
Is it just me, or does this article's last paragraph feel particularly AI-generated?
Whether the author used AI or not isn't my main gripe -- it's just that certain wording (like this) won't be free from scrutiny in my head anymore :(
TVs should not try to be anything more than a large monitor.
The tone felt considerably different: constant action, little real plot, character interaction felt a shallow reflection of prior seasons, exposition rather than foreshadowing and development. I was cringing during the “Tom, Dick and Harry” section. From body language, the actors seemed to feel the same way.
https://www.indiewire.com/features/general/christopher-nolan...
Are movies produced in this colour space? No idea. But they all look great in sRGB.
A work colleague got himself a 40" HD TV as a big computer monitor, a few years ago. I was shocked at the overamped colour and contrast. I went through all the settings, and with everything turned to minimum - every colour saturation slider, everything that could be found - it was almost realistic, but still garish compared to sRGB.
But that's what people want, right? Overamped everything is how those demo loops at Costco are set up, that's what sells, that's what people want in their living rooms, right?
I just want colors accurate to the artist's intent, and a brightness knob. No other image "enhancement" features.
Last time I heard this reasoning about bad TV settings was during the infamous GoT episode that was very dark.
Producers generally don't warn about TV settings preemptively, so this makes me a bit concerned.
Stranger Things has already been facing complaints about S5 lately; viewing issues with the finale would be the cherry on top.
Also, this is probably just because I’m old, but a lot of recent TV seems inadequately lit, where you can just barely see part of one character’s face in some scenes. It’s like they watched Apocalypse Now and decided everything should be the scene with Marlon Brando in the cave.
(How did we decide on it if the defaults are terrible? A neighbor bought the same one on sale and figured it out ahead of me.)
The guy should just advocate for people to buy $2k+ TVs instead of $200 ones.
Movies have dark scenes nowadays mainly because it is a trend. On top of that dark scenes can have practical advantages (set building, VFX, lighting, etc. can be reduced or become much simpler to do which directly translates into money saved during shooting).
If I had to guess, the trend of dark scenes is a direct result of the fact that in the past two decades digital sensors got good enough to actually shoot in such low-light environments.
I use a Playstation 5 for everything including Netflix, Apple TV and so on. But every time I turn on the PS5, my TV detects the Playstation and automatically changes the TV's Sound and Video modes to "Gaming", which makes dialog difficult to hear on TV. So I change the setting manually using its horrible remote control, only for it to change back to Gaming the next time I use it.
I think I could get a proper aftermarket Samsung remote for their older models with 100 buttons and not have to use the menus as much.
Something that recently changed my viewpoint a little bit was that I was noticing that 24-30 fps content was appearing very choppy. I couldn't figure out why it looked like that. It turns out it's because modern OLED TVs can switch frames very cleanly and rapidly, CRTs or older LCDs were not like that, and their relative slowness in switching frames created a smoothing or blending effect.
Now I'm considering turning back on my TVs motion smoothing. I'm just hoping it doesn't do full-blown frame interpolation that makes everything look like a Mexican soap opera.
Unfortunately this is another basic feature that tends to be "branded" on TVs. On my Sony Bravia it's split into a combination of features called Cinemotion and Motionflow.
3:2 pulldown (or other telecine patterns) is what was used to go from 24 FPS film to 30 FPS interlaced NTSC video. Your TV or video player needs to undo that (going back to the original 24 FPS) in order to fix a judder every 5 frames. But that is not going to fix the inherent choppiness of fast camera movements with 24 FPS film, and it is also not relevant for most modern content, because it is no longer limited to NTSC and can instead give you the original 24 FPS directly.
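A tiny back-of-the-envelope illustration (assuming the display simply repeats frames) of why 24 fps divides evenly into 120 Hz but not 60 Hz:

    from math import floor

    def cadence(film_fps=24, display_hz=60):
        # How many refreshes each successive film frame is held for.
        edges = [floor(i * display_hz / film_fps) for i in range(film_fps + 1)]
        return [b - a for a, b in zip(edges, edges[1:])]

    print(cadence(24, 60))   # [2, 3, 2, 3, ...] -> uneven hold times (pulldown judder)
    print(cadence(24, 120))  # [5, 5, 5, 5, ...] -> every frame held equally, no judder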
Something like this:
I think you just find one with the same output specs as your PS5.
I agree with you though. We have a Sony Bravia purchased back in 2016 for $900. It has the Android TV spam/bloat/spyware, but it's not used and never connects to the Internet, which has kept the TV quite usable over the years. Apple TV is connected, Sonos too, and everything works fine without any crazy settings changes. I'm not looking forward to whenever this thing needs to be replaced (which will likely be it actually breaking rather than being outdated).
What you describe about it being hard to hear dialog is exactly what I'd expect from someone who has their TV turned down as a result of using the score/soundtrack and loud sound effects as a reference point, which consequently is too low a volume to hear the dialog.
I wouldn't be surprised if you're actually experiencing what your TV's processing turned off is like and sound balancing is actually what you (as in you, personally) _want_ it to be doing.
Not on my Samsung OLED - there is an effect that boosts the brightness of dark scenes (turning completely black screens into a gray smudge) which cannot be turned off completely.
I have an NEC MultiSync, which is a banger. It's also designed for a 24/7 duty cycle, so it's less likely to burn out. It also goes brighter than normal TVs.
However I don't think they do OLED yet. I think you're stuck with LG.
Turns out it does not even care if I set lower resolution in Xbox display settings. So I had to just disable game master mode. And I don't miss anything.
Let people enjoy things. If you don’t even watch TV yourself, it shouldn’t bother you how other people enjoy their own TV. If someone enjoys frame interpolation for their private watching, so what?
To be fair, 24p is crap. You know and agree with that, right? Horizontal pans in 24p are just completely unwatchable garbage, verticals aren’t that much better, action sequences in 24p suck, and I somehow didn’t fully realize this until a few years ago.
A lot of motion-smoothing TVs are indeed changing framerate constantly, they’re adaptive and it switches based on the content. I suspect this is one reason kids these days don’t get the soap-opera effect reaction to high framerate that old timers who grew up watching 24p movies and 60i TV do. They’re not used to 24p being the gold standard anymore, and they watch 60p all the time, so 60p doesn’t look weird to them like it does to us.
TVs with motion interpolation fix the horizontal pan problem, so they have at least one thing going for them. I’m serious. Sometimes the smoothing messes up the content or motion, it has real and awful downsides. I had to watch Game of Thrones with frame interp, and it troubled me and it ruins some shots, but on the whole it was a net positive because there were so many horiz pans that literally hurt my eyeballs in 24p.
Consumers, by and large, don’t seem to care about brightness, color, or framerate that much, unless it’s really bad. And most content doesn’t (and shouldn’t) depend on brightness, color, or framerate that much. With some real and obvious exceptions, of course. But on the whole I hope that’s also something film school taught you, that you design films to be resilient to changes in presentation. When we used to design content for analog TV, where there was no calibration and devices in the wild were all over the map, you couldn’t count on anything about the presentation. Ever had to deal with safe regions? You lost like 15% of the frame’s area! Colors? Ha! You were lucky if your reds were even close to red.
BTW I hope you take this as lighthearted ribbing plus empathy, and not criticism or argument. I’ve worked in film too (CG film), and I fully understand your feelings. The first CG film I worked on, Shrek, delivered final frames in 8bit compressed JPEG. That would probably horrify a lot of digital filmmakers today, but nobody noticed.
On your presentation point, I think 20 year old me would have generally agreed with you but also argued strongly that people should be educated on the most ideal environment they can muster, and then should muster it! This is obviously silly, but 20 year old me is still in there somewhere. :)
Shrek was really well done, nice work.
60fps will always look like cheap soap opera to me for movies.
24 fps wasn’t chosen because it was optimal or high quality, it was chosen because it’s the cheapest option for film that meets the minimum rate needed to not degrade into a slideshow and also sync with audio.
Here’s an example that uses the 180-shutter and 1/7-frame rules and still demonstrates bad judder. “We have tried the obvious motion blur which should have been able to handle it but even with feature turned on, it still happens. Motion blur applied to other animations, fine… but with horizontal scroll, it doesn’t seem to affect it.” https://creativecow.net/forums/thread/horizontal-panning-ani...
Even with the rules of thumb, “images will not immediately become unwatchable faster than seven seconds, nor will they become fully artifact-free when panning slower than this limit”. https://www.red.com/red-101/camera-panning-speed
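Rough back-of-the-envelope numbers behind that guideline (assuming a 4K-wide frame and reading the rule as "a full-frame pan should take about seven seconds"; the exact interpretation is mine):

    frame_width_px = 3840   # assumed 4K delivery
    pan_duration_s = 7.0    # assumed "slowest safe" full-frame pan

    for fps in (24, 48, 120):
        step = frame_width_px / (pan_duration_s * fps)
        print(f"{fps:3d} fps: image jumps ~{step:.0f} px between frames")
    # 24 fps: ~23 px per frame even at the 'safe' speed; 120 fps: ~5 px.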
The thing I personally started to notice and now can’t get over is that during a horizontal pan, even with a slow speed and the prescribed amount of motion blur, I can’t see any details or track small objects smoothly. In the animation clip attached to that creativecow link, try watching the faces or look at any of the text or small objects in the scene. You can see that they’re there, but you can’t see any detail during the pan. Apologies in advance if I ruin your ability to watch pans in 24fps. I used to be fine with them, but I truly can’t stand them anymore. The pans didn’t change, but I did become more aware and more critical.
> 60fps will always look like cheap soap opera to me for movies
Probably me too, but there seems to be some evidence and hypothesizing that this is a learned effect because we grew up with 24p movies. The kids don’t get the same effect because they didn’t grow up with it, and I’ve heard that it’s also less pronounced for people who grew up watching PAL rather than NTSC. TVs with smoothing on are curing the next generation from being stuck with 24 fps.
For you it's film, but most people have their thing, and you're probably doing the same thing to something else in your household.
In a Greek restaurant I sometimes eat at there's a TV set to some absurdly high color saturation, colors are at 180%. It's been like that for years. Nobody ever even commented on it, even though it is so very very clearly uncomfortably extreme.
That's a weird one to include. It doesn't impact the pizza at all, it still tastes the same. Plus it's common to eat pizza with a fork in Italy.
I was at my parents' house the other day for Christmas and tried to start watching Wong Kar-wai's Blossoms Shanghai, but the TV made everything look so terrible that I couldn't continue with it. I was having a hard time figuring out what was just his style and what was whatever crap the TV was trying to do to it on its own. I'm amazed people don't realize things just look like shit on their TVs now.
There is a lot of panning in the initial scenes of The Hobbit (the opening scene is the fall of Erebor). I watched that movie initially at the higher frame rate, and everything was soooo smooth. Every time I rewatch it, I have to experience the terrible, terrible choppy, hard-to-see-anything lower frame rate instead, and I cry. This is the 21st century, and movies can't even pan across a landscape smoothly?
In that first viewing I saw everything in those caves, it was so easy. Oh how I miss that.
I'm a parent with young kids and I just do not have time to delve into all the settings. I've managed to stop films looking like soap operas but I'm not sure exactly what I've adjusted.
Also, if I'm watching from my pc on the same TV using VLC player is it a mistake just to leave the TV on "game mode" ? This seems to work fine, but I've no idea how that interacts with settings on the TV.
One last rant, it seems to have a setting that uses a light sensor to darken the picture if the room is dark. It seems to be nearly impossible to turn off except through some service menu nonsense I really don't want to touch. The only temp fix is turning the TV on and off again, which I've resorted to when I literally couldn't see what was going on in the film.
I just tried Filmmaker mode on my Samsung S95B and, like you, I find it very dark. Another flaw of this TV is that if the edges of the screen darken (like the borders in widescreen movies), the entire panel goes dark.
I didn't know about the other bug. I wonder if that is actually what I'm seeing rather than the dark room thing I'm talking about.
AI frame-gen, film grain, chromatic aberration, motion blur.
Once the TVs became video cards with filtering and fake refresh rates this was always our fate.
Monitors and default video card drivers have had issues in the past. You'd think TVs would update their filters, given how much spyware data mining they do. But alas, your TV will likely never improve significantly via software. History has proven that.
Disabling all the features down to the hardware spec is best, and never connect a smart TV to the net. Interested in any true exceptions.
Plus ordinary people don't give a sh*t. Most people can't see the difference between HD and 4K (remember that in developed countries, most people are over 40, and 25% overall suffer from myopia). In the 00s, people all had 16:9 TVs and watched 4:3 programs horribly stretched without batting an eye. Most Full HD large screens suffered from horrible decoding stutter late into the 2010s.
With that being said, I’ve definitely seen TVs that just don’t have FILMMAKER MODE or have it, but it doesn’t seem to apply to content from sources like Chromecast. The situation is far from easy to get a handle on.