In the Lion King example you weren't meant to see all of the detail the artists drew. In the army men example the color on the digital version is nothing like the color of the actual toys.
They originally made those movies the way they did on purpose: what they wanted wasn't crystal-clear images with unrealistic colors, they wanted atmosphere and for things to look realistic.
Film grain and dust can be excessive and distracting. It's a good thing when artifacts added by dirt and age get cleaned up for transfers so we can have clear images, but the result of that cleanup should still show what the artists originally intended, and that's where Disney's digital versions really miss the mark.
That's the point in that Lion King frame, though. They drew it planning for it to get washed out by the sunlight effect, and when it's not it absolutely ruins the "vast crowd" effect they were going for because you can clearly see there's no more animals in the background and it's just 20 guys standing there.
I don't believe these were part of the filmmakers' vision at the time; they were unavoidable. Nowadays they are added back to films (and video games) on purpose to create a certain (nostalgic) effect.
Colorizing a black-and-white film, for example, is not ever restoring the original intention or vision, even "subjectively." If the makers of a black-and-white film had been making a color film, they would have made different choices.
This does not mean that you should not colorize black-and-white films, you should do whatever makes you happy. I honestly can't wait until AI is recreating missing scenes or soundtracks from partially lost films, or even "re"creating entire lost films from scripts or contemporary reviews and cast lists, and expanding films to widescreen by inventing contents on the edges. But this will not be restoring a vision, this will be original work.
It's why they all have "motion smoothing" turned on on all their TVs too. Yes, it's animation, but the Blu-rays look "higher resolution", and look "smoother" and less "noisy".
All the artistic benefits you and I see are lost on most watchers.
It is clear that the animators factored in the colour changes from the original media to 35mm, so it seems a disservice to them to re-release their works without honouring how they intended the films to be seen.
https://wackoid.com/game/10-pictures-that-show-why-crt-tvs-a...
This is totally bonkers, because the VHS format is crippled, also color-wise. Many modern transfers are just crap.
An infamous case is the Buffy the Vampire Slayer tv show. The Blu-ray (edit: and streaming copies) went back to the film source, which is good, but… that meant losing the color grading and digital effects, because the final show wasn’t printed to film. Not only did they get lazy recreating the effects, they don’t seem to have done scene-by-scene color grading at all. This radically alters the color-mood of many scenes, but worse, it harms the legibility of the show, because lots of scenes were shot day-for-night and fixed in post, but now those just look like they’re daytime, so it’s often hard to tell when a scene is supposed to be taking place, which matters a lot in any show or film but kinda extra-matters in one with fucking vampires.
The result is that even a recorded-from-broadcast VHS is arguably far superior to the blu ray for its colors, which is an astounding level of failure.
(There are other problems with things like some kind of ill-advised auto-cropping seeming to have been applied and turning some wide shots into close-ups, removing context the viewer is intended to have and making scenes confusing, but the colors alone are such a failure that a poor VHS broadcast recording is still arguably better just on those grounds)
There's a fucking lot of things that are not worth it monetarily, but worth it for the sake of itself. Because it's a nice gesture. Or because it just makes people happy. Not to sound like some hippie idealist, but it's just so frustrating that everything has to be commoditized.
In modern tech circles, the utilitarian mindset is going strong, now that the hacker ethos is dead and it’s all about being corporate friendly and hireable.
It's always easy to complain about others not being generous enough with their time, but we always have an excuse for why we won't do it ourselves.
You can't, at least not if you want an acceptable result.
In photography, if you have a JPEG photo only, you can't do post-facto adjustments of the white balance, for that you need RAW - too much information has been lost during compression.
For movies it's just the same. To achieve something that actually looks good with a LUT (that's the fancy way for re-coloring, aka color grading), you need access to the uncompressed scans, as early in the processing pipeline as you can get (i.e. before any kind of filter is applied).
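To make that concrete, here's a minimal sketch (Python with numpy, purely illustrative, with made-up curve values) of what a LUT actually is: a lookup from input code values to output code values. Real grading LUTs are 3D .cube files applied to high-bit-depth scans; doing the same thing to an 8-bit, already-compressed source quickly falls apart.

    import numpy as np

    def make_curve_lut(points_in, points_out):
        # Build a 256-entry 1D LUT from a few control points (like dragging a curve)
        return np.interp(np.arange(256), points_in, points_out).astype(np.uint8)

    # Hypothetical "cool the midtones" grade: pull red mids, lift blue mids slightly
    lut = np.stack([
        make_curve_lut([0, 128, 255], [0, 118, 255]),   # red
        make_curve_lut([0, 128, 255], [0, 128, 255]),   # green (identity)
        make_curve_lut([0, 128, 255], [0, 134, 255]),   # blue
    ], axis=1)                                          # shape (256, 3)

    def apply_lut(image_u8, lut):
        # image_u8: (H, W, 3) uint8 frame; send each channel through its own curve
        return lut[image_u8, np.arange(3)]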
Honestly, by weakening copyright protections. People who love the works will do the work to protect them when they don't have to fear being sued into bankruptcy for trying to preserve their own culture.
> You think a kid is going to notice two pages? All they do is look at the pictures.
I’m quite sure bean counters look at Disney kids movies the exact same way, despite them being Disney’s bread and butter.
With Star Wars you have a dedicated adult fan base that’ll buy up remasters and reworkings. Aladdin? Not so much. Especially in the streaming era, no one is even buying any individual movie any more.
The Disney of yesterday might have been a bit more Jobs than Gates, compared to the Disney of today.
I agree it was likely Disney being cheap, but there are tons of people who'll buy up Disney movies on physical media in the age of streaming. Not only are there Disney fans who'd rival the obsessiveness of Star Wars fans, but, like Lucas, Disney just can't leave shit alone. They go back and censor stuff all the time and you can't get the uncensored versions on their streaming platform. Aladdin is even an example where they've made changes. It's not even a new thing for Disney. The lyrics to one of the songs in Aladdin were changed long before Disney+ existed.
I think there's a discussion to be had about art, perception and devotion to the "original" or "authentic" version of something that can't be resolved completely but what I don't think is correct is the perception that this was overlooked or a mistake.
> During production, we’re working mostly from computer monitors. We’re rarely seeing the images on film. So, we have five or six extremely high-resolution monitors that have better color and picture quality. We put those in general work areas, so people can go and see how their work looks. Then, when we record, we try to calibrate to the film stock, so the image we have on the monitor looks the same as what we’ll get on film.
But they didn't do a perfect job (the behavior of film is extremely complex), so there's a question: should the digital release reflect their intention, as they were targeting these calibrated monitors, or should it reflect what was actually released? Also, this wouldn't include other artifacts like film grain.
Except, as they say, the high grade monitors were calibrated to emulate the characteristics of film.
If we can show that D+ doesn't look like the film, then we can point out that it probably doesn't look like the calibrated monitors either. Those army men are not that shade of slime green in real life, and you'll have a hard time convincing me that, after all the thought and effort that went into the animation, they allowed that putrid pea shade to go through.
Another movie with the same or a similar problem is the Lord of the Rings Extended Editions: the DVD, Blu-ray, and 4K releases. As far as I remember, they fixed it for the theatrical version in 4K but not the extended.
Another one that's been hard to find is the 4K Matrix release with the original color grading. Ping me if you have it! (Not the 1080p release)
These can be especially hard to find as the files are typically enormous, with low compression to keep things like grain. I see them mostly traded on short-lived gdrives and Telegram.
Someone tell this community to share over BT. Aint nobody got time to keep up with which platform/server everyone is on and which links are expired and yuck.
But you have algorithmic grain in modern codecs, so no need to waste so much space for noise?
One's real noise; the other's fake noise. One's a real photo from 1890; the other's an old-timey Instagram filter.
It makes sense that some folks might care about the difference. Like, I love my old family Polaroids. I would not want a scanned version of those to have the “noise” removed for compression’s sake. If that had been done, I’d have limited interest in adding fake noise back to them. By far my favorite version to have would be the originals, without the “noise” smoothed out at all.
Lots of folks have similar feelings about film. Faked grain isn’t what they’re after, at all. It’s practically unrelated to what they’re looking for.
> The other’s fake noise
But since there is no such thing as the real thing, it could just as well match one of the many real noise patterns in one of the many real things floating around, or a real thing at a different point in time with more/less degradation. And you wouldn't even know the difference, thus...
> It makes sense that some folks might care about the difference
Not really, it doesn't make sense to care about identical noise you can't tell apart. Of course, plenty of people care about all kinds of nonsense, so that won't stop those folks, but let's not pretend there is some 'real physics' involved
But a compressed copy of a real thing is also just a simulation of that real thing, so that purity test had already been failed
[EDIT] My point is "film grain's not more-real than algo noise" is simply not true, at all. An attempt to represent something with fidelity is not the same thing as giving up and faking it entirely based on a guess with zero connection to the real thing—its being a representation and not the actual-real-thing doesn't render it equally as "impure" as a noise-adding filter.
It may as well be simulated, because you won't see the difference! So now you've imagined some purity test which was never true, so you have nothing and start hallucinating some hyperbolic AI thing
Quoted: the introduction of “purity test” to the conversation, from not one of my posts.
One is the real deal and another one is a simulation. End of story.
Sometimes people create things that surpass them, and I think it is totally fair for them to belong to humanity after the people that created them generated enough money for their efforts.
You can, actually: the 2006 Limited Edition DVD is a double-disc version, with one disc being the original version.
However, they are not DVD quality, because they were transferred from LaserDisc and not from the original film stock.
To pick an arguably-minor but very easy to see point: the title’s different.
I can’t find out if they fixed the 3% speed-up from the LaserDisc. The audio mix, at any rate, will be a combination of the three (stereo, mono, 70mm) original mixes, like on the LaserDisc, so identical to none of them. The source should predate the replacement of Latin script with made-up letters (not conceived until ROTJ, then retrofitted on some releases of SW and Empire), so that’ll be intact unless they “fixed” it.
Still stuck with sub-ordinary-dvd-quality picture, as far as official releases go, so that’s too bad. Oh well, fan 35mm scan projects solved that problem.
But no, of course it looks between slightly and way better in every case. Goddamnit. Pour one out for my overworked disk array.
And here I was thinking it was just my imagination that several of these look kinda shitty on Blu-ray and stream rips. Nope, they really are worse.
Piracy: saving our childhoods one frame at a time.
I can't figure out how to determine if that's intentional.
The careful eye may also notice they almost never strike the sabers against one another in that scene... because it'd break the spinning sticks. Apparent contact is usually gently done, or a trick of perspective.
A true fan who wants to preserve the film and be faithful to it in their scan is going to dedicate their life to getting it just right, while a mega corp will just open the original, click "Export as...", and call it a day.
https://www.reddit.com/r/toystory/comments/1hhfuiq/does_anyo...
As the Aladdin still shows with its wildly altered colors, clearly other aspects matter / are at play. But the analog/digital discussions always seem, at least to me, to hinge heavily on DR (dynamic range). It’s just so interesting to me.
Many of us remember the leap from SD->HD. Many of us also can point out how 4K is nice and even noticeably better than FHD, but man…getting a 4K OLED TV with (and this is the important part) nice DR was borderline another SD->HD jump to me. Especially with video games and older films shot and displayed on film stock from start to finish. The difference is incredibly striking.
The "best" right now, in my opinion, is AgX, which at this point has various "flavours" that operate slightly differently. You can find a nice comparison of OCIO configs here: https://liamcollod.xyz/picture-lab-lxm/CAlc-D8T-dragon
I went down the tonemapping rabbit hole for a hobby game engine project a while ago and was surprised at how complex the state-of-the-art is.
If you're interested in making digital footage look exactly like film in every possible way, I'll shill our product Filmbox: https://videovillage.com/filmbox/
And here I was thinking of re-watching some old Disney/Pixar movies soon :(
The 4K77 etc. fan scans of the original Star Wars trilogy, which aimed to get as close as possible to what one would have seen in a theater the year of release, used multiple prints to fill in e.g. bad frames, used references like (I think) magazine prints of stills and well-preserved fragments or individual frames to fix the (always faded, sometimes badly) color grading and contrast and such, and had to extensively hand-correct things like scratches, with some reels or parts of reels requiring a lot more of that kind of work than others. Even Jedi required a lot of that sort of work, and those reels would have been only something like 30-35 years old when they started working on them.
https://davidsimon.com/the-wire-hd-with-videos/
It seems like the video examples are unfortunately now unavailable, but the discussion is still interesting and it's neat to see the creative trade-offs and constraints in the process. I think those nuances help evoke generosity in how one approaches re-releases or other versions or cuts of a piece of media.
https://x.com/TristanACooper/status/1194298167824650240
Open both images and compare. The visual joke is completely ruined with the cropping.
This is not true at all. Being compatible with outdated, film based projectors was much more important for being able to show it in as many theaters as possible. If they wanted to do a digital screening it would have been technologically possible.
Digital cinema went with Motion JPEG2000 with high quality settings, which leads to very large files, but also much better fidelity than likely with a contemporary video codec.
I agree with that. The article's quote from Pixar's "Making The Cut at Pixar" book was that the technology wasn't there (computer chips fast enough, storage media large enough, compression sophisticated enough) and I--along with the comment I replied to--disagree with that conclusion.
We had an incredible amount of fancy toys with no expense spared, including those SGI Onyx Infinite Reality boxes with the specialist video break-out boards that did digital video or analogue with genlock. Disks were 2 GB SCSI and you needed a stack of them in RAID formations to play video. This wasn't even HD, it was 720 x 576 interlaced PAL.
We also had to work within a larger post production process, which was aggressively analogue at the time with engineers and others allergic to digital. This meant tapes.
Note that a lot of this was bad for tape machines. These cost £40k upwards, and advancing the tape by one frame to record it, then back again to reposition the tape for the next frame, for hours on end, was a sure way to wreck a tape machine, so we just hired them.
Regarding 35mm film, I also babysat the telecine machines where the film bounces up and down on the sprockets, so the picture is never entirely stable. These practical realities of film just had to be worked with.
The other fun aspect was moving the product around. This meant hopping on a train, plane or bicycle to get tapes to where they needed to be. There was none of this uploading malarkey although you could book satellite time and beam your video across continents that way, which happened.
Elsewhere in broadcasting, there was some progress with glorified digital video recorders. These were used in the gallery and contained the programming that was coming up soon. These things had quite a lot of compression and their own babysitting demands. Windows NT was typically part of the problem.
It was an extremely exciting time to be working in tech but we were a long way off being able to stream anything like cinema resolution at the time, even with the most expensive tech of the era.
Pixar and a few other studios had money and bodies to throw at problems, however, there were definitely constraints at the time. The technical constraints are easy to understand but the cultural constraints, such as engineers allergic to anything digital, are hard to imagine today.
Yeah, but we were still using MPEG-2 back then, weren't we?
They would have looked like utter garbage. Bitrates would have had to be so high that I'm not sure we would have actually had enough storage. I guess we could have shipped TWO hard drives.
It was possible, but much too expensive to get it into wide release that way.
Don't believe me? Download the comparison pictures in the article to your device and play with filters and settings. You can get almost anything you want and the same was true at every step in the render pipeline to your TV.
PS: and don't get me started on how my 60-year-old eyes see color now compared to what they perceived when I saw this in the theater
I do find that often enough commercial releases like Aladdin or other movies like Terminator 2 are done lazily and have completely different colors than what was historically shown. I think part of this is the fact that studios don't necessarily recognise the importance of that legacy and don't want to spend money on it.
Are there like multiple digital releases, one with better colour than the other?
There was similar outrage (if that's the right word) about a Matrix remaster that either added or removed a green color filter, and there's several other examples where they did a Thing with colour grading / filtering in a remaster.
https://www.youtube.com/watch?v=1mhZ-13HqLQ
There's a 35mm scan floating around from a faded copy with really weird colors sometimes
https://www.youtube.com/watch?v=Ow1KDYc9XsE
And there's an Open Matte version, which I don't know the origin of.
https://www.youtube.com/watch?v=Z2eCmhBgsyI
For me, it's the Open Matte that I consider the ultimate best version.
> I have an updated, I found out that T2 4K is an HDR movie that needs to be played with MadVR and enable HDR on the TV itself, now the colors are correct and I took a new screenshot: https://i.imgur.com/KTOn3Bw.jpg
> However when the TV is in HDR mode the 4K looks 100% correct, but when seeing the screenshot with HDR off then the screenshot looks still a bit wrong, here is a screenshot with correct colors: https://i.imgur.com/KTOn3Bw.jpg
CD itself can reproduce the same dynamic range and more, but that doesn't sell extra copies.
The same applies to people buying modern vinyl records believing them to be more authentic than a CD or (god forbid) online streaming.
Everything comes from a digital master, and arguably the vinyl copy adds artefacts and colour to the sound that is not part of the original recording. Additionally, the vinyl is not catching more overtones because it's analogue, there is no true analogue path in modern music any more.
Allegedly, for a lot of music that is old enough, the best version to get (if you have the kind of hi-fi system that can make use of it) is an early 80s CD release, because it sits in a sweet spot: it predates the loudness war, and producers were actually using the dynamic range of the CD.
Once better monitors became more commonplace, mastering became dynamic again.
This is most clear with Metallica's Death Magnetic, which is a brickwalled monstrosity on the 2008 release but was fixed on the 2015 release[0]. And you can see this all over, where albums from the 90s had a 2000s "10-year anniversary" remaster that is heavily compressed, but then a 2010s or 2020s remaster that is dynamic again.
[0] Interestingly enough between those dates, fans extracted the non-brickwalled Guitar Hero tracks and mastered them as well as they could. Fun times :).
It was sort of a happy coincidence that vinyl's limitations forced more dynamic (but less bass-y) masters. Although if your artist didn't do vinyl releases -which really was a dying medium until hipsters brought it back in the 2010s- you were hosed.
Interesting, I did not know this! I'm not doubting you, but I'm a little confused and curious about how the physics of that works out. Wouldn't being brickwalled mean the volume stays pretty constant, meaning there's less work for the needle? Or is there some kind of limit to how many overlapping waveforms a needle can pick up at once?
"Dynamic range compression" is a bit of a misleading term because it sounds like you're taking an audio signal and and squeezing it.
What you're really doing is two things: reducing (compressing) the difference between the quiet (valleys) and loudest (peaks) parts, and then pushing the volume of the peaks up to or past 0dB. Technically, that second step isn't dynamic range compression, but in practice it is / was always done. The reason they do this is because for human ears, louder sounds better. However, you lose dynamism. Imagine if you watched a movie, and a whisper during a military night raid would sound as loud as the shouty conversation they had in the planning room.
Past 0dB, a signal will 'clip'[0], which means the loudest parts of the signal cannot be expressed properly and will be cut off, leading to signal loss. Basically, 0dB is the loudest you can get.
These days, in practice, music tracks get mastered so that the average value is -14dB because streaming sites will 'normalize' tracks so that the average dB is -14dB. Here[1] you can see why that makes brickwalling bad. If your track goes full tilt and has almost no valleys, the average dB per second is rather high, so your entire track gets squeezed to average out to -14dB. But if you have lots of valleys, you can have more peaks and the average will still be -14dB!
RE: vinyl? Well, too much and / or too intense motion in the groove (the groove is effectively a physical waveform) makes the needle slightly skip out of the groove. "Too much" happens with brickwalling, "too intense" happens with very deep bass. Try to imagine the upcoming links I'm referring to as a physical groove a needle has to track, instead of a digital waveform.
Here[2] is one Death Magnetic track waveform of the brickwalled original vs. fixed remastered release. It's not too bad. But then there is this[3] insanity.
[0] https://www.youtube.com/watch?v=SXptusF7Puo / https://www.youtube.com/watch?v=g7AbmhOsrPs
[1] https://cdn.shopify.com/s/files/1/0970/0050/files/46eacedf-c...
[2] https://happyhipster.wordpress.com/wp-content/uploads/2023/0...
[3] https://happyhipster.wordpress.com/wp-content/uploads/2023/0...
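To make the -14 LUFS point above concrete, here's a rough sketch of what streaming-style loudness normalization does, using the pyloudnorm library (BS.1770 metering); the file names are placeholders.

    import soundfile as sf
    import pyloudnorm as pyln

    data, rate = sf.read("track.wav")            # float samples, mono or stereo

    meter = pyln.Meter(rate)                     # ITU-R BS.1770 loudness meter
    loudness = meter.integrated_loudness(data)   # e.g. around -7 LUFS for a brickwalled master

    # Scale the whole track so its integrated loudness lands at -14 LUFS.
    # A brickwalled track just gets turned down: all that limiting bought nothing.
    normalized = pyln.normalize.loudness(data, loudness, -14.0)
    sf.write("track_normalized.wav", normalized, rate)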
I've watched Aladdin more than any other movie as a child, and the Blu-ray screenshot is much more familiar to me than the 35mm scan. Aladdin always had the Velvia look.
> Early home releases were based on those 35 mm versions.
Here's the 35mm scan the author presents: https://www.youtube.com/watch?v=AuhNnovKXLA
Here's the VHS: https://www.youtube.com/watch?v=dpJB7YJEjD8
So, yes the VHS is expected to have more magenta.
Anecdotally, I remember watching Aladdin at the movie theatre when it came out and later on TV multiple times and the VHS you saw doesn't correspond to my memories at all.
I can't challenge the vividness of your memory. That's all in our heads. I remember it one way, and you remember it another.
The author is not wrong that oversaturation is a source-transfer phenomenon (which will always be different unless special care is taken to compare with the source material).
On most TVs that magenta wouldn't have shown as much as the youtube video shows because TVs tended to have weaker magentas. Of course, it's not like TVs were that uniformly calibrated back then and there were variations between TVs. So depending on the TV you had, it might have ended up having too much magenta but that would have usually been with more expensive and more accurate TVs.
TLDR: Transfers are hard, any link in the chain can be not properly calibrated, historically some people in charge of transferring from one source to another compensated for perceived weak links in the chain.
Regarding my memory, it becomes shakier the more I think about it. I do remember the purples but me having watched the cartoon could have affected that.
If you plug a Nintendo system's RCA cables into a modern TV, it will look like garbage. Emulated games on LCDs look pixelated.
Those games were designed for a CRT's pixel grid. They don't look right on LCDs, and the upscalers in home theater equipment don't respect that. There are hardware upscalers and software shaders that are specifically designed to replicate a CRT's quirks, to let you better approximate how those games were designed to be played.
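For the curious, here's a deliberately crude toy of the idea in Python (Pillow + numpy, placeholder file names); real CRT shaders model phosphor masks, bloom, gamma, and composite artifacts, while this only fakes scanlines on a nearest-neighbor upscale.

    import numpy as np
    from PIL import Image

    frame = np.asarray(Image.open("screenshot.png").convert("RGB")).astype(np.float32)

    scale = 3
    big = frame.repeat(scale, axis=0).repeat(scale, axis=1)   # nearest-neighbor upscale

    big[::scale] *= 0.55   # darken every third row to suggest scanline gaps

    Image.fromarray(np.clip(big, 0, 255).astype(np.uint8)).save("crt_ish.png")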
Related - someone recently built a CRT dock for his Switch, so he could play Nintendo Switch Online's emulated games as originally intended:
I don't buy that it's a real degradation due to different presentation methods. I'm sorry, but no matter what film stock you lovingly transfer Toy Story to, it's never going to look like it does in your memory. Same with CRTs. Sure, it's a different look, but my memory still looks better.
It's like our memories get automatically upgraded when we see newer stuff. It's jarring to go back and realise it didn't actually look like that in the 90s. I think this is just the unfortunate truth of CGI. So far it hasn't reached the point of producing something timeless. I can watch a real film from the 80s and it will look just as "good" as one from today. Of course the colours will be different depending on the transfer, but what are we hoping for? To get the exact colours the director saw in his mind's eye? That kind of thing has never really interested me.
I don’t have this issue and never have. For whatever reason I’ve never “upgraded” them in my mind, and they look today exactly as I remember them when played on period hardware.
> Their system was fairly straightforward. Every frame of Toy Story’s negative was exposed, three times, in front of a CRT screen that displayed the movie.
While I have no doubt that this hadn't been done at the scale and resolution, it struck me that I'd heard about this concept in a podcast episode [1] in which very early (1964) computer animation was discussed alongside the SC4020 microfilm printer that used a Charactron CRT which could display text for exposure to film or plot lines.
[1] https://adventofcomputing.libsyn.com/episode-88-beflix-early...
Movies projected on film look different not only because of the color and texture, but also a constant spatial jitter over time. When the film moves through the projector, each frame locks into a slightly different position vertically. That creates a wobble that's called "film weave."
(If you want to create truly authentic-looking titles for a 1980s B-grade sci-fi movie, don't forget to add that vertical wobble to your Eurostile Extended Bold layout that reads: "THE YEAR IS 2025...")
I wonder if artificial grain would actually make it look better.
Like when the game Splinter Cell was released, there were two additional ‘views’ simulating infrared and thermal cameras. Those had heavy noise added to them and felt so real compared to the main view.
Some films didn't age well though.
And finally, accept it and move on, ultimately it's their loss.
Yes not only have the best movies aged well, but the old movies that you've never heard of (not the bad ones, there are plenty of those, but the good ones, which there are still plenty of) are BETTER than the new movies of today. It's unbelievable and it's creatively inspiring to me personally, even if it requires "time travel."
It's like visiting a museum, or a grand old architectural building. If you abstract even a tiny bit from the viewer immersion (which is great), you can't help but think "What craftsmanship! We don't get stuff like this anymore" And of course you also start to pick up on where everything came from, such that even the new stuff you love looks like a slightly sleeker yet baser, cheaper copy of the old.
LOL, what? Anyone with a Blu-Ray rip file and FFmpeg can decide how it looks to them.
https://www.vulture.com/2019/07/motion-smoothing-is-ruining-... https://www.filmindependent.org/blog/hacking-film-24-frames-...
I think it was The Hobbit that had a 60 fps version, and people just... weren't having it. It's technologically superior I'm sure (as would higher frame rates be), but it just becomes too "real" then. IIRC they also had to really update their make-up game because on higher frame rates and / or resolutions people can see everything.
Mind you, watching older TV shows nowadays is interesting; I think they were able to scan the original film for e.g. the X Files and make a HD or 4K version of it, and unlike back in the day, nowadays you can make out all the fine details of the actor's skin and the like. Part high definition, part watching it on a 4K screen instead of a CRT TV.
In a FPS, trying to track movement at only 24 fps is pretty much impossible unless your target's movement is entirely predictable.
In a flight simulator, trying to land a plane in gusty weather conditions is a lot harder with only 24 fps.
Lower framerates don't just make motion choppy, they also increase latency. At 24 fps, any change in movement could be up to 42 ms behind. At 120 fps, that's down to 8.3 ms. And those numbers assume that you can notice the difference in only a single frame.
I'm convinced that people claiming 24 fps is fine for games just because it's fine for film don't actually play games. At least, nothing that requires quick reaction times.
I’m sure many young people feel the exact opposite.
Load it up in DaVinci Resolve, knock the saturation and green curve down a bit, and boom, it looks like the film print.
Or you could slap a film-look LUT on, but you don't need to go that far.
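For anyone without Resolve, a rough equivalent of that adjustment (not the poster's exact recipe) can be sketched in Python with Pillow and numpy; the file names and amounts are made up.

    import numpy as np
    from PIL import Image

    rgb = np.asarray(Image.open("bluray_frame.png").convert("RGB")).astype(np.float32) / 255.0

    # Knock saturation down ~15% by blending each pixel toward its own luma
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
    rgb = 0.85 * rgb + 0.15 * luma[..., None]

    # Gentle downward curve on the green channel (midtones pulled down ~10%)
    rgb[..., 1] = np.interp(rgb[..., 1], [0.0, 0.5, 1.0], [0.0, 0.45, 1.0])

    Image.fromarray((np.clip(rgb, 0, 1) * 255).astype(np.uint8)).save("graded_frame.png")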
Side note - I wonder if it's a millennial thing that our memories are worse due to modern technology, or perhaps we are more aware of false memories due to the sheer availability of information like this blog post.
Then a few years ago I was throwing out my parents' old CRT and decided to plug in the N64 one last time. Holy crap, was it like night and day. It looked exactly as I remembered it, so much more mysterious and properly blended than it does on an LCD screen.
I don't see why the same wouldn't apply to films, sometimes our memories aren't false.
Although some people do in fact remember the differences, I'd guess a lot of those incidents are caused by people experiencing them in fairly quick succession. It's one thing to remember the difference between a DVD 20 years ago and a Blu-ray you only watched today, and another to have watched a DVD 15 years ago and a Blu-ray 14 years ago.
But I was never very good, and it has been decades, so I don't know how much of this is just poor memory - I actually don't think I'm good enough/play enough that the latency of modern input/displays makes a difference at my level.
I would love to try both side-by-side to see if I could pick out the difference in latency/responsiveness.
I find a good test is Punch Out!!! If it's much trouble at all for me to reach at least Great Tiger, the latency is really bad (even if I couldn't tell you just by looking). If I can get to Great Tiger without much trouble but struggle to do much damage to him before getting taken out, the latency's still unacceptably high for some games, but not totally awful.
Another good one's High Speed. If I can't land the final multi ball shot at least a decent percentage of the time (the game pauses the ball a couple times while police chatter plays, when you're set up for a multi ball, and after the last pause you can land the shot to initiate multi ball immediately and skip all the flashing-lights-and-sirens crap if you're very precise with your timing, it's like very-small number of milliseconds after the ball resumes its motion) then the latency is high enough to affect gameplay.
If I can land that shot at least 60-70% of the time, and if I can reach Bald Bull in Punch Out!!!, then probably any trouble I have in other games is my own damn fault :-)
I suppose as I age further these tests will become kinda useless for me, because my reflexes will be too shot for me to ever do well at these no matter how many hours of practice I've had over the decades :-(
Anyway, even in the best case you're always going to have worse display and input latency on a digital screen with a digital video pipeline and USB controllers than an early console hooked up over composite or component to a CRT. I've found it's low enough (even on mediocre TVs, provided they have a "game mode", and those are a ton worse than most PC monitors) for me not to mind much if the emulation itself is extremely snappy and is adding practically no extra latency, and there are no severe problems with the software side of the video & input pipelines, but otherwise... it can make some already-tough games unplayably hard in a hurry.
I do wonder about the experience of people who try these games for the first time in an emulator. They'll come to the game with no built-in way to tell if they keep slipping off ledges because the latency's like six frames instead of the ~ one it was originally, or because they just suck at it.
For an example people here might be more familiar with, it's like how you can't even see bad kerning until you learn what it is, then start to notice it a lot more.
When you scan in a film you need to dust-bust it, and generally clean it up (because there are physical scars on the film from going through the projector. There's also a shit ton of dust that needs to be physically or digitally removed, i.e. "busted")
Ideally you'd use a non-real time scanner like this: https://www.filmlight.ltd.uk/products/northlight/overview_nl... which will collect both colour and infrared. This can help automate dust and scratch removal.
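Roughly how the infrared channel helps, as a sketch (OpenCV, placeholder file names and threshold, not the Northlight's actual algorithm): the dye image is largely transparent to IR, but dust and scratches block it, so the IR pass gives you a defect mask you can inpaint.

    import cv2
    import numpy as np

    rgb = cv2.imread("frame_rgb.png")                       # colour scan of the frame
    ir = cv2.imread("frame_ir.png", cv2.IMREAD_GRAYSCALE)   # matching infrared pass

    # Dark spots in the IR channel = something physically sitting on/in the film
    _, mask = cv2.threshold(ir, 100, 255, cv2.THRESH_BINARY_INV)
    mask = cv2.dilate(mask, np.ones((3, 3), np.uint8))      # grow the mask slightly

    cleaned = cv2.inpaint(rgb, mask, 3, cv2.INPAINT_TELEA)  # fill defects from neighbours
    cv2.imwrite("frame_dustbusted.png", cleaned)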
If you're unlucky you'll use a telecine machine, https://www.ebay.co.uk/itm/283479247780 which runs much faster, but has less time to dust-bust and properly register the film (so it'll warp more)
However! That doesn't affect the colour. Those colour changes are deliberate and are a result of grading. I.e., a colourist has gone through and made changes to make each scene feel more effective. Ideally they'd alter the colour for emotion, but that depends on who's making the decision.
the mechanics are written out here: https://www.secretbatcave.co.uk/film/digital-intermediary/
There are handheld tools (google hand blower bulb), but I would imagine film scanning uses something less manual
It just seems like there’s a lot of variability in each step, to end up with an unintended colour that will be taken as the artist’s intent.
All steps before try to not affect the color and keep as much dynamic range as possible to give as much leeway as possible for the colorist.
Realistically, for Pixar and Disney (not people with limited funds, say), the color grade is much, much more relevant to the final color than the specifics of digitizing.
The printers deffo make a difference to colour, but I came from the VFX world, where we put a Macbeth chart in for each shot so we could adjust the colour afterwards (https://en.wikipedia.org/wiki/ColorChecker). We'd have a whole team working on making sure that colour was accurate.
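The rough idea behind chart-based correction, as a sketch (numpy least squares; the patch files are placeholders, and real pipelines work in linear light with more sophisticated fits): solve for a matrix that maps the patches as filmed/scanned onto the chart's published reference values, then apply it to the whole frame.

    import numpy as np

    measured = np.loadtxt("patches_measured.csv", delimiter=",")    # (24, 3) RGB as scanned
    reference = np.loadtxt("patches_reference.csv", delimiter=",")  # (24, 3) published values

    # Solve measured @ M ~= reference for a 3x3 correction matrix M
    M, *_ = np.linalg.lstsq(measured, reference, rcond=None)

    def correct(frame):
        # frame: (H, W, 3) float array in the same space as `measured`
        return np.clip(frame @ M, 0.0, 1.0)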
The scanners we used (northlight) were calibrated a lot so my understanding is that if you scanned the same film twice it was meant to be pixel perfect (We did rescans for various reasons and it supposedly matched up enough to do effects work. but that might have been proxies, ie low resolution scans that were done for speed)
Also the printers should, if they are good, match it properly; that's what you're paying them for. I know that we did have a person who calibrated film projectors for colour, but I never asked them _how_ they did it.
For Toy Story it's a bit harder because you are digitising a whole finished movie; you don't have the colour chart in every shot to keep the colour consistent. I know for adverts the telecine people did loads of fiddling to make the colour consistent, but I assumed that was because the Spirit 4K was a bit shit.
I never dealt with actual finished prints, because the colourist/DI people sent the finished grade off to someone like Deluxe to print out.
That has been something I've wondered about since seeing frame comparisons of (probably) telecine'ed prints of The Matrix vs. the myriad home video releases.
I mean they should be calibrated, so they have a different feel, but they shouldn't be wildly different like the screen shots.
I know the Spirit operators did magic, but they were in the advertising team, and I was in film, so I was never allowed to visit the sexy telecine room.
https://www.youtube.com/watch?v=lPU-kXEhSgk
TL;DW, different physical rolls of film sent to different movie theaters can have slightly different coloring, if they were done by different people or different companies or even if someone just did their job differently that day. Film color was not an exact science and not always perfectly repeatable, and depended on chemistry.
Is that because you're just leaving the film out in a big pile, or because it decays rapidly?
I would have expected film to be stored in containers.
The room that the scanners used to be in was temperature and dust controlled, and everyone was supposed to wear dust jackets when they entered.
---
> see the 35 mm trailer for reference
The article makes heavy use of referring to scans of trailers to show what colours, grain, sharpness, etc. looked like. This is quite problematic, because you are relying on a scan done by someone on the Internet to accurately depict what something looked like in a commercial cinema. Now, I am not a colour scientist (far from it!), but I am a motion picture film hobbyist and so can speak a bit about some of the potential issues.
When projected in a movie theatre, light is generated by a short-arc xenon lamp. This has a very particular output light spectrum, and the entire movie process is calibrated and designed to work with this. The reflectors (mirrors) in the lamphouse are tuned to it, the films are colour graded for it, and then the film recorders (cameras) are calibrated knowing that this will be how it is shown.
When a film is scanned, it is not lit by a xenon short-arc lamp, instead various other illumination methods are used depending on the scanner. CRTs and LEDs are common. Commercial scanners are, on the whole, designed to scan negative film. It's where the money is - and so they are setup to work with that, which is very different to positive movie release film stock. Scanners therefore have different profiles to try and capture the different film stocks, but in general, today's workflow involves scanning something in, and then colour correcting post-scan, to meet an artist's expectations/desires.
Scanning and accurately capturing what is on a piece of film is something that is really quite challenging, and not something that any commercial scanner today does, or claims to do.
The YouTube channels referenced are FT Depot, and 35mm Movie Trailers Scans. FT Depot uses a Lasergraphics 6.5K HDR scanner, which is a quite high end one today. It does have profiles for individual film stocks, so you can set that and then get a good scan, but even the sales brochure of it says:
> Many common negative film types are carefully characterized at Lasergraphics to allow our scanning software to compensate for variation. The result is more accurate color reproduction and less time spent color grading.
Note that it says that less time is spent colour grading - it is still not expected that it will accurately capture exactly what was on the film. It also specifies negative, I don't know whether it has positive stock profiles as I am not lucky enough to have worked with one - for this, I will assume it does.
The "scanner" used by 35mm Movie Trailers Scans is a DIY, homemade film scanner that (I think, at least the last time I spoke to them) uses an IMX-183 sensor. They have both a colour sensor and a monochrome sensor, I am not sure what was used to capture the scans linked in the video. Regardless of what was used, in such a scanner that doesn't have the benefit of film stock profiles, etc. there is no way to create a scan that accurately captures what was on the film, without some serious calibration and processing which isn't being done here. At best, you can make a scan, and then manually adjust it by eye afterwards to what you think looks good, or what you think the film looks like, but without doing this on a colour calibrated display with the original projected side-by-side for reference, this is not going to be that close to what it actually looked like.
Now, I don't want to come off as bashing a DIY scanner - I have made one too, and they are great! I love seeing the scans from them, especially old adverts, logos, snipes, etc. that aren't available anywhere else. But, it is not controversial at all to say that this is not colour calibrated in any way, and in no way reflects what one actually saw in a cinema when that trailer was projected.
All this is to say that statements like the following in the article are pretty misleading - as the differences may not be attributable to the direct-digital-release process at all, and could just be that a camera white balance was set wrong, or some post processing to what "looked good" came out different to the original:
> At times, especially in the colors, they’re almost unrecognizable
> Compared to the theatrical release, the look had changed. It was sharp and grainless, and the colors were kind of different
I don't disagree with the premise of the article - recording an image to film, and then scanning it in for a release _will_ result in a different look to doing a direct-digital workflow. That's why major Hollywood films spend money recording and scanning film to get the "film look" (although that's another can of worms!). It's just not an accurate comparison to put two images side by side, when one is of a trailer scan of unknown accuracy.
Some examples:
https://www.reddit.com/r/Gameboy/comments/bvqaec/why_and_how...
https://www.nesdev.org/wiki/PPU_palettes#2C02
In addition to CRTs having variable properties, it turns out a lot of consoles (understandably!) cheat a little bit when generating a composite signal. The PPU's voltages are slightly out of spec, its timing is weird to work around a color artifact issue, and it generates a square wave for the chroma carrier rather than an ideal sine wave, which produces even more fun problems near the edges. So we've got all of that going on, and then the varying properties of how each TV chooses to interpret the signal. Then we throw electrons at phosphors and the pesky real world and human perception gets involved... it's a real mess!
This video is related to that issue
And with the second version of the GBA SP and the GB Micro, colors were very saturated. Particularly on the SP. If anything, cranking up the saturation on an emulator would get you closer to how things looked on those models, while heavily desaturating would get you closer to the look on earlier models.
That's certainly the case. The super low screen brightness of the first GBA was a major problem, because you often literally couldn't see things properly under less than perfect ambient light. So compensating for low brightness was more important than compensating for low color saturation, which is merely an aesthetic issue.
https://user-images.githubusercontent.com/7229541/215890834-...
It blew my mind when I finally learnt this, as I spent years of my childhood playing games that looked like the examples on the left, not realising the colours were due to the RGB monitor I had.
Also, are you able to tell me the name of the game in the second row in that screenshot?
I run into this same failure mode often. We introduce purposeful scaffolding in the workflow that isn’t meant to stand alone, but exists solely to ensure the final output behaves as intended. Months later, someone is pitching how we should “lean into the bold saturated greens,” not realising the topic only exists because we specifically wanted neutral greens in the final output. The scaffold becomes the building.
In our work this kind of nuance isn’t optional, it is the project. If we lose track of which decisions are compensations and which are targets, outcomes drift badly and quietly, and everything built after is optimised for the wrong goal.
I’d genuinely value advice on preventing this. Is there a good name or framework for this pattern? Something concise that distinguishes a process artefact from product intent, and helps teams course-correct early without sounding like a semantics debate?
So I guess try separating your compensations from the original work and create a workflow that automatically applies them
My solution is decision documents. I write down the business problem, background on how we got here, my recommended solution, alternative solutions with discussion about their relative strengths and weaknesses, and finally an executive summary that states the whole affirmative recommendation in half a page.
Then I send that doc to the business owners to review and critique. I meet with them and chase down ground truth. Yes it works like this NOW but what SHOULD it be?
We iterate until everyone is excited about the revision, then we implement.
The second is that excitement typically falls with each iteration, even while everyone agrees that each is better than the previous. Excitement follows more strongly from newness than rightness.
That is the nature of evolutionary processes and it's the reason people (and animals; you can find plenty of work on e.g. "superstition in chickens") are reluctant to change working systems.
- the fashion for unpainted marble statues and architecture
- the aesthetic of running film slightly too fast in the projector (or slightly too slow in the camera) for an old-timey effect
- the pops and hiss of analog vinyl records, deliberately added by digital hip-hop artists
- electric guitar distortion pedals designed to mimic the sound of overheated tube amps or speaker cones torn from being blown out
Most people would barely notice it as it's waaaay more subtle than your distorted guitar example. But it's there.
Part of the likeable sound of albums made on tape is the particular combination of old-time compressors used to make sure enough level gets to the tape, plus the way tape compresses the signal again on recording by its nature.
Motion blur happens with real vision, so anything without blur would look odd. There's cinematic exaggeration, of course.
24 FPS is indeed entirely artificial, but I wouldn't call it a fetish: if you've grown with 24 FPS movies, a higher frame rate will paradoxically look artificial! It's not a snobby thing, maybe it's an "uncanny valley" thing? To me higher frame rates (as in how The Hobbit was released) make the actors look fake, almost like automatons or puppets. I know it makes no objective sense, but at the same time it's not a fetishization. I also cannot get used to it, it doesn't go away as I get immersed in the movie (it doesn't help that The Hobbit is trash, of course, but that's a tangent).
Grain, I'd argue, is the true fetish. There's no grain in real life (unless you have a visual impairment). You forget fast about the lack of grain if you're immersed in the movie. I like grain, but it's 100% an esthetic preference, i.e. a fetish.
The solution is 60fps at 1/60s. Panning looks pretty natural again, as does most other motion, and you get clarity for fast-moving objects. You can play around with different framerates, but imo anything more than 1/120s (180 degree shutter in film speak) will start severely degrading the watch experience.
I've been doing a good bit of filming of cars at autocross and road course circuits the past two years, and I've received a number of compliments on the smoothness and clarity of the footage - "how does that video out of your dslr [note: it's a Lumix G9 mirrorless] look so good" is a common one. The answer is 60fps, 1/60s shutter, and lots of in-body and in-lens stabilization so my by-hand tracking shots aren't wildly swinging around. At 24/25/30fps everything either degrades into a blurry mess, or is too choppy to be enjoyable, but at 60fps and 1/500s or 1/1000s, it looks like a (crappy) video game.
[EDIT] I mean, IIRC that was 48fps, not 60, so you'd think they'd get the shutter timing right, but man, something was wrong with it.
Some of your fancier, brighter (because you lose some apparent brightness by cutting the light for fractions of a second) home digital projectors can convincingly mimic the effect, but otherwise, you'll never quite get things like 24fps panning judder down to imperceptible levels, like a real film projector can.
"Motion smoothing" on TVs is the first thing I disable, I really hate it.
I think I've seen like one out of a couple dozen where the motion smoothing was already off.
You watch the video with your eyes so it's not possible to get "odd"-looking lack of blur. There's no need to add extra motion blur on top of the naturally occurring blur.
In practice, I think the kind of blur that happens when you're looking at a physical object vs an object projected on a crisp, lit screen, with postprocessing/color grading/light meant for the screen, is different. I'm also not sure whatever is captured by a camera looks the same in motion than what you see with your eyes; in effect even the best camera is always introducing a distortion, so it has to be corrected somehow. The camera is "faking" movement, it's just that it's more convincing than a simple cartoon as a sequence of static drawings. (Note I'm speaking from intuition, I'm not making a formal claim!).
That's why (IMO) you don't need "motion blur" effects for live theater, but you do for cinema and TV shows: real physical objects and people vs whatever exists on a flat surface that emits light.
The most natural level of motion blur for a moving picture to exhibit is not that traditionally exhibited by 24fps film, but it is equally not none (unless your motion picture is recorded at such high frame rate that it substantially exceeds the reaction time of your eyes, which is rather infeasible)
Now, don't get me wrong, I'm a fan of pixel art and retro games.
But this reminds me of when people complained that the latest Monkey Island didn't use pixel art, and Ron Gilbert had to explain the original "The Curse of Monkey Island" wasn't "a pixel art game" either, it was a "state of the art game (for that time)", and it was never his intention to make retro games.
Many classic games had pixel art by accident; it was the most feasible technology at the time.
Monkey Island II's art was slightly more comic-like than say The Last Crusade but still with realistic proportions and movements so that was the expectation before CoMI.
The art style changing to silly-comic is what got people riled up.
(Also a correction: by original I meant "Secret of" but mistyped "Curse of").
I meant Return to Monkey Island (2022), which was no more abrupt a change than say, "The Curse of Monkey Island" (1997).
Monkey Island was always "silly comic", it's its sine qua non.
People whined because they wanted a retro game, they wanted "the same style" (pixels) as the original "Secret", but Ron Gilbert was pretty explicit about this: "Secret" looked what it looked like due to limitations of the time, he wasn't "going for that style", it was just the style that they managed with pixel art. Monkey Island was a state-of-the-art game for its time.
So my example is fully within the terms of the concept we're describing: people growing attached to technical limitations, or in the original words:
> [...] examples of "fetishizing accidental properties of physical artworks that the original artists might have considered undesirable degradations"
The industry decided on 24 FPS as something of an average of the multiple existing company standards and it was fast enough to provide smooth motion, avoid flicker, and not use too much film ($$$).
Over time it became “the film look”. One hundred-ish years later we still record TV shows and movies in it that we want to look “good” as opposed to “fake” like a soap opera.
And it’s all happenstance. The movie industry could’ve moved to something higher at any point; nothing stopped it but inertia. With TV being 60i, it would have made plenty of sense to go to 30p for film to allow them to show it on TV better once that became a thing.
But by then it was enshrined.
But for your point, back during the PAL/NTSC analog days, the physical color of the cars was set so that when viewed on analog broadcast the color would be correct (very similar to film scanning).
He worked for a different team but brought in a small piece of Ferrari bodywork, and it was more of a day-glo red-orange than the delicious red we all think of with Ferrari.
This is one of the tradeoffs of maintaining backwards compatibility and stewardship -- you are required to keep track of each "cause" of that backwards compatibility. And since the number of "causes" can quickly become enumerable, that's usually what prompts people to reinvent the wheel.
And when I say reinvent the wheel, I am NOT describing what is effectively a software port. I am talking about going back to ground zero, and building the framework from the ground up, considering ONLY the needs of the task at hand. It's the most effective way to prune these needless requirements.
(opposite meaning)
Funnily enough, e- means "out" (more fundamentally "from") and in- means "in(to)", so that's not an unexpected way to form opposite words.
But in this case, innumerable begins with a different in- meaning "not". (Compare inhabit or immiserate, though.)
Arguably true in general, but in this specific case everything I said was already true in Latin.
I first heard about this when reading an article or book about Jimi Hendrix making choices based on what the output sounded like on AM radio. Contrast that with the contemporary recordings of The Beatles, in which George Martin was oriented toward what sounded best in the studio and home hi-fi (which was pretty amazing if you could afford decent German and Japanese components).
Even today, after digital transfers and remasters and high-end speakers and headphones, Hendrix’s late-60s studio recordings don’t hold a candle to anything the Beatles did from Revolver on.
In the modern day, this has one extremely noticeable effect: audio releases used to assume that you were going to play your music on a big, expensive stereo system, and they tried to create the illusion of the different members of the band standing in different places.
But today you listen to music on headphones, and it's very weird to have, for example, the bassline playing in one ear while the rest of the music plays in your other ear.
If you're listening in a room with two speakers, having widely panned sounds and limited use of reverb sounds great. The room will mix the two speakers somewhat together and add a sense of space. The result sounds like a couple of instruments playing in a room, which it sort of is.
But if you're listening with a tiny speaker directly next to each ear canal, then all of that mixing and creating a sense of space must be baked into the two audio channels themselves. You have to be more judicious with panning to avoid creating an effect that couldn't possibly be heard in a real space and add some more reverb to create a spatial environment.
Don't ask me how it works but I know gaming headsets try to emulate a surround setup.
One example:
> The crossfeed feature is great for classic tracks with hard-panned mixes. It takes instruments concentrated on one channel and balances them out, creating a much more natural listening experience — like hearing the track on a full stereo system.
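To make the idea concrete, the core of crossfeed is simple enough to sketch: bleed a little of each channel into the other so that hard-panned material no longer sits entirely in one ear. A minimal Python/NumPy illustration, assuming a float (n_samples, 2) stereo buffer; the bleed amount is arbitrary, and real crossfeed filters also delay and low-pass the bled signal to mimic how sound wraps around the head:

    import numpy as np

    def crossfeed(stereo, bleed=0.3):
        # stereo: float array of shape (n_samples, 2); bleed: fraction of the
        # opposite channel mixed in (0 = untouched, 1 = fully mono).
        # Values are illustrative, not taken from any particular product.
        stereo = np.asarray(stereo, dtype=float)
        left, right = stereo[:, 0], stereo[:, 1]
        out = np.empty_like(stereo)
        out[:, 0] = (left + bleed * right) / (1.0 + bleed)  # keep level roughly constant
        out[:, 1] = (right + bleed * left) / (1.0 + bleed)
        return out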
https://www.youtube.com/watch?v=3Gmex_4hreQ
If you want a recent-ish album to listen to that has good sound, try Daft Punk's Random Access Memories (which won the Best Engineered Album Grammy award in 2014). Or anything engineered by Alan Parsons (he's in this list many times)
https://en.wikipedia.org/wiki/Grammy_Award_for_Best_Engineer...
Is this still a problem? Your example video is from nearly twenty years ago, and RAM is over a decade old. I think the advent of streaming (and perhaps lessons learned) has made this less of a problem. I can't remember hearing any recent examples (but I also don't listen to a lot of music that might fall victim to the practice); the Wikipedia article lacks any examples from the last decade: https://en.wikipedia.org/wiki/Loudness_war
Thankfully there have been some remasters that have undone the damage. Three Cheers for Sweet Revenge and Absolution come to mind.
Because of this it generally makes more sense these days to just make your music have an appropriate dynamic range for the content/intended usage. Some stuff still gets slammed with compression/limiters, but it's mostly club music from what I can tell.
And of course it would have all the dirty words removed or changed. Like Steve Miller Band's "funky kicks going down in the city" in Jet Airliner
I still don't know if the compression in the Loudness War was because of esthetics, or because of the studios wanting to save money and only pay for the radio edit. Possibly both - reduced production costs and not having to pay big-name engineers. "My sister's cousin has this plug-in for his laptop and all you do is click a button"...
Upping the gain increases the relative "oomph" of the bass at the cost of some treble, right?
As a 90s kid with a bumping system in my Honda, I can confidently say we were all about that bass long before Meghan Trainor came around. Everyone had the CD they used to demo their system.
Because of that, I think the loudness wars were driven by consumer tastes more than people will admit (because then we'd have to admit we all had poor taste). Young people really loved music with way too much bass. I remember my mom (a talented musician) complaining that my taste in music was all bass.
Of course, hip hop and rap in the 90s were really bass heavy, but so was a lot of rock music. RHCP, Korn, Limp Bizkit, and Slipknot come to my mind as 90s rock bands that had tons of bass in their music.
Freak on a Leash in particular is a song that I feel like doesn't "translate" well to modern sound system setups. Listening to it on a setup with a massive subwoofer just hits different.
It wasn't the bass, but rather the guitar.
The bass player tuned the strings down a full step to be quite loose, and turned the treble up which gave it this really clicky tone that sounded like a bunch of tictacs being thrown down an empty concrete stairwell.
He wanted it to be percussive to cut through the monster lows of the guitar.
https://www.izotope.com/en/learn/mastering-trends?srsltid=Af...
I have an Audio Developer Conference talk about this topic if you care to follow the history of it. I have softened my stance a bit on the criticism of the 90’s (yeah, people were using lookahead limiting over exuberantly because of its newness) but the meat of the talk may be of interest anyway.
There's a crowdsourced database of dynamic range metrics for music at:
You can see some 2025 releases are good but many are still loudness war victims. Even though streaming services normalize loudness, dynamic range compression will make music sound better on phone speakers, so there's still reason to do it.
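As a rough illustration of what those metrics capture: one crude proxy for dynamic range is the crest factor, the gap between peak level and average (RMS) level. A heavily limited "loudness war" master has peaks barely above its average level, so the number comes out small. A small NumPy sketch (not the actual windowed DR algorithm such databases use):

    import numpy as np

    def crest_factor_db(samples):
        # Crude dynamic-range proxy: peak vs. RMS level, in dB.
        # Squashed "loudness war" masters score low; dynamic masters score high.
        samples = np.asarray(samples, dtype=float)
        peak = np.max(np.abs(samples))
        rms = np.sqrt(np.mean(samples ** 2))
        return 20.0 * np.log10(peak / rms)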
IMO, music production peaked in the 80s, when essentially every mainstream release sounded good.
Whoa there! Audio components were about the only thing the British still excelled at by that time.
I was specifically thinking of the components my father got through the Army PX in the 60s and the hi-fi gear I would see at some friends' houses in the decades that followed ... sometimes tech that never really took hold, such as reel-to-reel audio. Most of it was Japanese, and sometimes German.
I still have a pair of his 1967 Sansui speakers in the basement (one with a blown woofer, unfortunately) and a working Yamaha natural sound receiver sitting next to my desk from about a decade later.
These versions were made for radio only and were thought of as cheap when performed in person.
Later these performances were recorded, and since they were the only versions recorded, later generations assumed this was how the masters of the time did things, when really they would have been booed off stage (so to speak).
This is a bit of family history passed down through multiple generations of violin players.
In Toy Story's case, the digital master should have had "correct" colors, with the tweaking done in the transfer-to-film step. It's the responsibility of the transfer process to make sure that the colors are right.
Now, counter arguments could be that the animators needed to work with awareness of how film changes things; or that animators (in the hand-painted era) always had to adjust colors slightly.
---
I think the real issue is that Disney should know enough to tweak the colors of the digital releases to match what the artists intended.
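Mechanically that kind of tweak isn't hard; the hard part is knowing the right numbers. A toy Python/NumPy sketch of the sort of grade that could be applied to a digital master to chase a film-print look; the per-channel gammas and saturation value here are made-up placeholders, whereas a real restoration would use a measured 3D LUT of the film stock's response, not hand-picked constants:

    import numpy as np

    def film_print_grade(rgb, saturation=1.15, gamma=(1.0, 0.95, 1.05)):
        # rgb: float array in [0, 1], shape (h, w, 3). Parameter values are
        # placeholders for illustration, not Pixar's (or anyone's) real numbers.
        rgb = np.clip(rgb, 0.0, 1.0)
        # per-channel gamma to mimic how a print shifts the colour balance
        graded = np.stack([rgb[..., c] ** gamma[c] for c in range(3)], axis=-1)
        # push each pixel away from its own luminance to adjust saturation
        luma = graded @ np.array([0.2126, 0.7152, 0.0722])
        graded = luma[..., None] + saturation * (graded - luma[..., None])
        return np.clip(graded, 0.0, 1.0)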
Could it be the case that generating each digital master required thousands of render hours?
They had a custom built film printer and could make adjustments there.
Otherwise, I wish I worked at a place like Oxide that does RFDs. https://rfd.shared.oxide.computer Just a single place with artifacts of a formal process for writing shit down.
In your example, writing down "The greens are oversaturated by X% because we will lose a lot of it in the transfer process to film" goes a long way toward making people aware of the decision and why it was made. At least then "hey, the boosted greens actually look kinda nice" can prompt a "yeah, but we only did that because of the medium we were shipping on; it's wrong."
Even with complete attention to detail, the final renders would be color graded using Flame, or Inferno, or some other tool and all of those edits would also be stored and reproducible in the pipeline.
Pixar must have a very similar system and maybe a Pixar engineer can comment. My somewhat educated assumption is that these DVD releases were created outside of the Pixar toolchain by grabbing some version of a render that was never intended as a direct to digital release. This may have happened as a result of ignorance, indifference, a lack of a proper budget or some other extenuating circumstance. It isn't likely John Lasseter or some other Pixar creative really wanted the final output to look like this.
I suspect having shader plugins for TV and movie watching will become a thing.
"The input is supposed to be 24 FPS, so please find those frames from the input signal. Use AI to try to remove compression artifacts. Regrade digital for Kodak 35mm film. Then, flash each frame twice, with blackness in-between to emulate how movie theaters would project each frame twice. Moderate denoise. Add film grain."
I don't actually know what kind of filters I'd want, but I expect some people will have very strong opinions about the best way to watch given movies. I imagine browsing settings, like browsing user-contributed Controller settings on Steam Deck...
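As a sketch of what one per-frame stage of such a plugin might look like, here's an OpenCV/NumPy toy version of the regrade/denoise/grain steps. The cadence detection and the "flash each frame twice" step are display/timing concerns and aren't shown, and the curve and grain strengths are invented numbers, not any real filter's settings:

    import cv2
    import numpy as np

    def film_look_filter(frame_bgr, rng=np.random.default_rng(0)):
        # frame_bgr: 8-bit BGR frame as decoded by OpenCV.
        # 1) mild denoise -- a crude stand-in for a smarter artifact cleaner
        frame = cv2.GaussianBlur(frame_bgr, (3, 3), 0)
        # 2) gentle contrast curve via a lookup table, standing in for a film regrade
        x = np.arange(256, dtype=np.float32) / 255.0
        curve = np.clip(0.5 + 1.1 * (x - 0.5) + 0.15 * (x - 0.5) ** 3, 0.0, 1.0)
        frame = cv2.LUT(frame, (curve * 255).astype(np.uint8))
        # 3) additive monochrome grain
        grain = rng.normal(0.0, 4.0, frame.shape[:2]).astype(np.float32)
        frame = np.clip(frame.astype(np.float32) + grain[..., None], 0, 255)
        return frame.astype(np.uint8)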
Some of the more advanced CRT shaders actually attempt to mathematically model how the video gets distorted by the CRT and even by the component-video cabling. If the effects of converting to film are so well understood that Pixar can adapt their film for the process, then it ought to be possible to post-process the video in a way that reproduces those artifacts.
I don't think it's possible for it ever to be exactly the same, since the display technology of a monitor is fundamentally different from a film projector (or a CRT), but it should be possible to get it good enough that it's indistinguishable from a photo of the film being displayed on a modern monitor (i.e. the colors aren't completely different like in the comparisons in the article).
BTW, TFA didn't mention this, but about 15 years ago they re-rendered Toy Story and Toy Story 2 for a new theatrical run when those gimmicky 3D glasses were popular. If that's the version that's being distributed today on Disney Plus and Blu-ray (I don't know, but I feel like it probably is), then that could potentially be a more significant factor in ruining the color balance than not having been converted to film.
Noodle made a charming video about going mad researching this: https://www.youtube.com/watch?v=lPU-kXEhSgk
Having said all that, one of the most interesting aspects of conversations around the "true" version of films is that, just because of the way time works, the vast majority of people's first experience with any film will definitely NOT be in a theater.
The DVD was such a huge seller and coincided with the format really catching on. The Matrix was the "must have" DVD to show off the format and for many was likely one of the first DVDs they ever purchased.
It was also the go-to movie to show off DivX rips.
The popularity of The Matrix is closely linked with a surge in DVD popularity. IIRC DVD player prices became more affordable right around 2000 which opened it up to more people.
This is shocking to say the least.
My boring, 17" consumer Trinitron monitor in 1995 could do 1600x1200 IIRC.
Max resolution and pixel density (plus, for a long time, color gamut, contrast/depth-of-black, latency, etc.) on typical monitors took a huge dive when LCDs replaced CRTs. "High res" in 1995 would probably qualify as fairly high-res today, too, if not quite top-of-the-heap. Only expensive, relatively recent displays in the consumer space really beat those '90s CRTs overall to a degree where it's not at least a close call (though they may still be a little worse in some ways).
Toy Story is the only Pixar movie ever released on Laserdisc (along with all their shorts, in the same box set). Disney also released a lot of their 90s animation on Laserdisc.
So if you're a true cinephile, seek out the Laserdisc versions.
And I don't think I'm even being bitten by the nostalgia bug per se, because it was already a nostalgia item, long gone from any cinema near me, by the time I grew up.
The challenge is that everybody's memory is different, and sometime those memories are "I wish the graphics were rock sharp without the artifacts of the CRT". Other times our memories are of the crappy TV we were given as kids that was on its last legs and went black & white and flickered a lot.
The reality is that no matter what the intentions of the original animation teams were, the pipeline from artwork through film transfer to projection to reflection to the viewer's own eyeballs and brain has enough variety in it that it's simply too variable -- and too personal -- to really say what is correct.
Anecdote: one of the local theaters I grew up with was extremely poorly maintained, had a patched rip on one of the several dirty screens, and had projectors that would barely get through an hour of film without needing a "bump" from the projectionist (allowing the audience to go out and get more refreshments halfway through most films). No amount of intentionality by the production companies of the many films I saw there could have accounted for any of that. But I saw many of my favorite movies there.
I've come around to the opinion that these things are like wine. A good wine is the one you enjoy. I have preferences for these things, but they sometimes change, and other people are allowed to enjoy things in their own way.
I do in fact still have Toy Story on VHS and recently watched a bit of it with my toddler. And while I'm sure the Blu-ray or streamed version is higher resolution, widescreen, and otherwise carries more overall video and audio data than our tape, I personally got a bit of extra joy out of watching the tape version on our old TV.
I never considered the color differences pointed out in the article here, and I'm not sure how they appear on home VHS vs on 35mm. Maybe that is a small part of what makes the tape more appealing to me although I don't think it's the full reason. Some feelings are difficult to put into words. Tapes on a full aspect ratio CRT just give me a certain feeling or have a specific character that I still love to this day.
To be the man responsible for the creation of Pixar, Industrial Light & Magic, Skywalker Sound, Lucasfilm Games, THX, and Kerner Optical is a very impressive accomplishment, and that's secondary to the main accomplishment he's known for: Star Wars.
Now, while I liked the first three, the first one always has a special place, because at the time it was really quite new. Fully computer-animated movies were rare. Pixar did several short videos before Toy Story, and I think there were some other movies too, give or take, but Toy Story changed everything after that. Unfortunately, many computer-generated movies are absolute garbage nowadays. The big movie makers want money and don't care about anything else, so they ruin the interest of people who are not super-young anymore, because let's face it: older people are less likely to watch the latest Marvel 3D animated zero-story movie that is a clone of prior clones.
It would be nice if AI, despite also sucking to no end, could allow us to produce 3D movies with little effort. I have a fantasy game world my local pen-and-paper RPG group built. It would be interesting to feed it a ton of data (we have generated all of that over decades) and have it come up with an interesting movie that tells the story of a part of it. This is just one example of many. Quality-wise I still like Toy Story -- I find it historically important, and it was also good for its time (all first three actually, although the storylines got progressively weaker; I did like Ken and Barbie, though, but too much of the story seemed to go into wanting to milk more money out of selling toys rather than telling a story. Tim Allen as Buzz Lightyear was always great, though.)
It might be a fun experiment to make custom rips of these movies that look more like their theatrical releases. I'm curious how close one can get without needing to source an actual 35mm print.