[QUOTE="2Chalupas"]
What you said is all wrong. There's no such thing as a "native resolution" for films "all throughout history", because films weren't shot digitally throughout history. They were shot on FILM, an analog format with no native resolution. There are theoretical limits on how much information you can extract, but there's no such thing as a "native resolution".
Heck, there have already been a few select films utilizing 8K processes, let alone 4K. Classics such as Ben Hur and Lawrence of Arabia were shot on 65mm film and, in preparation for their Blu-ray releases, were actually scanned at 8K resolution before being downscaled to 1080p for home video. 4K is actually pretty standard these days, not for home video, but in the mastering process. Many 35mm films utilized a 4K process in their "remastering" for Blu-ray. 4K supposedly is about "as good as it gets" as far as extracting data off of 35mm film elements. Obviously for some really old films shot on 35mm a 4K scan will add almost no value, but for the sharpest, most pristine 35mm films 4K will be an added benefit. More importantly for the "evidence" you seek, it already exists in the production pipeline (i.e. the "masters" are already sitting there waiting in 4K for a large number of films). I'm not going down the list of films scanned in 4K already, because it would probably be too long.
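(Just to put those resolutions in perspective, here's the rough pixel math; the frame sizes below are the common consumer/DCI ones, not any particular studio's scan dimensions.)

[CODE]
# Rough pixel counts for the resolutions mentioned above.
# These are the standard consumer/DCI frame sizes; actual film scans
# vary in aspect ratio and exact dimensions.
resolutions = {
    "8K UHD": (7680, 4320),
    "4K DCI": (4096, 2160),
    "1080p":  (1920, 1080),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {w}x{h} = {pixels:,} pixels "
          f"(~{pixels / base:.1f}x the pixels of 1080p)")
[/CODE]

So an 8K scan holds roughly sixteen times the pixels of the 1080p disc it gets downscaled to, and a 4K master roughly four times.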
peterw007
Nice point. You did prove me wrong.
But even if the film technically retains more detail, it doesn't mean the viewer will be able to perceive that detail.
(For reference, "HD" here refers to 1080p resolution.)
There is an international study on this issue, called Image Resolution of 35mm Film in Theatrical Presentation. It was conducted by Hank Mahler (CBS, United States), Vittorio Baroncini (Fondazione Ugo Bordoni, Italy), and Matthieu Sintas (CST, France).
In the study, MTF measurements were used to determine the typical resolution of theatrical release prints and answer prints in normal operation, utilizing existing state-of-the-art 35mm film, processing, printing, and projection.
The prints were projected in six movie theaters in various countries, and a panel of experts made the assessments of the projected images using a well-defined formula.
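For anyone unfamiliar with the term, MTF is basically contrast (modulation) measured at a given spatial frequency: the modulation left in the projected image divided by the modulation of the original test target. A bare-bones sketch of the idea (illustrative numbers only, not the study's actual procedure):

[CODE]
# Modulation (Michelson contrast) of a sinusoidal test pattern:
#   M = (I_max - I_min) / (I_max + I_min)
# MTF at a given spatial frequency = modulation measured in the image
# divided by the modulation of the original target at that frequency.
def modulation(i_max: float, i_min: float) -> float:
    return (i_max - i_min) / (i_max + i_min)

def mtf(image_max, image_min, target_max, target_min):
    return modulation(image_max, image_min) / modulation(target_max, target_min)

# Hypothetical numbers: a full-contrast target, projected so that fine
# detail washes out to luminance levels of 60 and 40.
print(mtf(60, 40, 100, 0))  # -> 0.2, i.e. 20% of the contrast survives
[/CODE]

The lower that number gets at fine-detail frequencies, the less of the film's theoretical resolution actually survives printing and projection.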
As the study indicates, perceived differences between HD and 35mm film are quickly disappearing. Notice I use the word "perceived." This is important because we are not shooting a movie for laboratory study, but rather for audiences.
At this point, the typical audience cannot see the difference between HD and 35mm. Even professionals have a hard time telling them apart.
Industry Expert
Who cares about 4K if people can't see the difference on anything but a massive screen?
Initially it will only be for home theater freaks (cinephiles/audiophiles) to watch movies at the best possible quality.
I only have a 42" myself and it's not like I'm going to pay $50,000 (or even $5000) or whatever for a 4K set or a 4K projector. For me it's just something to keep an eye on, a projector might be a good way to go initially when the prices become reasonable. I agree with you that having a 24" or 34" set with 4K resolution is rather pointless, but that isn't going to happen for many many years anyway - kind of like how 1080p worked it's way from high end down to the smallest screen now being 1080p. Initially it will be just projectors and the very biggest screens. HDTV's basically pushed 42" screens into the maintream, so 4K maybe will bump up the "standard" size to like 70" or perhaps more people will look at having 4k projectors onto an even larger screen ;)
I don't really buy into "studies" like that. You can find studies that claim people can't tell MP3 files from uncompressed audio, and that's kind of ridiculous if you are listening on anything but an iPod. For gaming purposes it's like the average gamer saying they can't see 60fps vs 30fps, or can't see screen tearing that other people clearly notice. Dubious claims at best.