So there’s this music video floating around
the Interwebs.
Something about giving your cardiac muscle as a Christmas gift,
and being distraught upon its subsequent re-gifting?
Anyway, I checked it out, and people are just
FREAKING OUT
about how good the video looks.
“The quality is so good, looks like people
in 2019 trying to dress like people in the ‘80s!”
“Why is nobody talking about
how amazing this 4K transfer is?”
Well, I am, and in fact so many people are that it’s
getting news coverage.
And some people are left wondering how they managed to do it.
Well, the answer is simple. The music video
was shot on film, and somebody had the master
(or a very good print) transferred in 4K.
But wait, you say, "I was alive in the ‘80s, and everything looked all fuzzy all the time.
There’s no way my favorite hits from MTV would survive into the 2020’s!"
You’re right!
But also.
You’re wrong!
It all depends on how the material you’re watching has been produced.
Was it shot on
film?
Or was it shot on tape?
And while there was never a clear rule-of-thumb for what technology was used under what circumstances,
there are a few tell-tales which will reveal which is
which.
So let’s talk about those, as well as what the difference is between film and tape.
Just as a note, this discussion takes place prior to the advent of
digital video and high definition television.
So, for the most part, we’re talking about stuff made
in the 1990’s or earlier,
with a hint of the early 2000’s peppered in.
Photography may seem like a trivial thing
today, but it wasn’t so easy for the vast
majority of its history.
The goal of this video isn’t to go through the entire history of photography,
so we’re gonna fast forward to about 1900 when we’ve settled on this film stuff.
Photographic film consists of
a substrate covered in a photosensitive goop
called the emulsion.
Silver halide crystals
in this goop will undergo a chemical change
when they are hit with photons.
Specifically,
tiny seed particles of metallic silver will
form on the surface of these crystals when
incoming photons liberate an electron from
individual atoms on the halide crystals.
If you focus light onto the film with a lens
and expose it briefly with the aid of a shutter,
the incoming photons will form what’s called
the latent image.
Wherever the film has been exposed to light,
some of those crystals will
now have trace amounts of metallic silver
hanging out on the surface.
To obtain a usable
image, a chemical process called development
will cause the metallic silver in the exposed
halide crystals to spread and turn the entire
crystal into solid silver.
The metallic silver
then blocks light from going through the film,
and the result is that areas exposed with
light become dark, and a negative image is
now captured on the film.
This is what film negatives look like.
You can see the images quite clearly,
and if we use magnification we can see the individual
silver crystals.
This is known as the film’s grain, and depending on the properties of the film,
these grains may be larger, smaller, or even differently shaped.
The grain of the
film is what defines the image.
While these certainly can’t be called pixels, at least
not in the way we use that term today,
they represent the smallest amount of detail that
the film can capture.
However, because the grains are distributed randomly,
and since they are themselves found in all sorts of shapes and sizes,
we can’t exactly define
the resolution in any numerical terms other than
“X number of grains per millimeter
on average”.
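If you’re wondering how you’d even arrive at a number like that, here’s a rough sketch of the idea: scan a crop of the negative whose physical size you know, count the distinct grain blobs, and average them out. Every value and function name below is an illustrative stand-in, not a measurement of any real film.

```python
# Rough sketch of how you'd put a number on "grains per millimeter on average":
# scan a crop of the negative at a known physical size, threshold it, and count
# the distinct dark blobs. All values here are illustrative stand-ins.
import numpy as np
from scipy import ndimage

def average_grains_per_mm(crop: np.ndarray, crop_width_mm: float,
                          crop_height_mm: float, threshold: float = 0.3) -> float:
    """crop: grayscale scan with values 0..1 (0 = dark, developed silver)."""
    binary = crop < threshold                 # dark blobs = developed grains
    _, blob_count = ndimage.label(binary)     # count connected components
    blobs_per_sq_mm = blob_count / (crop_width_mm * crop_height_mm)
    return blobs_per_sq_mm ** 0.5             # crude linear density estimate

# Toy example: random noise standing in for a real 1 mm x 1 mm scanned crop.
fake_crop = np.random.rand(1024, 1024)
print(f"~{average_grains_per_mm(fake_crop, 1.0, 1.0):.0f} grains per mm (toy numbers)")
```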
Over the course of film’s development, and
I mean technological development, we created
better film that was more sensitive to light
(sometimes known as faster film),
we created color films,
we created sharper films,
and we created safer film that wouldn’t burst into flames if the nitrate substrate were to get too hot.
But throughout the entire history of film, into present day, the basics are the same.
Tiny grains of silver halides react to light, and these form the base of the image after development.
In color films the silver halides are used as a sort of intermediary
and are replaced with color dyes in the development process,
but the original grain structure
remains intact and visible.
Of course, where there’s photography,
motion pictures are soon to follow!
The motion picture cameras used in Hollywood would typically use film just like this,
though instead of being run horizontally through a camera like these have been,
it would be run vertically, using a smaller portion of the film with each exposure.
This film is 35mm across, which gave it the name
35mm.
(shocker)
A typical motion picture
camera would expose the film 24 times in a second,
creating a frame rate of 24 frames
per second.
And if you create a positive print from these original negatives,
then run it through a projector which will shine light through it and project it onto a surface,
you’ll be able to recreate the illusion of movement!
This was the only way we knew how to make
moving pictures happen for a long while.
We took the technology of the photograph, sped
it up, and projected it.
Each individual film frame was itself just a photograph,
made up of those tiny silver grains,
and with surprisingly high visual fidelity.
And we were all having fun watching these amazing movies at the cinema,
enjoying our popcorn, socializing with the
neighbors, and then a little thing happened
called television.
Television.
An invention with limitless potential.
When it works.
One day, I predict millions of people might learn Latin and Greek sitting in front of their TV sets.
Anyway, with television, we were building images in a whole new way.
With ELECTRONICS.
Analog television is some fascinating stuff, and I’ve made a number of videos about it,
which you can check out in this playlist.
But here’s the gist.
A scanning electron beam creates a pattern
of light on the face of a cathode ray tube (CRT)
called a raster.
This raster is a series of
horizontal lines.
Variations in the intensity of the beam will create variations in brightness as the line is drawn.
By synchronizing the scanned raster of the television with the scanning raster of a television camera,
we can instantly transmit moving images across
the globe.
All it takes is a television set and favorable receiving conditions.
In the earliest days of television, all television
was produced live.
We didn’t have a way to store television signals to be rebroadcast later.
Eventually, we found that we could just adapt motion picture film for use in television
using devices called telecines and kinescopes.
While perhaps a primitive
and brute-force way to go about it, these worked.
All you needed to do was to film a
television screen to capture it on film.
And to rebroadcast it, you point a television
camera at a movie screen and tada!
Now, to be clear it was a little less janky than that,
but indeed the principle was
“film a TV”
and “Broadcast a movie”
But this was kind of annoying. For one thing,
television was created using interlaced video,
meaning we had a refresh rate of 60 Hz in
the US.
Typically, when stored on film, the frame rate would be cut in half to 30 frames per second,
if not dropped further to 24, reducing the fluidity of motion.
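To put rough numbers on those rates, here’s the quick arithmetic behind the NTSC raster and its 60 Hz refresh. These are the standard monochrome-era figures; the later color standard nudges them slightly, to roughly 29.97 frames and 59.94 fields per second.

```python
# Rough NTSC raster arithmetic (nominal monochrome-era values).
total_lines_per_frame = 525      # includes lines hidden in vertical blanking
visible_lines = 480              # roughly what actually carries picture
frames_per_second = 30
fields_per_frame = 2             # interlacing: odd lines, then even lines

field_rate = frames_per_second * fields_per_frame        # 60 Hz "refresh"
line_rate = total_lines_per_frame * frames_per_second    # 15,750 lines/s

print(f"Field (refresh) rate: {field_rate} Hz")
print(f"Horizontal line rate: {line_rate:,} lines per second")
print(f"Visible picture lines per frame: {visible_lines}")
```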
But the most annoying thing was
that film is expensive!
You can’t re-use it, it takes time and effort to develop it,
and in general it presented a limitation to television production.
Enter, videotape!
The Quadruplex recorder
by Ampex was the first real success in this field.
Suddenly, we had a way to store the
electronic signals of television…
electronically.
By writing them directly onto tape, we’d
have the full 60 Hz refresh rate. In fact,
we’d have a literal television signal ready
to be re-transmitted at any time, and most
importantly, the tape could be re-used.
Now television could be recorded earlier in the day, broadcast later that evening,
and the same tape could be re-used for the next day.
Much to the chagrin of archivists.
But anyway...
Here’s the key thing.
Videotape is simply
storing a television signal.
And that television signal, unlike our friend Film, is not a physical object
made of microscopic silver crystals that we can see with our own eyes.
Granted, the tape is made of microscopic particles,
but they don’t hold an image. It’s just
the television raster frozen in time, which
means it does have a limited and defined resolution.
That raster?
That’s as much resolution as you can ever have, and for NTSC televisions,
like this one, that’s only 480 lines.
PAL countries got closer to 600 lines, but still.
That’s it.
I’m gonna stick to NTSC numbers from now on, so back to 480.
Those 480 lines are sort of like the speed
limit of analog television.
Television is a technical standard, after all.
You can’t exceed 480 lines of resolution or else you’ve made something that no television set can display.
So if you’re using a television camera, you’ve got those 480 lines.
No more.
No less.
The best videotape will record the television signal almost verbatim, but that’s still just…
are you ready?
480 lines.
Which today we call Standard Definition.
So, anything that was mastered on tape will
forever be limited to those 480 lines of resolution.
We can’t get any more detail out of it unless
we start doing some goofy AI stuff.
But film?
That’s an entirely different story.
Film isn’t an electronic signal following a technical standard.
It’s just a bunch of microscopic
silver crystals!
So even though we might have used film to make something for TV,
if we still have the film we can go back,
transfer the film with modern technology,
and suddenly
have the same thing in MUCH higher quality.
When shooting with film, the result will be
as sharp as the person could focus and as
fine as the grain of film is.
This is why some older television shows, like
Star Trek: The Next Generation,
are looking great into the 2020’s.
The producers of
TNG had decided to shoot on film.
Why might they do that? Well, while it would make the
production more expensive, it also gave them
tons of flexibility in the editing.
Because film has a high level of detail, you can crop the frame without impacting the quality
as it appears on-screen.
But best of all, this decision paid future dividends.
Thanks to shooting on film, we aren’t stuck with a bunch of tapes in standard definition.
We have the original film negatives that we can
scan at modern resolutions and enjoy in the
highest quality possible.
But lots of television shows didn’t go that
route.
It’s easy to argue that,
"well if we’re making this for TV, why would we use
film?
If everyone’s watching this on a standard definition television,
what’s the benefit
of exceeding that quality aside from, maybe,
some freedom in the edit bay?"
And so, other shows, like Star Trek: Voyager, shot directly to tape.
Quick correction / clarification.
The actual live-action was shot on film, which
in hindsight seems pretty obvious considering
the refresh rate is definitely NOT 60 Hz...
I haven’t actually seen any of Voyager until
now… forgive me.
The catch with Voyager
was that the film was immediately transferred
onto videotape,
and all of the editing and special effects (so basically the entirety of post-production) were done on tape at
NTSC resolution. Anyway...
Which in fairness didn’t matter at all back then, since it was only seen on the tube TVs of the day.
In fact you might argue it was better than film, since you had a true 60 Hz refresh rate again.
If you like that
sort of thing.
But that decision means we cannot improve
the quality beyond how it was in 1995.
It’s stuck that way forever.
The video is hard-coded
onto magnetic tape, either in the analog domain
or (in later years) in the digital domain,
and there’s literally no room for improvement.
If it was shot in standard definition, standard
definition it shall forever be.
And so, that music video?
It was simply shot
on this stuff!
While nobody saw it on film back in the ‘80s, at least aside from the people doing the editing,
that film was always there in storage somewhere,
ready to be transferred
a second time using today’s technology.
And now that it’s been done, it’s
blowing everybody’s mind.
So. How can you tell if something’s been
shot on film vs. tape?
Well, first of all, if it wasn’t produced for television, as
in, if it’s a movie,
it was shot on film.
Digital cinema wasn’t viable until the 21st century, so if it’s a movie from the
‘90s or earlier, that’s film.
And this is why so many old movies have been reissued on Blu-ray with great results.
We have the film prints to scan in HD, or even 4K.
Unless of course they’ve been lost.
Now if it was produced for television, the
easiest thing to look for is the soap opera effect.
This is the name given to the oddly
smooth motion commonly seen in soap operas
which were, I believe without exception, always
shot to tape.
Television programming that was shot on tape will be just as smooth as that,
because the temporal resolution of 60 Hz is maintained.
In contrast, film is usually at 24 frames
per second, even when used for television,
so the motion won’t be nearly as smooth.
If you see a so-called “cinematic” framerate
for television programming, odds are that
was shot on film.
You might on rare occasion see something filmed at 30 frames per second,
which is the frame rate I’m using with this camera, but I can’t see much evidence of this being a thing.
In the US, anyway, most TV that was shot on film was done at 24 frames per second,
and 3:2 pulldown was used to fit the 60 Hz refresh rate of television.
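If you want to see how 24 film frames get stretched across 60 fields, here’s a tiny sketch of that 3:2 (really 2:3) cadence. The frame and field labels are purely illustrative; this is just the classic repeating pattern, not any particular telecine’s implementation.

```python
# A minimal sketch of 3:2 (a.k.a. 2:3) pulldown: every group of four film
# frames is spread across ten interlaced video fields, so 24 fps film fills
# a ~60 fields-per-second NTSC signal.
def pulldown(film_frames):
    cadence = [2, 3, 2, 3]   # fields held per film frame, repeating
    fields = []
    for i, frame in enumerate(film_frames):
        for _ in range(cadence[i % len(cadence)]):
            parity = "odd" if (len(fields) % 2 == 0) else "even"
            fields.append(f"{frame}-{parity}")
    return fields

film = ["A", "B", "C", "D"]          # four film frames...
video_fields = pulldown(film)        # ...become ten video fields
print(video_fields)                  # ['A-odd', 'A-even', 'B-odd', 'B-even', 'B-odd', ...]
print(f"{60 * len(film) / len(video_fields):.0f} film frames per second fill a 60-field signal")
```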
Now the refresh rate may not tell you the
whole story.
For instance, lots of times you’ll find something online that was originally recorded in 60 Hz interlaced,
but now exists as 30 frames per second progressive.
So that can be hard to tell.
But if you ever see the buttery smooth motion of true analog video, that’s for sure on tape.
Again. We’re talking old stuff.
People seem to increasingly like 60 Hz video these days, which…
yeah sure whatever floats your boat.
Another thing to look for are actual film
artifacts.
If you see specks or scratches,
odds are that’s film.
Those sorts of things
just don’t happen on videotape.
It’s not like there aren’t any artifacts at all with
tape, but they’re usually very minor and
incredibly brief. At least on anything recorded
professionally.
And if you see any sort of grain in the image… that’s film.
Something recorded from tape wouldn’t have grain to begin with, so whenever it’s visible, that footage was
(at some point anyway) on film.
Note that digital noise can look sorta like
grain, but you’re not likely to see that
on things produced professionally for television.
Other caveat, dot crawl on analog video tape
kinda looks like grain, but it’s not that
random.
One of the easiest film artifacts to spot
is known as gate weave.
Because film traveling through a camera is a mechanical affair,
the frame can’t ever be perfectly aligned with each exposure.
Slight misalignments in the
camera’s gate with the film from one frame to the next
create a slight shakiness to the end result,
so if the frame ever seems slightly unstable,
there’s a pretty good chance that was on
film.
A lot of modern film restorations have removed gate weave digitally, so
this can’t be entirely relied upon as a means to spot a film source.
What do you mean this sounds
dubbed in?
This was totally recorded at the same time as everything el --
A lot of television programming in the ‘60s,
‘70s, and ‘80s would use a mix of film and tape.
Often you’d see tape being used
in the studio, and film being used for location shoots.
One particular example is Monty Python’s
Flying Circus.
Here, tape is clearly used whenever on-set, and film is used whenever on-location.
A large part of this is simply down to the fact that video equipment was for many years
huge, monstrous stuff that wasn’t particularly portable,
so for location
shoots, a much less bulky, all-in-one 16mm film camera
(and a sound guy on the side)
would fit the bill just fine.
In this case, it’s easier to notice film artifacts because
they used 16mm film, and not 35mm,
but even just looking for the difference in motion
between studio sketches and location sketches
will help you spot the difference.
But the number one sign of something shot
on tape is…
it looks like it was shot on tape.
OK, this isn’t helpful, I know, but
honestly you sorta get a feel for this after a while.
Analog video is just kinda smooth
and soft and fuzzy, and film just…
isn’t.
If you’ve seen a tape of something that
was originally on film, like for instance
a VHS tape of pretty much any movie, you’ll
get that smooth look, sure.
But even there…
you can often tell a film source from a tape
source.
It just has a… look to it.
Yeah that’s not helpful.
Well, here, let’s look at another example.
Cosmos.
Just like Monty Python, Carl Sagan
was being shot on tape in the studio, and
on film when on location.
Even though this is just from a VHS copy, you can definitely see a difference between the studio’s video
cameras and the film cameras.
The huge asterisk on this video, of course,
is that digital video in standard definition
exists.
Ever heard of DVDs?
Those aren’t analog, but they are still capped at the 480 lines that this thing uses.
But now we have
each of those lines subdivided digitally into
720 pixels.
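Just for a sense of scale, here’s the quick pixel math comparing that digital standard-definition frame to the HD and UHD formats we use now. These are the standard published frame dimensions.

```python
# Pixel-count comparison: NTSC-derived digital SD versus modern HD/UHD.
formats = {
    "DVD / digital SD (NTSC)": (720, 480),
    "720p HD":                 (1280, 720),
    "1080p Full HD":           (1920, 1080),
    "4K UHD":                  (3840, 2160),
}
sd_pixels = 720 * 480
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name:24s} {w}x{h} = {pixels:>9,} px  ({pixels / sd_pixels:4.1f}x SD)")
```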
And in the professional realm,
there were plenty of digital recording formats
being used in the latter part of the standard
definition era.
So the “tape look” isn’t everything, but the same limitations apply.
So what sort of detail can we get out of something
shot on film?
Well, this depends on a whole bunch of factors, mainly to do with film stock.
Just as consumers used to be able to pick between their Kodak Golds, their Fujichromes,
and their dollar store brands,
different qualities of film stocks were offered for use in motion pictures.
Some were great.
Others not so much.
Another thing about film that sorta hangs
around even to this day is that faster films,
meaning those more sensitive to light, tended
to have physically larger film grains.
So films used in low-light scenes would have
a grainier image, just like your camera today
looks like crap when you turn the ISO up to
12,800.
Which, by the way, is literally the same thing as the film speed you see on a film cartridge.
Ain’t that some fun history for ya?
But if you want to come up with an average,
you could argue that many 35mm motion picture
films have an image roughly equivalent to
today’s 4K.
But, see, it gets tricky to even say that.
Because the grain is random,
if you want to digitize film and capture all
its quality, you need to basically just increase
the resolution until it doesn’t seem to
make a difference anymore.
And at that point,
you can say “well the resolution is roughly this”.
But I think it’s safe to say that
only the WORST film stocks would be incapable
of producing an image that today we’d call
“Full HD” with 35mm film.
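For what it’s worth, here’s the kind of back-of-the-envelope math behind claims like that. The line-pairs-per-millimeter figures below are purely illustrative guesses, not measurements of any particular stock, and the frame width is roughly that of an Academy-aperture 35mm frame.

```python
# Back-of-the-envelope only: converting an assumed film resolving power into
# an equivalent horizontal pixel count. The lp/mm figures are illustrative
# assumptions, not measured values for any particular stock.
frame_width_mm = 22.0                 # roughly an Academy-aperture 35mm frame
assumed_lp_per_mm = [40, 80, 120]     # pessimistic / middling / optimistic guesses

for lp in assumed_lp_per_mm:
    # Nyquist: you need at least 2 pixels to represent one line pair.
    pixels_wide = 2 * lp * frame_width_mm
    print(f"{lp:3d} lp/mm  ->  ~{pixels_wide:,.0f} pixels across the frame")
# 40 lp/mm lands near 1080p-class width, 80 lp/mm near 4K, 120 lp/mm beyond it.
```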
If I were in charge, I’d say all film transfers should
be done in 4K so that the best of the best
can be seen to their full potential.
And then of course, there’s other film formats
like 16mm, which would probably rarely benefit
from anything over 720p.
Or 70mm, which you should probably scan in 8K.
Or IMAX, where you should just give up trying.
Film can hold
incredible detail, and while that’s certainly
not always the case, I’d argue it’s better
to go a little overkill and capture more data
than you need than to have to keep scanning
the print every decade or so.
That’s the thing. Old film won’t change
unless, you know, it degrades really badly.
But our scanning techniques have, and will
continue to do so.
One day we’ll be able to get all the quality that we possibly can out of it.
Perhaps we already have.
But electronic signals on tape for television are just that.
They’re built to draw 480 lines on a CRT, to a standard that was set in the 1940’s.
And so long as that’s the standard we’re working in, that’s what we’re stuck with,
and that’s as good as it’s gonna get.
But of course, here in the 21st century, we
have the luxury of digital video.
Which is always perfect.
[camera beeps]
[laughter]
♫ analogly smooth jazz ♫
Hey, so while discussing this topic on Twitter,
I realized I should have added a little something
to the script, which… well I forgot to do
so here’s me with a helpful post script!
Huh. A literal PS.
Go figure.
Anyway this
whole “lots of TV was shot on film” thing
is the reason why oftentimes a Blu-ray re-release
of something that was made for television
actually makes sense.
If you thought there was no point to
something like that, well you might be wrong.
So long as the show was mastered to film before
being transferred to tape, and those negatives
(or prints) are somewhere in storage, a true
high definition remaster is possible. In the
case of Star Trek Voyager, and shows like
it, in theory there could be an HD re-release,
however it would require duplicating essentially
all of the post-production work, so it’s
a fair bet that won’t happen.
Plus, we don’t
even know if those negatives were kept.
...something about giving your cardiac muscle
as a Christmas gift and being distraught upon
its subseq… nope.
Already messed it up!
*sigh* why did I keep going?
There was a…
I made a mistake.
This was the only way… eugh.
This was the
only way we… stop it!
They worked. All you needed to do… eugh
blebi buh
Those four hun…
Those four hundred eighty
line…
those four hundred eighty lines….
Four hundred eighty.
Four hundred eighty.
The best videotape… yeah no no no.
The emPHAsis
is all wrong.
Join us next week as the adventures of Freddy the Feces continue!
Actually, I take that back.
It's Freddie the Feces
Like it always was.
