This Restored Lunar Landing Footage Will Steal Your Breath

Here's a legitimate question about the moon landing: was it filmed by a potato?

Fair enough, the cameras were about 384,400 km from Earth and it was 1969...

But even Ghostbusters got a remake. 

Ghostbusters!

Anyway back to the moon landings.
Finally, someone has turned the grainy, low-frame-rate footage into a smooth, watchable video experience.

Even after NASA failed to revamp the fifty-year-old footage.

And that someone isn't someone at all...
It's a computer fitted with a form of artificial intelligence (AI).

Photo and film restoration specialist DutchSteamMachine has created footage so markedly better than the originals that it almost seems surreal.

"I really wanted to provide an experience on this old footage that has not been seen before."

That has been achieved and then some.

Have a look at Charlie Duke and John Young driving a lunar rover on the Apollo 16 mission. You can tell it's been lifted from the original 12 frames per second to 60 just by how silky smooth it is.

PS: That's five times the frame rate.


Crazy, right?

But wait, there's more...

Here's enhanced footage of the Apollo 15 lunar landing site near Hadley Rille. 


And here's good old Neil A taking those historic first steps on the moon in crispy detail. If you've seen the original, and its lack of clarity, you'll appreciate how special it is.


But how was it actually done?

The AI used is Depth-Aware Video Frame Interpolation (DAIN), and it's not some fancy, expensive filmmaker's program.

It's a completely free and open-source option for any filmmaker.
You do need a high-end GPU and the cooling fans to go with it, though, and just five minutes of footage can take anywhere from six to 20 hours to process.

The AI takes two consecutive frames and seamlessly generates new frames in between, for a much gentler experience on the eyes and one we're more accustomed to seeing in daily life.

It also removes a lot of the jumpiness and blur, which is really just our brain's attempt to smooth over the poor imagery our eyes are feeding it.
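To picture what "generating frames in between" means, here's a minimal sketch. DAIN actually estimates depth and motion to warp pixels into place; this toy version just cross-fades pixel values linearly, which shows the idea of inserting synthetic frames between two real ones, minus the motion awareness. Frames here are plain 2D lists of greyscale values, and the function name is ours, not DAIN's.

```python
def interpolate_frames(frame_a, frame_b, n_new):
    """Generate n_new synthetic frames between two real frames.

    A naive linear cross-fade: each new frame is a weighted blend of
    the two real frames. Real interpolators like DAIN instead track
    object motion, so their 'fake' frames don't ghost or double up.
    """
    synthetic = []
    for i in range(1, n_new + 1):
        t = i / (n_new + 1)  # blend weight: 0 = frame_a, 1 = frame_b
        frame = [
            [round((1 - t) * a + t * b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)
        ]
        synthetic.append(frame)
    return synthetic

# Two tiny 2x2 "frames": all-black and mid-grey
black = [[0, 0], [0, 0]]
grey = [[100, 100], [100, 100]]
fakes = interpolate_frames(black, grey, 4)
print(len(fakes))      # 4 synthetic frames
print(fakes[1][0][0])  # 40: the second fake frame is 40% of the way
```

With four fake frames inserted between every real pair, 12 fps footage plays back as 60 fps.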


"People have used the same AI programs to bring old film recordings from the 1900s back to life, in high definition and colour, this technique seemed like a great thing to apply to much newer footage." the YouTuber said. 

To get the footage up to par with what we're seeing here, DutchSteamMachine explained:

"First I set out to find the highest quality source videos, which I thankfully found as high-bitrate 720p video files. So the quality problem was solved. It is important to start with the highest possible source and edit from there. However, most of the sequences were still very choppy. This is because to spare film and record for long periods of time, most of the rover footage was shot at 12, 6 or even 1 frame(s) per second. While people have previously tried to apply stabilisation and/or other types of frame-blending to ease this effect, I have never really been satisfied with it.

I split the source file up into individual PNG frames, input them to the AI together with the input framerate (1, 6, 12 or 24) and the desired output framerate by rate of interpolation (2x, 4x, 8x). The AI starts using my GPU and looks at two real, consecutive frames. Using algorithms, it analyses the movements of objects in the two frames and renders entirely new ones. With an interpolation rate of, for example, 5x, it is able to render 5 'fake' frames from just 2 real frames.

If footage was recorded at 12fps and the interpolation rate is set to 5x, the final framerate will be 60, meaning that with just 12 real frames it made 48 'fake' frames. Both are then exported back to a video and played back at 60fps with both the real and fake frames.

Finally, I apply colour correction, as often the source files have a blue or orange tint to them. I synchronize the footage with audio and if possible, also television and photos taken at the same time. Sometimes two 16mm cameras were running at the same time, so I can play those back next to each other."
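The arithmetic in that quote works out as follows (a quick sketch; the function name is ours, not DutchSteamMachine's):

```python
def interpolation_stats(real_fps, factor, seconds):
    """Output framerate and real/fake frame counts for a clip.

    Mirrors the numbers in the quote: a 12 fps source at 5x
    interpolation plays back at 60 fps, so each second holds
    12 real frames plus 48 AI-generated 'fake' ones.
    """
    out_fps = real_fps * factor
    real_frames = real_fps * seconds
    fake_frames = (out_fps - real_fps) * seconds
    return out_fps, real_frames, fake_frames

print(interpolation_stats(12, 5, 1))  # (60, 12, 48)
```

The same maths shows why the 1 fps sequences needed such aggressive interpolation: nearly every frame in the final video is synthetic.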

Know someone who lived through the lunar landing?

Have a kid who'd be more wowed with this footage than the original?

Or just have a mate that thinks this is great?

Share and tag them to spread ARSE as we thrust Australia into the deep unknown...

#Space_Aus