NASA has accumulated vast archives that no one is curating, since the agency lacks the funding for it. It therefore entrusts this work to private enthusiasts such as DutchSteamMachine, a project devoted to "modernizing" old newsreel footage. One of the project's latest works covers the landing of American astronauts on the Moon: the original 12 FPS video was interpolated up to 60 FPS.
Depth-Aware video frame INterpolation (DAIN) has become the main tool for processing old footage shot at low frame rates. It is a free, open-source, actively developed AI system. It takes two consecutive frames of the source video and analyzes them to determine which objects moved and where. The AI then generates intermediate frames that visualize that motion. Since the film was originally "jerky" due to its low frame rate, this saturation with new frames benefits it, smoothing out movement and improving the perceived clarity of the picture.
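The core idea of frame interpolation can be sketched in a few lines. This is a deliberately simplified illustration: it blends two frames linearly, whereas DAIN estimates optical flow and scene depth to warp pixels along their actual motion paths. The function name and frame sizes here are invented for the example.

```python
import numpy as np

def interpolate_frames(frame_a, frame_b, n_intermediate):
    """Generate n_intermediate frames between two consecutive frames.

    Naive linear blending for illustration only; DAIN instead uses
    depth-aware optical flow, but the overall interface is analogous:
    two real frames go in, several synthetic frames come out.
    """
    frames = []
    for i in range(1, n_intermediate + 1):
        t = i / (n_intermediate + 1)          # position between the frames
        blended = (1 - t) * frame_a.astype(np.float64) + t * frame_b.astype(np.float64)
        frames.append(blended.astype(frame_a.dtype))
    return frames

# Two tiny grayscale "frames": black and mid-gray
a = np.zeros((2, 2), dtype=np.uint8)
b = np.full((2, 2), 100, dtype=np.uint8)

# 4 synthetic frames per real pair is what turns 12 FPS into 60 FPS
mids = interpolate_frames(a, b, 4)
print(len(mids), mids[0][0, 0], mids[-1][0, 0])  # → 4 20 80
```

The gradual brightness ramp (20, 40, 60, 80) between the two real frames is exactly the kind of in-between motion a viewer perceives as smoothness.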
The hardest part is finding the right source material and choosing the right processing parameters. The AI can double the frame count, or multiply it by as much as eight, which makes it possible to derive 48 additional "virtual" frames from 12 real ones and fit them all into one second of video. This is how a 60 FPS image, impossible in the days of the Apollo 11 mission, is created while remaining indistinguishable from the historical footage.
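The frame arithmetic above is worth spelling out. The number of synthetic frames inserted between each consecutive pair (four, in this case) is an assumption matching the 12-to-60 FPS conversion described here:

```python
real_fps = 12          # original Apollo-era footage
new_per_gap = 4        # synthetic frames inserted between each consecutive pair

generated = real_fps * new_per_gap   # 48 "virtual" frames per second
total_fps = real_fps + generated     # 60 FPS in the final video

print(generated, total_fps)  # → 48 60
```

Each of the 12 real frames pairs with the next one (the last pairs with the first frame of the following second), giving 12 gaps per second and hence 12 × 4 = 48 generated frames.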
DutchSteamMachine's techniques are well suited to restoring any low-quality historical video, including archival recordings and century-old silent films. The catch is that the process demands a great deal of computing power and time: generating five minutes of video can take a whole day. But if it helps preserve and popularize historical footage, it is certainly worth it.