JFK Zapruder Hoax - David Mantik Part 1

Article Updated on May 9, 2020.

DO YOU SEE SOMETHING WRONG WITH THE ZAPRUDER FILM?

Was The Apparatus for Production of Light Effects in Composite Photography Used To Create The Zapruder Film?

“Front projection was invented by Will Jenkins. For this he holds patent #2727427, issued on December 20, 1955, for an "Apparatus for Production of Light Effects in Composite Photography" and patent #2727429, issued the same day, for an "Apparatus for Production of Composite Photographic Effects."

It was first experimented with in 1949, shortly after the invention of Scotchlite, and had appeared in feature films by 1963, when the Japanese film Matango used it extensively for its yacht scenes. Another early appearance was in 1966, during the filming of 2001: A Space Odyssey. The actors in ape suits were filmed on a stage at Elstree Studios and combined with footage of Africa (the effect is revealed in the leopard's glowing eyes reflecting back the light). Dennis Muren used a very similar solution for his 1967 debut film Equinox, although Muren's technique didn't employ Scotchlite. Two British films released in 1969, On Her Majesty's Secret Service and The Assassination Bureau, used the technique, as did the 1968 film Where Eagles Dare.” – Wikipedia


The Apparatus for Production of Composite Photographic Effects

“This invention relates to apparatus for the production of composite photographic effects. More particularly it relates to apparatus whereby realistic scenes may be photographically recorded in which the building, handling, and maintenance of stage scenery may be largely dispensed with. 

The handling of scenery has always been a major problem in the staging of any kind of show. The advent of television made the problem even worse since live scenes cannot be repeated and corrected. Furthermore split second timing is often necessary in staging a quarter hour or half hour television show, hence the cost of handling scenery can be enormous. The advertiser pays the cost of the show, but the high cost of television production has made the medium too expensive for many advertisers. Reduction or elimination of constructive scenery has up to now lowered the quality of the show. Thus the progress of television broadcasting has been seriously impeded.

It is an object of this invention to provide apparatus for the simplified, cheap and efficient, production of still pictures, and of motion picture or television performances. It is a further object to provide apparatus whereby unusual effects may be readily and cheaply produced on motion picture or television sets. 

These objects are obtained in a surprisingly simple and efficient manner. The apparatus of the invention includes at least one back-drop having a surface covered with a reflex reflecting surface. The staging of the show takes place on the acting-set in front of this backdrop. One or more cameras for recording the performance are located at a convenient place in front of the backdrop so that the lens of the camera takes in the action on the acting-set. Two or more sheets of plane transparent material are positioned at spaced intervals in front of the camera lens or lenses and between the camera and backdrop. Two projectors or more are so located that their light first strikes the plane transparent sheets; the light from each projector first strikes a single plane transparent sheet. A portion of the light from each projector is thus reflected to the reflex reflecting surface that serves as a backdrop. The relative positions of the camera or cameras, the projectors, and the plane transparent sheets are so adjusted that the lens of the camera or cameras receive both the reflected light from the projectors and the light from the scene being enacted.” – Google Patents

JFK Zapruder Hoax - David Mantik Part 1


Special and Visual Effects History

"The multiplane camera is a motion-picture camera used in the traditional animation process that moves a number of pieces of artwork past the camera at various speeds and at various distances from one another. This creates a sense of parallax or depth.

Various parts of the artwork layers are left transparent to allow other layers to be seen behind them. The movements are calculated and photographed frame by frame, with the result being an illusion of depth by having several layers of artwork moving at different speeds: the further away from the camera, the slower the speed. The multiplane effect is sometimes referred to as a parallax process.

An interesting variation is to have the background and foreground move in opposite directions. This creates an effect of rotation. An early example is the scene in Walt Disney's Snow White and the Seven Dwarfs where the Evil Queen drinks her potion, and the surroundings appear to spin around her." • Wikipedia
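The parallax rule the quote describes, farther layers moving more slowly, falls out of simple perspective projection: on-screen shift is inversely proportional to a layer's distance from the camera. A minimal sketch of that relationship (the focal length and layer distances are hypothetical numbers for illustration, not Disney's actual rig geometry):

```python
# Multiplane parallax sketch: for a sideways camera move, a layer's
# on-screen shift scales as focal_length / layer_distance, so distant
# layers crawl while near layers sweep past. Values are illustrative.

def apparent_shift(camera_move, layer_distance, focal_length=50.0):
    """On-screen shift for a layer at the given distance (same units as camera_move)."""
    return camera_move * focal_length / layer_distance

# Three artwork planes at increasing distances from the camera.
for name, distance in [("foreground", 100.0), ("midground", 400.0), ("background", 1600.0)]:
    print(name, apparent_shift(10.0, distance))
```

Halving a layer's distance doubles its apparent speed, which is why the stacked, independently movable planes read as depth even though every plane is flat artwork.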

Walt Disney's MultiPlane Camera (Filmed Feb. 13, 1957)

Slit-scan Photography

“Originally used in static photography to achieve blurriness or deformity, the slit-scan technique was perfected for the creation of spectacular animations. It enables the cinematographer to create a psychedelic flow of colors. Though this type of effect is now often created through computer animation, slit-scan is a mechanical technique.

John Whitney developed the technique for the opening credits of the film Vertigo. He sent some test sequences on film to Stanley Kubrick. The technique was adapted by Douglas Trumbull for 2001: A Space Odyssey in 1968 for the "star gate" sequence. It required a custom-built machine.

This type of effect was revived in other productions, for films and television alike. For instance, slit-scan was used by Bernard Lodge to create the Doctor Who title sequences for Jon Pertwee and Tom Baker used between December 1973 and January 1980. Slit-scan was also used in Star Trek: The Next Generation (1987–1994) to create the "stretching" of the starship Enterprise-D when it engaged warp drive. Due to the expense and difficulty of this technique, the same three warp-entry shots, all created by Industrial Light and Magic for the series pilot, were reused throughout the series virtually every time the ship went into warp. Slit-scan photography was also used on Interstellar for scenes in the tesseract at the end of the movie.” • Wikipedia

The History and Science of the Slit Scan Effect used in Stanley Kubrick's 2001: A Space Odyssey

Douglas Trumbull: Effects Pioneer

“Douglas Trumbull, the industry pioneer behind the special effects of 2001: A Space Odyssey, Close Encounters of the Third Kind, and Blade Runner, joins post-secondary students and faculty to discuss his remarkable career in visual effects and his own directorial projects. This Higher Learning event was held on December 9, 2010 at TIFF Bell Lightbox.” • TIFF Originals

Douglas Trumbull's early work was at Graphic Films in Los Angeles. The small animation and graphic arts studio produced a film called To the Moon and Beyond about spaceflight for the 1964 New York World's Fair. Trumbull, the son of a mechanical engineer and an artist, worked at Graphic Films as an illustrator and airbrush artist.

The spaceflight film caught the attention of director Stanley Kubrick, who was beginning work on the project that would become 2001: A Space Odyssey. Kubrick hired director Con Pederson from Graphic Films, and the company was to work on visual effects for the film. When Kubrick decided to move all production to England, he cancelled the contract with Graphic Films. Trumbull wanted to keep working on the film as he had already done considerable pre-production work, so he cold-called Kubrick after obtaining the director's home phone number from Pederson. Kubrick hired Trumbull and flew him to London for the production of 2001: A Space Odyssey.

Trumbull's first task was to create the dozens of animations seen in the data display screens in the Aries moon shuttle and the Discovery. They looked like computer graphics, but they were created by photographing and animating reproductions of charts and graphs from technical publications. Trumbull initially created the shots using a number of Rube Goldberg-like contraptions he built with gears and motors ordered from a scientific equipment supply house.

Kubrick gave the young effects technician creative freedom and encouragement: "He would say ... 'What do you need to do it?' and I would have complete carte blanche, which was wild as a young guy", Trumbull recalled. "I was 23–24 when I started the movie, and was 25 by the time I was doing the Star Gate. He would say, 'What do you need?' and I'd say, 'Well, I need to go into town and buy some weird bearings and some stuff' and he would send me off to town in his Bentley, with a driver, into London. It was great!" • Wikipedia

DOUGLAS TRUMBULL | Master Class | Higher Learning

You Will Believe A Man Can Fly

“Front projection was chosen as the main method for shooting Christopher Reeve's flying scenes in Superman. However, they still faced the problem of having Reeve actually fly in front of the camera. Effects wizard Zoran Perisic patented a new refinement to front projection that involved placing a zoom lens on both the movie camera and the projector. These zoom lenses are synchronized to zoom in and out simultaneously in the same direction. As the projection lens zooms in, it projects a smaller image on the screen; the camera lens zooms in at the same time, and to the same degree, so that the projected image (the background plate) appears unchanged, as seen through the camera. However the subject placed in front of the front projection screen appears to have moved closer to the camera; thus Superman flies towards the camera. Perisic called this technique "Zoptic". The process was also used in two of the Superman sequels (but not used in the fourth movie due to budget constraints), Return to Oz, Radio Flyer, High Road to China, Deal of the Century, Megaforce, Thief of Baghdad, Greatest American Hero (TV), as well as Perisic's films as director, Sky Bandits (also known as Gunbus) and The Phoenix and the Magic Carpet.” • Wikipedia
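The Zoptic trick the quote describes can be checked with a toy calculation: the projector's zoom shrinks the background image on the screen by the same factor the camera's zoom magnifies everything, so the recorded background size stays constant while the real subject, which only the camera zoom affects, grows in frame. The sizes below are hypothetical illustration values:

```python
# Toy model of Perisic's synchronized-zoom ("Zoptic") idea: the
# background's recorded size cancels out of the two zooms, while the
# subject's recorded size scales with the camera zoom alone, so the
# subject appears to fly toward the camera. Numbers are illustrative.

def recorded_sizes(zoom, subject_size=1.0, background_size=10.0):
    projected_bg = background_size / zoom   # projector zooms in: smaller screen image
    recorded_bg = projected_bg * zoom       # camera zooms in by the same factor
    recorded_subject = subject_size * zoom  # subject is magnified by the camera only
    return recorded_subject, recorded_bg

for zoom in (1.0, 1.5, 2.0):
    subject, background = recorded_sizes(zoom)
    print(f"zoom {zoom}: subject {subject}, background {background}")
```

Because the background never changes size on film, the eye attributes the subject's growth to motion in depth rather than to a lens zoom.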

Zoran Perisic discussing the Zoptic process on Superman The Movie

Front Projection

“In contrast to rear projection, in front projection the background image is projected onto both the performer and a highly reflective background screen, with the result that the projected image is bounced off the screen and into the lens of a camera. This is achieved by having a screen made of a retroreflective material such as Scotchlite, a product of the 3M company that is also used to make screens for movie theaters. Such material is made from millions of glass beads affixed to the surface of the cloth. These glass beads reflect light back only in the direction from which it came, far more efficiently than any common surface.

The actor (or subject) performs in front of the reflective screen with a movie camera pointing straight at them. Just in front of the camera is a one-way mirror angled at 45 degrees. At 90 degrees to the camera is a projector which projects an image of the background onto the mirror which reflects the image onto the performer and the highly reflective screen; the image is too faint to appear on the actor but shows up clearly on the screen. In this way, the actor becomes their own matte. The combined image is transmitted through the mirror and recorded by the camera. The technique is shown and explained in the "making-of-documentary" of the 1972 sci-fi film Silent Running.

Front projection was invented by Philip V. Palmquist who, while working at 3M Corporation, received a patent on the technology and also won an Academy Award for the invention. It was first experimented with in 1949, shortly after the invention of Scotchlite, and had appeared in feature films by 1963, when the Japanese film Matango used it extensively for its yacht scenes. Another early appearance was in 1966, during the filming of 2001: A Space Odyssey. The actors in ape suits were filmed on a stage at Elstree Studios and combined with footage of Africa (the effect is revealed in the leopard's glowing eyes reflecting back the light). Dennis Muren used a very similar solution for his 1967 debut film Equinox, although Muren's technique didn't employ Scotchlite. Two British films released in 1969, On Her Majesty's Secret Service and The Assassination Bureau, used the technique, as did the 1968 film Where Eagles Dare.” • Wikipedia
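A rough way to see why the projected image "is too faint to appear on the actor but shows up clearly on the screen": a retroreflective material like Scotchlite returns light toward its source (and thus toward the co-axial camera) with a gain far higher than a diffuse surface. The gain figure below is a hypothetical illustration, not a 3M specification:

```python
# Toy illustration of the front-projection matte: the projector light
# falls on both the actor and the retroreflective screen, but the
# screen bounces it back toward the camera with far higher gain, so
# only the screen registers the image. Gain values are hypothetical.

DIFFUSE_GAIN = 1.0   # actor's skin and costume (roughly Lambertian)
RETRO_GAIN = 500.0   # Scotchlite-like screen (illustrative number)

def camera_reading(incident_light, gain):
    """Relative brightness seen by a camera co-axial with the projector."""
    return incident_light * gain

spill_on_actor = camera_reading(1.0, DIFFUSE_GAIN)
image_on_screen = camera_reading(1.0, RETRO_GAIN)

# Exposure is set for the bright screen, so the faint spill on the
# actor falls below the film's threshold: the actor is their own matte.
print(image_on_screen / spill_on_actor)
```

Setting exposure for the screen is what makes the actor's silhouette punch a clean hole in the background, with no separate matte pass needed.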


Special & Visual Effects Today: The Mandalorian Series

A clever wrap-around version of the rear projection effect merges with real-time, realistic computer-generated 3D graphics. I see no reason why this same technique couldn't be used with the front projection effect. I suppose if you have a large set with huge wrap-around LED walls, you do not need to bother with the more complicated front projection setup, as both techniques allow you to avoid post-production color correction to get the actors and props to visually match the background.

The Virtual Production of The Mandalorian, Season One

Forging new paths for filmmakers on "The Mandalorian"

“Fortunately, Jon Favreau is way ahead of the curve. His pioneering vision for filming The Mandalorian presented an opportunity to turn the conventional filmmaking paradigm on its head. When we first met with Jon, he was excited to bring more real-time interactivity and collaboration back into the production process. It was clear he was willing to experiment with new workflows and take risks to achieve that goal. Ultimately, these early talks evolved into a groundbreaking virtual production methodology: shooting the series on a stage surrounded by massive LED walls displaying dynamic digital sets, with the ability to react to and manipulate this digital content in real time during live production. Working together with ILM, we drew up plans for how the pieces would fit together. The result was an ambitious new system and a suite of technologies to be deployed at a scale that had never been attempted for the fast-paced nature of episodic television production.

By the time shooting began, Unreal Engine was running on four synchronized PCs to drive the pixels on the LED walls in real time. At the same time, three Unreal operators could simultaneously manipulate the virtual scene, lighting, and effects on the walls. The crew inside the LED volume was also able to control the scene remotely from an iPad, working side-by-side with the director and DP. This virtual production workflow was used to film more than half of The Mandalorian Season 1, enabling the filmmakers to eliminate location shoots, capture a significant amount of complex VFX shots with accurate lighting and reflections in-camera, and iterate on scenes together in real time while on set. The combination of Unreal Engine’s real-time capabilities and the immersive LED screens enabled a creative flexibility previously unimaginable.” • Unreal Engine