To understand the production challenges of "digital video" including the live capture of digital video footage, the recording of video information from "time-based media" such as animation software, and video montage processes such as "compositing."
The term "digital video" refers to moving pictures that are stored on a computer hard disk for editing or playback. Digital video files vary in terms of their image size (measured in horizontal and vertical pixels) and their frame rate (measured in frames per second).
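Image size and frame rate together determine how much data a video stream contains. As a rough sketch (assuming uncompressed 8-bit RGB, i.e. 3 bytes per pixel; real digital video formats use chroma subsampling and compression, so actual rates are far lower):

```python
def data_rate_bytes_per_sec(width, height, fps, bytes_per_pixel=3):
    """Bytes per second for uncompressed frames of width x height pixels.

    Assumes 8-bit RGB (3 bytes per pixel); purely illustrative.
    """
    return width * height * bytes_per_pixel * fps

# A 640x480 picture at 30 frames per second:
rate = data_rate_bytes_per_sec(640, 480, 30)
print(rate)                  # 27648000 bytes, about 27.6 MB every second
```

Even this modest standard-definition picture would need tens of megabytes per second if stored raw, which is why compression is essential to practical digital video.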
For sixty years, from the introduction of television at the 1939 World's Fair to the shift to digital standards in the late 1990s, video and broadcast were analog: continuous, optically based signals. Analog standards use a great deal of bandwidth to broadcast a single channel, so the number of channels available in the analog system is relatively limited. In addition, all analog pictures use the 4x3 format: the picture is only one third wider than it is high. In 1998 the FCC (Federal Communications Commission) approved a new digital broadcast standard that allows many more channels and eighteen different broadcast formats (e.g., 480i, 480p, 1080i).
Of these eighteen digital video formats, eight are considered High Definition, or HD. The rest are considered standard definition (SD), which offers lower picture quality. All HD formats are wide-screen, meaning a 16x9 width-to-height ratio. Some of the new SD formats are also in the "wide" 16x9 format; the rest remain 4x3.
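The aspect ratio fixes the relationship between a frame's width and height. A small worked sketch (the 1080 and 480 heights match the format names above; the exact pixel dimensions of any real broadcast format are an assumption here):

```python
from fractions import Fraction

def width_for_height(height, aspect):
    """Frame width, in pixels, for a given height and (width, height) aspect ratio."""
    return int(height * Fraction(*aspect))

print(width_for_height(1080, (16, 9)))  # 1920: wide-screen HD shape
print(width_for_height(480, (4, 3)))    # 640: traditional 4x3 SD shape
```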
Televisions and video monitors display pictures as a series of still images called frames. If consecutive frames are shown fast enough, objects in the series appear to be in motion. The frame rate is how fast the frames are displayed. Televisions use a frame rate of 30 frames per second.(1) By comparison, traditional film cameras generally capture information at 24 frames per second.
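Another way to see the difference between the two rates is how long each still frame stays on screen, a simple inversion of the frame rate:

```python
def frame_duration_ms(fps):
    """How long each still frame is displayed, in milliseconds."""
    return 1000 / fps

print(round(frame_duration_ms(30), 1))  # 33.3 ms per frame (television)
print(round(frame_duration_ms(24), 1))  # 41.7 ms per frame (film)
```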
A recent and profound innovation in special effects has been the development of computer generated imagery, or CGI, which has changed nearly every aspect of motion picture special effects. Digital compositing allows far more control and creative freedom than optical compositing, and does not degrade the image the way analog (optical) processes do. Digital imagery has enabled technicians to create detailed models, matte "paintings," and even fully realized characters with the malleability of computer software. In the new millennium, many soap operas began to incorporate CGI into storylines, for example showing a dead character or someone hanging off a bridge. The most spectacular use of CGI has been the creation of photographically realistic images of fantasy creations. Images could be created in a computer using the techniques of animated cartoons or model animation. In 1993, stop-motion animators working on the realistic dinosaurs of Steven Spielberg's Jurassic Park were retrained in the use of computer input devices. By 1995, films such as Toy Story underscored that the distinction between live-action films and animated films was no longer clear. Other landmark examples include a moving stained-glass window in Young Sherlock Holmes, a tentacle of water in The Abyss, the remastered Yoda from Attack of the Clones, a 'liquid metal' villain in Terminator 2: Judgment Day, and hordes of armies of fantastic creatures in The Lord of the Rings trilogy.(3)
Compositing is a broad term for creating complex visual effects made up of several different visual sources, e.g., 2D video, 3D animation, still digital photography, and digital typography. There are several compositing methods, such as bluescreen and rotoscoping. Well-known professional compositing applications include Shake and Discreet Combustion. On a smaller scale, Apple Motion or Adobe After Effects is used; these are much cheaper, easier for a beginner to use, and can run on a powerful home computer. (2)
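At its core, compositing blends a foreground source onto a background pixel by pixel, weighted by a transparency (alpha) value. This is a minimal sketch of that blend for a single RGB pixel, not the actual method used by any of the applications named above:

```python
def over(fg, bg, alpha):
    """Blend a foreground RGB pixel onto a background RGB pixel.

    alpha is the foreground's opacity: 0.0 is fully transparent,
    1.0 is fully opaque. Each channel is a weighted average.
    """
    return tuple(round(alpha * f + (1 - alpha) * b) for f, b in zip(fg, bg))

# A half-transparent red pixel composited over a blue background
# yields purple:
print(over((255, 0, 0), (0, 0, 255), 0.5))  # (128, 0, 128)
```

In a bluescreen workflow, the alpha value for each pixel comes from a key pulled against the screen color; in rotoscoping, it comes from hand-drawn mattes traced frame by frame.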