FILM HISTORY

The history of film began in the 1890s, with the invention of the first motion-picture cameras and the establishment of the first film production companies and cinemas. The films of the 1890s were under a minute long, and until 1927 motion pictures were produced without sound. The first eleven years of motion pictures show the cinema moving from a novelty to an established large-scale entertainment industry. Films grew to several minutes in length and came to consist of several shots. The first rotating camera for taking panning shots was built in 1897, the same year the first film studios were built. Special effects were introduced, and film continuity, involving action moving from one sequence into another, began to be used.

In 1900, continuity of action across successive shots was achieved and the close-up shot was introduced. Most films of this period were what came to be called “chase films”. The first use of animation in movies was in 1899. The first feature-length multi-reel film was a 1906 Australian production. The first successful permanent theatre showing only films was “The Nickelodeon” in Pittsburgh in 1905. By about 1910, actors began to receive screen credit for their roles, opening the way to the creation of film stars. Regular newsreels were exhibited from 1910 and soon became a popular way of finding out the news. Overall, from about 1910, American films had the largest share of the market in all European countries except France.

New film techniques introduced in this period included the use of artificial lighting, fire effects, and low-key lighting (i.e. lighting in which most of the frame is dark) for enhanced atmosphere during sinister scenes. As films grew longer, specialist writers were employed to simplify more complex stories derived from novels or plays into a form that could be contained on one reel. Genres began to be used as categories; the main division was into comedy and drama, but these categories were further subdivided. The years of the First World War were a complex transitional period for the film industry. The exhibition of films changed from short one-reel programmes to feature films. Exhibition venues became larger and began charging higher prices. By 1914, continuity cinema was the established mode of commercial cinema. One of the advanced continuity techniques involved an accurate and smooth transition from one shot to another.

D. W. Griffith had the highest standing amongst American directors in the industry, because of the dramatic excitement he conveyed to the audience through his films. The American industry, or “Hollywood”, as it was becoming known after its new geographical center in California, gained the position it has held, more or less, ever since: film factory for the world, exporting its product to most countries on earth. By the 1920s, the United States had reached what is still its era of greatest-ever output, producing an average of 800 feature films annually,[1] or 82% of the global total (Eyman, 1997). In late 1927, Warners released The Jazz Singer, which featured the first synchronized dialogue (and singing) in a feature film. By the end of 1929, Hollywood was almost all-talkie, with several competing sound systems (soon to be standardized). Sound saved the Hollywood studio system in the face of the Great Depression (Parkinson, 1995). Thus began what is now often called “The Golden Age of Hollywood”, which refers roughly to the period from the introduction of sound until the late 1940s. The American cinema reached its peak of efficiently manufactured glamour and global appeal during this period. The top actors of the era are now thought of as the classic film stars, such as Clark Gable, Katharine Hepburn, Humphrey Bogart, Greta Garbo, and the greatest box office draw of the 1930s, child performer Shirley Temple.

The desire for wartime propaganda created a renaissance in the film industry in Britain, with realistic war dramas. The onset of US involvement in World War II also brought a proliferation of films serving as both patriotism and propaganda. The House Un-American Activities Committee investigated Hollywood in the early 1950s. During the immediate post-war years, the cinema industry was also threatened by television, and the increasing popularity of that medium meant that some film theatres went bankrupt and closed. Following the end of World War II, the 1950s marked a ‘Golden Age’ for non-English world cinema. During the 1960s, the studio system in Hollywood declined, because many films were now being made on location in other countries, or using studio facilities abroad. The New Hollywood was the period following the decline of the studio system during the 1950s and 1960s and the end of the Production Code, which was replaced in 1968 by the MPAA film rating system. During the 1970s, filmmakers increasingly depicted explicit sexual content and showed gunfight and battle scenes that included graphic images of bloody deaths.

During the 1980s, audiences increasingly began watching films on their home VCRs. In the early part of that decade, the film studios tried legal action to ban home ownership of VCRs as a violation of copyright, but this proved unsuccessful. Eventually, the sale and rental of films on home video became a significant “second venue” for the exhibition of films, and an additional source of revenue for the film industries. The Lucas–Spielberg combine would dominate “Hollywood” cinema for much of the 1980s and lead to much imitation. The early 1990s saw the development of a commercially successful independent cinema in the United States.