
Blogpost I

 



The release of CGI-filled films like Avatar (2009) and the recent digital resurrection of actor Peter Cushing (1913-1994) - who 'appeared' posthumously in the Star Wars adventure Rogue One (2016) - lend some weight to the ongoing discussion that critiques the 'overkill' of CGI (Computer Generated Imagery) in today's movies. Of course, one might question the ethics of bringing an actor back to life in this manner, without their consent. And one could also point out the somewhat unsettling uncanny valley effect of these 3D generated images, particularly as the technology keeps getting more advanced and the characters become almost too perfect.
But are these CGI movies fundamentally different from conventionally produced films? First, we will explore a short history of special effects to establish their omnipresence in film history. After that, we will examine CGI in terms of the work of digital media theorist Lev Manovich.


Above: Peter Cushing in Star Wars: Episode IV - A New Hope (1977)
Below: a 3D generated image of Peter Cushing in Rogue One (2016)

Special effects in cinema have been in use almost as long as the medium itself. One of the first special effects (SFX) in film history was invented by accident. In his memoirs, pioneering French filmmaker Georges Méliès (1861-1938) recalls filming a street scene in 1896 when his camera jammed for a few seconds.[1] The loss of a few consecutive frames on the reel gave the viewer the illusion of a transformation or disappearance. Méliès immediately noticed the potential of this switch-effect, and his passion for creating illusions within cinema was born.


Pictured: Georges Méliès showing us his love for special effects in an early twentieth-century film reel


Another form of SFX is the use of stop motion. If you take several photographs of an object in slightly different positions and quickly display those photos in consecutive order, the object will appear to move. This is an effective way of showing monsters that are hard to find in real life, like the gigantic gorilla in King Kong (1933) or the dinosaurs in The Lost World (1925).


Image from King Kong (1933) that demonstrates the stop motion effect

As technology progressed, other filmmakers reveled in the joys of creating special effects in movies - like George Lucas, who created the world of Star Wars: Episode IV - A New Hope (1977). Because the technology he needed wasn't available yet, Lucas started his own effects company, Industrial Light & Magic (ILM). Miniatures were used to create the illusion of new worlds in outer space. Viewing backstage photographs might result in disillusionment, because the production appears quite makeshift and low-budget, but when you see the final result (below) you'll note that it eventually paid off.



With the advent of computers in the 1980s, new opportunities arose for ambitious filmmakers. Films like Tron (1982), WarGames (1983) and The Terminator (1984) showed the relatively inexperienced public the possibilities of filmmaking in this computerised age. James Cameron proved to be another film pioneer when his film Avatar was hailed as 'the future of movies' for its ambitious combination of 3D filming and CGI. He used human-based performance capture, which had first been used to visualise the character of Gollum in the Lord of the Rings trilogy.


Performance capture of actor Andy Serkis in the Planet of the Apes films for a lifelike result


Digital media theorist Lev Manovich has broached the subject of CGI in terms of its goals: photorealism (more on that later) and a natural interface in which the computer becomes invisible, assimilated by its public. In Manovich's opinion, CGI generally looks too clean, pixellated and video game-like, with unnatural human movements. We could argue that technology has evolved immensely since Manovich wrote his book in 2001, but there is more to his theory.[2]
The thing is that the public generally measures the success of special effects by how natural they feel to the viewer - the extent to which the CGI blends in. The problem with that is that we are not actually looking for 'natural' when we watch a film; we are (subconsciously) looking for moving images that resemble those we already know. Cinema is not always a portrayal of reality; it has its own grammar, technological flaws and (deliberate and unintended) conventions that we have internalised over time. Not just regarding editing and sound, but also when it comes to visual images.[3] Envision, for example, a movie scene in which someone is looking through binoculars. In film, the view is in many cases shown as a sideways figure-eight shape with a black frame. In reality, our brain turns what we see through binoculars into a single round shape with blurry edges. Yet after seeing this sideways figure-eight so many times in film, we have been conditioned to recognise it as the more 'natural' point of view - an illusion of realism.


Still from the film Moonrise Kingdom (2012) that shows us how we see binoculars in film


This is also related to the technical shortcomings of cinema as a medium. The public is looking for photorealism (as captured by the camera) instead of realism. Fast-moving objects, for example, appear blurry on film - the so-called 'motion blur'. As a result of regularly seeing this in film, the public has grown to expect this 'imperfection'. The animated short film The Adventures of André and Wally B. (1984) was the first CGI film to simulate this effect. In this case the photorealism of the older medium is favoured over the synthetic realism that CGI can provide.[4]


Creators of computerised effects should also be wary that the effects don't end up looking too sharp, detailed or perfect - in which case they would contrast too heavily with the grainier, more imperfect film background. This means that creators of computerised effects in film have to reduce the quality or adjust the CGI to fit our expectations. This also applies in a literal sense: it has to fit in the rectangular frame that we've grown accustomed to. Furthermore, it means a limitation of options, because CGI offers possibilities that film simply cannot (or will not) absorb. Think, for example, of immersivity - the feeling of being 'present' in a digital world, as in Virtual Reality. As we see in video games, it potentially offers the viewer a sense of agency and interactivity within this synthetic realism. Yet in film, CGI technology is still forced to rely on the restrictive old codes of cinematic reality.[5]
Cultural sociologists Richard Howells and Joaquim Negreiros argue that the most important factor is the text that a medium provides - the message that the film conveys. Yet neither the delivery system, nor the medium, text/message or grammar of film intrinsically changes when filmmakers employ special effects.[6] Of course, there is an obvious difference when it comes to the material of CGI effects compared to more traditional film methods: CGI is made up of pixels and is based in mathematics. But this is a technological difference that doesn't change the message/text of the film. Yet the potential for immersiveness and the accompanying interactivity could elevate CGI to a new level.[7]


Furthermore, the public might not always be aware of the use of CGI in movies these days. One might reasonably expect CGI in outwardly fantastical films, but it also appears in films where you wouldn't necessarily expect this level of computer editing, such as The Wolf of Wall Street (2013). In that case, CGI is mainly used for safety reasons and as a means of saving money; the same shots could easily have been achieved without it. Its use of special effects does not make it a radically different film, since it lacks an interactive aspect, doesn't change the message/text of the film and conforms to older cinematic codes.

In conclusion, special effects have been employed by filmmakers since the advent of cinema. The objective - to create a world that looks as realistic as possible, fantasy or not - doesn't differ much from that of the early filmmakers of the 1890s. The delivery system, medium and message/text remain relatively similar as well.
Yet CGI filmmakers do not strive for realism when they employ their synthetic realism. They have to strive for the photorealism that the public is used to; it has to look as if it has been filmed. And if CGI creators have to conform to the established conditions and shortcomings of cinema and its photorealism, one could argue that CGI films are not fundamentally different from more conventional films. The interactive aspects and immersive character of computer animation are what make it most radical, and these simply aren't incorporated into film today. As long as CGI has to answer to the classic conventions of the cinematic medium, it will inevitably remain just a part of it.


360 degree film: the future?

BONUS: In terms of interactivity in film, Facebook offers a glimpse of what the future might hold with its mobile 360-degree video option. These are interactive films, viewable on your phone, in which some kind of movement takes place (e.g. a plane flying, a ski lift rolling, a person walking). As a viewer, you have interactive control over the angle you take: you are given a sense of agency and a 360-degree view of the situation, simply by using your fingers or by moving your phone around.







[1] Miriam Rosen, "Méliès, Georges", in: John Wakeman, World Film Directors: Volume I, 1890–1945 (New York 1987) 747–765 
[2] Lev Manovich, The Language of New Media (Cambridge 2001) 177-211
[3] Ibidem, 176-190
[4] Ibidem, 192, 199
[5] Ibidem, 180, 200
[6] Richard Howells and Joaquim Negreiros, Visual Culture (Cambridge 2012) 263-294
[7] Manovich, The Language of New Media, 180-182

GO TO BLOG POST II


