Showing posts with label Exposition (Editorial). Show all posts

15 December 2009

Exposition - The $500,000,000 Man

Have you ever wondered what half a billion dollars looks like? If recent (unofficial) estimates are to be believed, that's exactly what you'll see if you buy a ticket to Avatar. Throw fourteen-odd years of development into the mix, an unprecedented level of hype, and a promise to revolutionise cinema as we know it, and you have quite a lot invested in a three-hour experience. With the fifteen-dollar cost of admission seeming very paltry by comparison, I did my best to put any reservations aside and see whether the much-prophesied 'second coming of James Cameron' could bring about a cinematic equivalent of the rapture promised by that other individual whose initials he shares. Indeed, the risk of being branded a heretic seems just as likely to result from criticising one as the other, but criticism must be made nevertheless. And although it runs against my natural instincts to pre-empt an examination with points from the conclusion, I find myself given little option in this instance. In short, the reason I have not and shall never dedicate a review to Avatar is that it runs contrary to the professed purpose of this blog, which is to delve deeper and analyse the cultural subtext. Whatever it cost to add a third dimension to the film visually, the real expense has been the loss of its cerebral equivalent. As such, it is all I can do to offer a shallow reflection upon some of the foremost points that presented themselves to me personally.

The Third Dimension, Mk.II

Various assertions have been made to the effect that Avatar will usher in a new age of 3-D cinema, eliminating the pitfalls of earlier technology, providing a greater level of audience immersion, and thus affording an overall better experience. Whether in theory or in practice, none of this is true. To its credit, Avatar avoids the in-your-face projectile obsession that relegates the majority of 3-D films to gimmick status. At its best the technology is barely perceptible, but if the measure of its success is the extent to which you don't notice it, one obviously has to question the benefit. The presence of a few bewitchingly subtle instances where it really does evoke a sense of magic was not, however, enough to offset the number of times when I found myself struggling against it, which unfortunately seemed to occur in the scenes I was most interested in. Perhaps I am something of an anomaly in the manner in which I view films, but time and again I found that if I was not watching a specific area in the frame the illusion completely fell apart. For those who don't naturally focus on the object or depth of field chosen by the director, it almost feels as though you're being unfairly punished for not surrendering to his authorial vision, to the extent that I found myself resenting the implication that I was somehow wrong to be looking at this, rather than that. Again, perhaps I am something of an oddity, but even with perfect 20/20 vision I frequently encountered scenes where the depth of field actually inverted itself on me, causing annoyance rather than immersion (and a left eye that has remained painful since I woke up this morning). I can only assume that those who believe this to be the next evolution in cinema also maintain the superiority of pop-up books over the traditional kind.

The Best Things in Film Are Free

Adding to my frustration with the 3-D technology was the apprehension that, while I was grappling with it, I was missing out on some of the major points in the story. This frustration rapidly gave way to disappointment, however, when I realised that I wasn't missing out on anything at all. There is a plot, in the sense that it recounts a series of linear events, but unlike the visuals it never approaches even an illusion of depth. Essentially, those who have seen Fern Gully and Pocahontas have already seen Avatar, and no attempt is made to hide its derivative nature, with the protagonist even sharing initials with John Smith. Proclaiming this a bad story makes about as much sense as accusing flat-bread of being a bad loaf, but with so much time and money invested in other aspects of the production it is incredible that no effort seems to have been invested in forming an interesting narrative. This is all the more confounding when you consider that, with a proven writer/director at the reins, the story development is effectively free. Perhaps more disappointing is the number of potentially interesting story directions that are never even explored. At a point in time where human interaction is increasingly transacted through various electronic media – be it online games, social networking sites, or even your mobile phone to some extent – a film with the word 'avatar' as its very title should have a lot to say about the effect this has had on current society (bear in mind that what you're reading right now is also presented under the guise of an avatar). It could have dealt with the psychological implications of splitting your life between two distinct bodies/realities – whether this might alienate you from the idea of being synonymous with a body at all – but this is given only cursory and simplistic treatment.
If stripping down the potential subtexts in favour of a single, straightforward narrative was a calculated attempt to drive home the didactic element, then Avatar must be considered an outright failure. The heroes and villains are propelled so far into the rarefied poles of noble savage and exploitative conqueror as to lose any realistic credibility, while the story may even be interpreted as imparting a message that seems to contradict its own moralistic stance. Presenting the choice between an unpleasant reality and the prospect of escape into a fantasy world, Avatar seems to advocate the latter. At a time when increasing numbers of people seem inclined to withdraw from the difficulties of society altogether, preferring to immerse themselves in an idealised virtual world, the propriety of this message seems rather dubious.

Unbridled Imagination 

James Cameron had such a creative vision for Avatar that only recent technological advances would allow him to see it realised on the screen, if internet folklore is to be believed. Why then, I found myself pondering in the cinema, is everything so very, very familiar? Most of the plants are green and look just like terrestrial plants; evolution seems to have favoured six-limbed locomotion on this planet, but that still looks very much like a monkey, a rhino, a panther, a jackal, a horse, and a pterodactyl; the people are big and blue, but similar enough in sensory input that the human mind can easily interpret and control a body crafted in their image. Now, I understand the theory behind convergent evolution of this kind, but this is not a documentary, so does everything really have to be that intuitively familiar? Personally, I would be somewhat disappointed if I were to travel five years into outer space and find myself on a planet effectively the same as earth, albeit where everything seems to have run afoul of a serial fluoro-bomber. Avatar was supposed to be an act of unbridled imagination, but it turns out that the presentation is as uninspired and derivative as the story element. Walking out of the cinema I was convinced that I'd seen all this before, but when I tried to think of specific films I couldn't. That was when I realised that it wasn't film that Avatar was borrowing from, but video games. I'd been watching Halo vs Warcraft for the past three hours. Thinking back on the audience, however, I had to give credit to the sleight of hand being played there, because the majority were clearly not people who would be at all familiar with the gaming medium. For them, Avatar would be unlike anything they had ever seen before, just as promised. Film aficionados would find a few other things a little too familiar though, such as some iconic creature noises lifted from Jurassic Park, and at least three musical cues in common with Cameron's own Aliens.
It is a common misapprehension that the imagination has no limits, but even granting the point, common sense dictates that you shouldn't borrow elements from other high-profile films and leave them undisguised when being unique and incomparable is one of your main selling points.

A Science-Fiction Renaissance 

Apparently I missed the obituary proclaiming the death of science-fiction film, but it doesn't matter because I was told that Avatar was about to bring it back to life, bigger and better than before. What we ended up with was certainly bigger, but I'm hesitant to even suggest that it was the same creature that was put to death, let alone entertain the idea that it's necessarily better for it. Perhaps I have too pedantic a criterion regarding what falls within the bounds of science-fiction, but to me Avatar had no more science in it than something like The Lord of the Rings (which, you'll note, is more or less faithful to Newton's laws of physics). Spaceships and aliens do not a science-fiction film make, nor does the act of simply setting something in the future. For me, the defining quality of science-fiction is the presence of plausible, scientific speculation that contributes to the story in a meaningful capacity. This does not mean that every facet of the technology need be explained or even explicable based on current scientific understanding, but it does require a certain respect for plausibility. Inventing a fictitious mineral, making it the entire motivation for the narrative, and then calling attention to the very implausibility of that element with a name like 'unobtanium' crosses the boundary between genre convention and parody. Fantasy makes allowance for the presence of the inexplicable, as does science-fiction; the crucial difference is that science-fiction must at least make a pretence toward explaining it. Any legitimate speculation in the film, which is primarily restricted to the human side, consists simply of things we've seen before (drawing particularly on Cameron's Aliens), and is hardly the catalyst for a science-fiction renaissance. Without spoiling the conclusion, Avatar puts the final nail in its coffin by relying on the most literal form of deus ex machina.

Avatar is by no means a terrible film, nor even a bad one. Perhaps its only real transgression is promising something unique and revolutionary, when it ultimately delivers only mediocrity. Issues with the 3-D technology aside, James Cameron has achieved one of the most flawless visual presentations of an imaginary world in the history of cinema. The problem is that so much time and so many resources were given to material that proves unworthy of such treatment. It is unique only insofar as it combines elements that have never been put together in such a way before, but for each of those elements a better, fuller treatment can be found elsewhere. It is the epitome of perfect execution, unfortunately dedicated to the realisation of a pointless enterprise.

1 November 2009

Exposition - A Take on Alternatives

There was a time when the feature film conveyed a sense of permanence, as if the story you were watching had been set in stone. Indeed, of the mere hundred years or so during which the medium has existed, this has been the case for all but the last third of that period. As a child of the eighties, I have only a very small recollection of this time, and yet it is an era which remains powerfully vivid in my mind. Oftentimes the only inkling you had of an upcoming film would be the preview shown before another feature. Occasionally there would be a poster or cardboard diorama set up in the cinema three or four months in advance. On rare occasions there might be a brief article in one of the larger newspapers suggesting that filming had commenced on a sequel to one of the successful films from two or three years ago. Whatever the case, when I did eventually settle down in that mildew-scented cinema and waited for the red curtains to draw aside in their jerky fashion, it never once entered my mind that I would be seeing anything but a perfect vision. Hollywood was just like Willy Wonka's chocolate factory. The inner workings of the industry remained largely unknown and mysterious, clouding popular perception to a point where it seemed inconceivable that a film could have ended up in any state other than the way we saw it on the screen. Without any knowledge of the politics, disputes, and outright failures going on behind the scenes, films always seemed to emerge with an unearthly, somewhat eerie sense of immaculate completion.

"We are the music-makers, and we are the dreamers of dreams"
Looking back over the past twenty years, it is startling how much the situation has changed within such a short period. And the most appreciable influence behind this shift is, of course, the advent of the internet. While there has always been a subculture of enthusiasts who have made it their business to apprise themselves of all the news and inner workings of the film industry that their circumstances would allow, the mass transferral of information afforded by the internet has seen an unprecedented increase in the amount of available material and the number of people with access to it. Time was when it actually meant something to say that you had an interest in film, the hallmarks of which were a basement or attic crammed to the brim with film cases and projection equipment. Cut to the present day, where saying that you like film is akin to expressing your interest in food or simple respiration. What was once the sole province of a dedicated subculture has become a staple of culture at large, to the point where virtually everyone these days would classify themselves as some species of film enthusiast. In fact, one might say that very little is left of the original audience, that great horde which used to include the cinema as part of their weekly or monthly entertainment, to be consumed, enjoyed, and then largely forgotten. When a film screens these days it is not met by the humble consumer, but ruthlessly dissected by a throng of self-proclaimed, self-righteous critics and technical experts – yours truly included.

With film no longer perceived as a kind of immaculate vision, the general attitude has shifted toward a 'work-in-progress' mentality: the seamless white sheet now exposed as the patchwork conglomeration it really was all along. Indeed, with the proliferation of insider information now blazoned on the web, the attitude toward modern films is very often determined before production even begins – the seams picked apart before the industry even has a chance to lay out the template. Increasingly, the result of this pressure seems to be a frantic attempt on the part of film-makers and studios to placate vocal consumers rather than maintain their original vision. More often than not, however, the result is a bastardised hybrid of creative concept and reactionary marketing that fails to satisfy either party. But even if we rightly identify the internet as playing a significant role in this shift, is it really the only, or indeed the original cause? By the same token, can we really blame the millions of film enthusiasts – whose curiosity perhaps outstrips their tact and diligence – for the increasing number of creatively compromised films, or does the fault really lie on the industry side?

To an extent, I think that film-makers have actually brought this situation upon themselves. On the studio side, nothing but greed can account for the recent trend toward approving sequels before an original instalment has even been tested in the market. The considerable upsurge in remakes has also led to a climate wherein films lend themselves to direct comparisons, with the inevitable conclusion that one must necessarily be bad if the other is deemed good. Over-saturation is perhaps the best way to describe the current phenomenon, and its origins can be traced all the way back to the era of my idealised childhood. You see, with the advent of (relatively) affordable home-viewer technology, such as Laserdisc and VHS tapes, the general populace began to see tangible proof of an element of film-making which had always existed but, until then, generally remained within the realm of urban legend: the extended cut. The famously absent spider pit sequence in the original King Kong is one of the best known examples, but it wasn't until the early eighties that extended cuts began to be widely disseminated, triggering a fundamental shift in the attitude toward film. The infamous Caligula – released in 1979 – is a tempting candidate upon which to pin the role of popularising alternative versions. The controversy surrounding its content certainly propelled it to a level of intense public scrutiny, but the details of its troubled production disqualify it to some extent on a technicality. After all, it wasn't that director Tinto Brass was issuing multiple versions of his own volition, more that his Caligula was competing with a version augmented by Giancarlo Lui and Bob Guccione, as if they were entirely separate films.
 

It wasn't until the mid eighties, when cable TV stations began to show an extended version of Michael Cimino's Heaven's Gate, that the public was really exposed to the idea that a director could favour a version of a film which was substantially different from the cinema release. It was this particular example which also brought the term Director's Cut out of its original context as part of the film-making process and introduced it to the consumer vocabulary. Even so, for the most part the eighties continued under the assumption that what you saw on the screen was what you were ideally meant to get, at least until two of the biggest names in modern film-making unwittingly pioneered what has since grown into a staple of the industry. As fate would have it, 1992 saw two of the most influential science-fiction films reissued to audiences, with Ridley Scott releasing a Director's Cut of Blade Runner, and James Cameron offering a Special Edition of Aliens. The ramifications for the home-viewer were nothing short of revolutionary. Suddenly it wasn't simply a matter of whether you liked the movie or not, but which version you regarded as being superior. Films could now be compared with themselves, creating a schism which – regardless of where you fell on the issue – denied any concept of an immaculate, definitive version. With the cinema complex still top dog in the market, however, the issue was still largely relegated to the home-viewing enthusiast.

While these examples certainly acted as harbingers on the horizon of future possibility, they were destined to remain something of an oddity through to the late nineties. As pillars of the industry, Scott and Cameron had opened minds to the possibility that great films could exist in an alternate form which was even arguably superior. However, each had come about through a set of particular circumstances, with neither director exhibiting any desire to make it a staple in their creative repertoire. It took another pillar of the science-fiction catalogue to really set off the boom industry in alternate versions, a dubious honour that fell to George Lucas and his 1997 theatrical release of the Star Wars trilogy in a revised Special Edition. This time there was no escaping the schism, and to complicate the issue further, this divide would have generational implications. Where Scott and Cameron had been content to allow their alternate versions to stand on their own merits, Lucas took the contentious step of weighing in heavily, even overbearingly, on the issue. His intent with these re-releases was to establish a definitive version, taking advantage of technology which was not available to him at the time.
 

For those who had embraced the original Star Wars trilogy it was simply too bad – the old films were being put out to pasture, with the openly professed intent that they might be forgotten as a new generation grew up in the shadow of the revised iterations. The older generation had, unfortunately, fallen for an incomplete, blemished version, but now it was time to upgrade to something certifiably superior. Needless to say, many resented the implicit suggestion that they had been duped into embracing a flawed vision, and doubly so now that the unconditional love they had shown for this cash cow had put enough money in the farmer's coffer that he was at liberty to take it out the back and shoot it. A precedent had been set and a means of increasing revenue simultaneously proven, and it was not long before others took the idea and ran with it, including the unimaginatively titled release of The Exorcist: The Version You've Never Seen. With the subsequent advent of the DVD format and its ability to convey data in a non-linear fashion, the market in alternate versions expanded rapidly from this point on, delivering a bastard child in the form of endless reissues and repackages.

As always, however, it only takes a few glorious triumphs for us to forgive a trying campaign of misadventure, and in the realm of alternate versions there have certainly been a few films that seem to justify the phenomenon. The 2003 release of Alien 3 in Assembly Cut form, for instance, offers an approximation of the film we might have seen had a young David Fincher not been encumbered with a terminally ill production from the outset. As a mark of dedication to the original source material, Peter Jackson put considerable effort into an Extended Edition of each instalment in The Lord of the Rings trilogy, providing an affirmative example of how to successfully negotiate the rift between the casual and dedicated audience. Deserving the most praise of all, however, is Ridley Scott, who has recently returned to place his indelible mark on the phenomenon which he ostensibly created with a truly transformative Director's Cut of the otherwise deeply flawed Kingdom of Heaven. Since then he has delved into the depths of his back-catalogue to reissue Blade Runner in a deluxe set, delivering not only a revised Final Cut, but a staggering four other versions in a definitive sign of support for the dedicated enthusiast. As such, while we may lament the advent of alternative cuts for unleashing a marketing farce in recent years, it is certain that we would never have experienced some truly spectacular film evolutions had the phenomenon never taken root.


Ridley Scott finishes what he started with Blade Runner

15 September 2009

Exposition - The Price of Interaction


Shortly after tackling the subject of game-to-film adaptations, I began to ponder something. If you were to compare my collection of films and videogames you'd find them approximately equal in number. And yet, if you were to calculate the difference in their initial retail value, you would probably find the games weighing in at least two, more likely even three times the cost of the films. Now I'll admit that my knowledge of the finance and production side of the film industry is very limited, and more so for game development, but even so, I cannot reconcile myself with the notion that this should be the case. As we shall see, none of the inferences that can be drawn with any degree of certainty offers compelling evidence in defence of such a pronounced discrepancy. To my mind it is simply a matter of evaluating the issue according to three simple points of comparison.


Investment

If we use my personal library as an example, we've established that the ratio of games to films is fairly balanced, while the initial retail value of the former is probably close to three times that of the latter, based on the fact that the price of a premium game title is generally three times that of your standard two-disc DVD at their time of release. If we were to add up the budgets invested in each title and then compare the two forms of media, however, I have no doubt that the discrepancy would be reversed, perhaps to the same order of magnitude, or even more, in favour of film. Whether it is due to a genuine lack of availability, or simply the product of lesser consumer interest, information regarding the size of a typical game budget is far less accessible than that of film, which is often the subject of widespread promotional use. Allowing for the inevitable exceptions, however, I find it difficult to imagine that the typical game budget would be on par with the sixty-five million dollar figure generally attributed to the average Hollywood production, let alone in excess of it. This is not even taking into account the matter of mammoth, blockbuster-aspiring budgets which inevitably become must-haves for your average movie enthusiast. Glancing over my own collection, the likes of Bram Stoker's Dracula, Kingdom of Heaven, and the Lord of the Rings trilogy would have to come fairly close to balancing out my entire game library on their own, as far as representing investment value is concerned. Adding further weight to this argument is the fact that the price of a typical two-disc DVD release fluctuates very little, regardless of the actual film budget, and quite often moves in inverse proportion to it. When you consider the initial cinema release, the point becomes still more pronounced, for it costs no more to see the latest big-budget extravaganza than it does to attend a small independent feature.
Insofar as investment value is concerned then, there seems no reason why games should cost any more than your average DVD, let alone nearly triple the amount.

Expenditure

Comparing raw budget data is all well and good, but it's only fair to point out that the development of interactive media is very different from that of film, and thus incurs some unique expenses. One of the most frequently cited by game developers is the cost of development kits. With the industry dominated by three key players – Microsoft, Sony, and Nintendo – the purportedly steep cost of the proprietary software required to make games for their respective consoles is largely unavoidable. In addition to this is the necessity of licensing and tailoring existing game engines to suit each specific project, or else investing the additional time required to build one from scratch. Then there is the fact that development time is often a good third to a half longer than that of your typical film, if not several times more. All of this must be taken into consideration, and yet, when we really do so, these differences amount to very little. Yes, game development involves a number of costs and considerations peculiar to itself, but then so does film. The time from pre-production to final release may be longer, but even the largest game developments involve far smaller crews than your typical film production. The cost of proprietary software may be an unavoidable expense, but this has its equivalent in the film industry too, including everything from the physical film and camera equipment, to proprietary editing software and the necessity of distributing through private cinema companies. The cost of subsequent distribution can be dismissed outright, as both games and film use exactly the same media, be it DVD, Blu-ray, or digital. Indeed, the game industry is generally able to avoid some of the more significant dents in a typical film budget, such as the cost of first-billing actors, the necessity of location fees, and the astronomical price of insurance at every level of filming.
As such, the extent to which the game industry bewails its tenuous profitability on the basis of its unique expenses is a little hard to swallow.

Duration

When you consider that the average film runs for about two hours, being able to play a game for ten to fifteen hours for only three times the price suddenly seems more reasonable. The way that games are marketed demonstrates that the industry is aware of the power of this suggestion, hoping that, while games are not cheap, the consumer will nevertheless feel they are getting good value. When you apply even a little critical thought to this principle, however, it is immediately exposed as a fraud. In order to illustrate my point I would like to turn your attention to the image below. There we see three forms of media dealing with the same license: a film, a television series, and a game all based on the popular Terminator franchise. Despite widespread criticism of the game for its short length, someone intending to buy the upcoming Terminator: Salvation DVD might still justify the full retail price of the game with the assurance that they are getting at least two-and-a-half times the duration of the film out of it, meaning that they cost roughly the same to experience per hour. First of all, this ignores the obvious fact that most people do not buy a DVD when they only intend to watch a film once. Secondly, how do we apply this measure of value to the television series, which extends to roughly twice the length of the game, and five times the length of the film? Surely if the measure of value is duration, then longer films should be more expensive, or else priced according to the number of times someone is likely to watch them. The principle is even less convincing when it comes to games, for there is no consistent ratio between the amount of effort and expenditure invested in a title and the amount of time it is likely to be played by the consumer. If there were, something as simple as Tetris, with no real limit to the experience, should cost more than the Terminator: Salvation game.
For that matter, how could one compare a sandbox game, where you are simply presented with a set of tools and then allowed to entertain yourself, to another where meticulous effort has gone into crafting a story of more limited duration?


For an outsider, with little real knowledge of the game industry, it is difficult to pinpoint where the profit from the exorbitant cost of games is going. However, with Sony openly admitting that it incurs a loss with every Playstation 3 console it produces, and Microsoft able to absorb the cost of an endemic failure rate in its Xbox 360, it is certain that the profit margins carved out of software sales must be playing a significant role in keeping their operations viable. The seemingly endless cycle of acquisition, merging, and disbanding of game development studios, despite the continued profits of large publishing companies, is also conspicuous, even where specific titles have performed admirably on the market. And while the developers may deserve our sympathy for having much of the risk and little of the reward for each project passed squarely onto them, it is the consumer who ultimately pays. Indeed, the only point of difference that appears to have any real merit in the price discrepancy between games and film is the size of the market. Though they are cheaper to make, games simply don't have the same market share as the lucrative film industry, so in order to derive similar margins, the price is set universally high. In effect, people who buy games are paying a subsidy to the industry on behalf of all those who don't, which has the subsequent effect of discouraging new consumers. Still, if everyone woke up tomorrow a certified gamer, would anyone seriously expect the price of games to fall accordingly? There is simply no incentive to lower prices when so many are prepared to pay.

10 July 2009

Exposition - Dystopia Myopia

I have an aversion to buzzwords. This aversion stems from the fact that they take perfectly good, functional terms and debase them to a point where they no longer have any specific meaning. Once they enter the popular lexicon it's virtually impossible to check their progress; isolated cases of misapplication rapidly instigate a domino effect, and before you know it the word has lost much of its original value. An earlier article on the recent proliferation of the term 'reboot' covered some of this ground – although that case is unusual in that there seems to have been no consensus on what the word meant in the first place – but I feel an unconquerable urge to return to the subject in relation to the misuse of a far more enduring term: the word 'dystopia' and its various derivatives. This is an evocative word, and one that people are understandably fond of using, but increasingly I seem to come across instances of application which dilute or misrepresent its rather unique meaning. More than anything else, this article is intended to be a personal rationalisation, explaining what the term dystopia designates to me as an individual, with the hope that it might inspire others to do the same.

Without resorting to an encyclopaedic citation, there is some background information which is essential to this discussion. Briefly, the necessary points to bear in mind are a) that the word 'dystopia' was coined as an antonym of 'utopia', b) that the word 'utopia' was itself not coined until the sixteenth century, in a philosophical work of the same name by Sir Thomas More, and c) that by the time 'dystopia' was coined, the generally accepted meaning of 'utopia' had diverged from its own original meaning. The last of these points is critical only insofar as it negates the viability of looking directly to More in order to define the meaning of 'dystopia'. Otherwise, it is sufficient to acknowledge that there are two definitions of 'utopia': the first relating primarily to its original use, and the second, more commonly accepted and widely applied meaning. Of these two, only the latter concerns us here. So what does 'utopia' mean?

Generally, when we use the word 'utopia' it carries connotations of perfection, total fulfilment, or an ideal state. The last is particularly important in regard to its alternative possible meaning: not as a static mode of being, but as 'the State' of proverbial socio-political significance. The word 'utopia' is not synonymous with others like 'paradise', which tend to suggest a simpler way of life, devoid of negative aspects because they forgo complexity. Paradise is certainly what any true utopia would deliver, but the means of reaching that goal relies on perfecting each individual facet in a finely-tuned machine, rather than dismantling it in order to get back to basics. In other words, a utopian world is one in which society works flawlessly for the satisfaction and betterment of its people, at once delivering and being an ideal state. Utopia is thus a means of attaining paradise, but not all paradises are utopian. The paradise of the Bible, for instance, is not utopian, if only because it takes more than two individuals to constitute a society. So what does 'dystopia' mean?

First of all, it's worth noting the unusual spelling. The much more common dis- prefix would result in a negating reversal of the term 'utopia', but instead we have the rarer dys-, which carries a much stronger connotation of actual negative traits, rather than simple mirror opposition. 'Dysfunction', for instance, denotes the aberrant operation of a system or machine, not the cessation of function itself. Indeed, regardless of the actual etymology of the term, it would not be improper to treat 'dystopia' as a contraction of 'dysfunctional utopia', which suggests a much clearer definition without altering the proper meaning. If we define a utopia as a social system which promotes the value and well-being of its citizens, then a dystopia is surely a similar system whose function produces the equivalent negative effect, i.e. the exploitation and devaluation of individual rights for the benefit of the state. Just as there must be a society in order for there to be a utopia, the same is also true of a dystopia. At their heart, each is quintessentially defined through the relationship between individuals and a state or social system of which they are a part, the latter existing for the benefit of the former in a utopia, and the former used to serve the latter in a dystopia. So what are some examples?

A bleak horizon, but is it dystopian?
One of the more frequent misapplications in recent weeks has been in relation to the Terminator franchise. If we use the definition outlined above, however, the error is clear for two reasons: a) the pivotal event of Judgement Day obliterated much of human society, and b) the small numbers who did survive have actually formed a social order in which the value of human life is paramount. While these survivors certainly endure a harsh existence, as part of a martial society necessitated by the threat of annihilation, theirs is nevertheless a system that not only works but also affirms the value of each human being against that of their enemies, the machines. This is representative of by far the most common misapplication of the term dystopia, whereby it is perceived as being synonymous with any post-apocalyptic world. The world of Judge Dredd, by contrast, is a proper example of both, where catastrophic events have resulted in the creation of huge, overpopulated enclaves. The ensuing upsurge in conflict and crime sees the introduction of a harsh system of justice arbitrated without the right of appeal, with the end result being a pronounced decline both in the quality of human life and its perceived value.

A dreadful vision of the future
It's also important to bear in mind that fictional worlds need not be either dystopian or utopian in their entirety, and yet may still feature one, the other, or both. The Star Trek series is perhaps the readiest example, for although it depicts a universe in a state of ongoing inter-species conflict, human society on its own has made great advances toward a state of utopia by abolishing money, achieving real equality in terms of race and gender, and embracing a system of utility whereby an individual's role is dictated by their aptitude and abilities. The original Stargate film depicts the reverse, where a realistically drawn modern-day team makes contact with a human society utterly bent to the will of a despotic alien overlord, whose efforts to maintain that oppression include banning literacy altogether. The truth is, the number of dystopian or utopian societies that can exist in any work of fiction is limited only by the number of distinct states or equivalent social orders which feature in it overall.

An exaggeration of the present day
Of course, there is nothing which dictates that a society need be either wholly utopian or dystopian. It is more accurate and far more practical to imagine a delineated scale extending between these two poles, with each example falling more or less to one side or the other. This brings me to what will no doubt prove to be a controversial case, which is the film Blade Runner. In terms of presenting a speculative vision of the near future, this film is almost without equal. Unfortunately I cannot be as resolute about the common perception that this vision is dystopian. What we see of the year 2019 is certainly far from utopian, but so fleeting and restricted is this exposure that we should hesitate before leaping to the opposite conclusion. It is a world in which morally bankrupt corporations are shown to unleash horrors upon the general populace and their own creations alike, of over-crowded streets bathed in the garish light and incessant noise of mass-advertisement, filled with poverty and greed, injustice and suffering. But is this substantially different from the world we live in today? In fact, one could argue that the core of Blade Runner's success in presenting a captivating, believable future derives from the fact that it merely reflects current society, with all its positive and negative attributes, and magnifies them. The corporations may be bigger, the buildings taller, the streets more crowded, but in terms of the underlying dynamics it all feels so intuitively familiar.

16 June 2009

Exposition - Reboot the World

When the first pseudo-announcements began to circulate concerning Predators going into pre-production there was a degree of consternation among fans of the existing Predator films. Part of this was simple incredulity that anyone should wish to invest in the project, considering the dubious quality of a script that had been in open circulation for almost fifteen years. On more objective grounds, the greatest field of speculation involved the ambiguous, somewhat conflicting statements being made about the nature of the project as a whole and its intended relationship with the existing films. The problem, it seemed, stemmed from the range of possible interpretations invited by the repeated use of a single term, and that term was 'reboot'.

Was the 1987 classic going to be pressed into the increasingly populous ranks of films slated for a remake? Was it going to retell the story of an elite military rescue team in the jungles of Central America with an extraterrestrial hunter on their tail, or would it follow some new premise, intended to override or somehow erase the existing films? Was it simply going to be a new, disconnected episode in an ongoing series, as the second film had been to the first? Subsequent comments would resolve the matter somewhat by including the term 'sequel', among other nebulous qualifications, but this still raises the question of what, if anything, the term 'reboot' actually denotes.

Resumption or replacement?
As with any new medium, the evolution of film and filming techniques during the past century gave rise to a new technical jargon, a large part of which has also migrated into popular usage. In some instances, existing terms have accrued connotations alternate or additional to what they signified prior to the invention of cinema, some examples being 'cut', 'frame', and indeed 'film' itself. Others, like 'slow-motion', 'close-up', and 'freeze-frame', owe their currency almost entirely to a common familiarity with the techniques of film-making. The advent of the DVD format in the past two decades has given rise to, or perhaps merely coincided with, yet another proliferation in film-centric jargon. This time, however, the impetus has largely come from marketing concerns. Some of the more prominent examples are terms such as 'director's cut', 'special edition', 'original vision', and 'definitive version', whose actual meaning is difficult to define, owing to a culture of liberal interpretation or outright misuse.

Even amidst such a pervasive glossary of dubious provenance, the term 'reboot' somehow seems more vacuous and prone to confusion. In this instance, however, the burden does not seem to rest on the usual field of post-production marketing, for I've yet to encounter the term emblazoned in bold type on a DVD case. Indeed, as the Predators example readily attests, use of the term appears to be restricted primarily to the relatively brief period between first announcement and the beginning of actual pre-production, and therefore most likely springs from the mouths of over-enthusiastic producers and other studio representatives, or else the media outlets reporting on these early scoops. And while identifying the source of this recent phenomenon is worthwhile, it still brings us no closer to understanding what the term actually denotes.

By far the greatest impediment in this pursuit is that there simply doesn't appear to be any consensus, and certainly nothing in the way of consistent application. Aside from the technical definition, the first time I encountered the term 'reboot' was in reference to Batman Begins. In this particular context the insinuation was more or less understood to convey the fact that this instalment was not a sequel or in any way related to the series of four films that began with Batman in 1989 and culminated with Batman & Robin. As such, the term presumably derives from its use in the field of computing and electronics, where a 'reboot' is understood to be a means of returning a system or device to a default state, removing any undesirable deviations that may have taken place during its operation. Even at this point the analogy is less than satisfying, and in the wake of Batman Begins and The Dark Knight the situation has continued to deteriorate.

Confusion ensues
What degree of difference, for instance, delineates a reboot from a remake? If a break in narrative continuity is the determining factor, couldn't the 1989 version of Batman be considered a reboot of the earlier production starring Adam West? Is it enough to call something a reboot simply because it reinvigorates a franchise that has long lain dormant? Would that make the recent Battlestar Galactica series a reboot, despite the fact that it maintains a tenuous continuity with the original series? Is it actually possible for a reboot to function as a sequel or prequel as well, and what would that make Caprica? Surely the only justification for a new term entering the lexicon is that it can impart a clear and concise understanding of a concept that current jargon cannot adequately convey. Any addition that is not a step toward efficiency, either in terms of the concept or its communication, is necessarily a step toward redundancy, at best, if not outright confusion.

The very fact that we need to ask these questions suggests that the term 'reboot' utterly fails to satisfy these criteria. Unless the industry or the media are prepared to assert a unanimous definition, therefore, there is little ground on which anyone can defend its continued use. If there are adequate terms to describe the different scenarios outlined above, they should be used; if not, then clearly there is a deficiency in the lexicon, which can only be addressed with concise and meaningful candidates. As it stands, the term 'reboot' is patently inadequate for the task.

- UPDATE -

In a timely reinforcement of the conclusions I reached two years ago, ongoing Predator series producer John Davis recently had this to say in an interview conducted with Collider:
Collider: You started with Predator and I really dug the reboot you did with Robert Rodriguez. How successful was that for the studio in terms of maybe making another one? 
Davis: You know, those Predator movies... Tom Rothman said this to me, “Man, they all seem to make money.” I get a big check every year on my net points off of the original Predator. You know how hard it is to get net points on a studio movie, right? It was hugely profitable. It far exceeded its revenue on DVD than in theaters by three or four times. 
Collider: You’re talking about the original, right?

Davis: Yes. I talked to Arnold about rebooting Predator and doing something in terms of that. I think in terms of right now, it needs to rest for a couple of years.  I can’t see why if we can’t be clever we can’t reinvent it again.

Collider: I actually really dug it. I thought Rodriguez did a great job with it.

Davis: Yeah. It was really fun.

Collider: It was really good. That’s why as a fan I’d love to see another one.

Davis: Yeah. He changed the setting. He put it on another planet.  You have to keep changing the setting.  You have to find a clever way to do it.  If we were going to do it with Arnold; it was like, “Does it make sense to go back and to put him with a young team?” So maybe it’s 20 years later, you have retired, and you are the one person who has survived one of these encounters. Is that a reboot in the fact that you are in it with a group of young guys? Is that a reboot? You just have to figure out a way to reboot it. Rodriguez rebooted it. It’s all in the planet. The sequel to the first one rebooted it. We should’ve had Arnold in the movie. The deal broke down over $250,000, which is a shame.  But it was moved from the jungle to the city. You have to create a freshness about it. When we did Alien vs. Predator we kind of rebooted it because we put the two pieces together.  You just have to give it enough time to come up with a new freshness.
So there you have it: a media representative and an over-enthusiastic producer throwing the term 'reboot' around with such abandon that it's clear neither has a firm idea of what they, let alone the other, mean by it. And apparently every iteration in the Predator series has been a reboot, so it isn't even correct to think of it as a series, but rather as an ongoing process of reiteration. Once again, I renew my call for this term to be struck from the lexicon.

1 June 2009

Exposition - The Blame is Afoot

Many of you have doubtless reached the same conclusion I have, which is that the normal processes of common sense and rational thought don't seem to fully apply within the film industry. Matters of individual taste and personal opinion notwithstanding, it is no great challenge to compile an objective catalogue of irrefutable lapses in judgement. This extends from the relatively minute, such as specific dialogue and individual scenes, through matters of increasing magnitude, such as casting decisions and narrative flaws, to the very top, where we cannot help but question the very existence of the project as a whole. Indeed, the film industry has always been rife with instances of questionable judgement, where our natural inclination to wonder precisely what they were thinking yields no rational or satisfying answer. While we might recognise this as an unwelcome truth, however, this should not be taken as grounds to complacently accept it.

The recent release of a trailer for the Guy Ritchie adaptation of Sherlock Holmes presents a case study in precisely this sort of faulty reasoning. And while my ire is perhaps more keen in regard to this particular example, owing to the fact that I am a devotee of the original works by Arthur Conan Doyle, I am resolved not to let this article devolve into the pointless nit-picking of a wronged fan. Such an engagement would, in fact, obscure the much deeper and more resounding argument to be made against projects of this nature. In order to reach that point, we should begin by looking at how the powers-that-be might conceivably believe that such a project would make good sense in the first place.

An immediate point of attraction for any project dealing with Sherlock Holmes is the instant, albeit vague, recognition carried by the name. As far as fictional celebrities are concerned, the extent to which this particular character has infiltrated the common cultural milieu is almost without parallel, with the likes of Dracula and Frankenstein perhaps coming closest. Mention the name in any setting and you are almost certain to evoke the mental image of a hawkish man in a chequered tan cloak and deerstalker hat, and a misquotation so ubiquitous that I need not even set it down here. In terms of fostering awareness for a project, Sherlock Holmes is a marketing department's dream come true: he requires no set-up or explanation, simply because the name commands immediate recognition.

What this reasoning doesn't take into account, however, is the fact that there are, in the broadest terms, two distinctly different incarnations of Sherlock Holmes. For those who have little or no direct knowledge of the character in literature or film, the Holmes archetype is nevertheless such a pervasive figure in popular culture as to leave many unsure whether he is fictional or an actual historical personage. Falling into the opposing category are those with some degree of familiarity with the source material, either gleaned directly from the original literature or pieced together through the multitude of film and television adaptations, whose fidelity varies significantly. To the former group, Sherlock Holmes is a memorable quip and a mental image, while to the latter he is a more rounded character, with a defined personality and other distinctive traits. Both groups will instantly recognise the name, but there is a vast discrepancy in how each will respond to departures from the established material.

With any adaptation there comes a point where straying too far from the source material begins to raise questions about the validity of maintaining any link at all. A familiar name or two sown throughout an utterly unrecognisable field does not an adaptation make, and the project is thereby denied an opportunity to stand on its own merits. As a fan of the Arthur Conan Doyle stories I would happily subject myself to The Adventures of Silas Harper: Consulting Detective and Amateur Pugilist Extraordinaire, and probably enjoy it as much as I did the popular resurrection of The Mummy, but not if the project attempts to pass itself off as Sherlock Holmes, with no justification other than expedient marketing.

In short, the majority will probably not notice, the purists will resent it, and those who fall somewhere in between will make up their mind based on the individual merits of the film. And this is where we encounter the inherent flaw in this kind of approach, for in employing a figure so widely recognised, it is those to whom the name signifies the most that are also most likely to resent the radical departures. To argue that recognition of the name itself somehow offsets this outcome is a total nonsense. The first Indiana Jones might have easily passed itself off as a 'reinterpretation' or 'modernisation' of the established Allan Quatermain character, in order to bolster the level of recognition inherent in the name. The fact that this wasn't done means that dedicated Quatermain fans are afforded sufficient leeway to approach the film as a simple homage, rather than incur the backlash of being seen as a rival or usurper of the name.

Ultimately, this kind of approach boils down to a rather cynical and blatantly condescending attitude toward the common audience, assuming that people will only give credence to an established name and are therefore incapable of discerning the value of an original premise based on its own merits. Indiana Jones and countless other examples indicate otherwise.