Coming Off As A Bit Of A Tool



How Not To Tell A Story

The art of storytelling is dead. Long gone are the days of sitting in a circle, perhaps around a warm campfire, listening to tales passed down from generation to generation. Nowadays it seemingly takes a professional to string together sentences in a way that entertains, rather than annoys, frustrates or evokes swift and justified violence against the speaker. Now, I don’t claim to be an expert on storytelling, and I’m certainly not above scrutiny. However, as I listened in on a slew of random conversations around campus, I came up with four pieces of advice that anyone who plans to open their mouth at some point in the future must heed, lest fire rain down uponeth their souls.

1. Some stories are so amazing they must be told. Most aren’t.

Sometimes in life, something amazingly unexpected happens. Stars align, and the universe twists coincidences into a worthy narrative. But while this does happen, it is not as frequent as commonly thought. Most true stories are predictable and boring, and should not be told. So you cooked a lasagne whilst watching The Price is Right reruns? Whoop-dee-doo.

Similarly, avoid dream retellings at all cost. It is guaranteed that nobody listening will want to hear about the time you dreamt you were in a boy band with Mozart and the Queen. It might mean something very important to you. But it’s just you.

2. “And then I was like…” does not equate to “I said…”

You’ve all heard it.

“So he was like, you know?”

“I know, right?”

“And I was like, omg.”

This second problem is common among 10-to-30-year-olds, and runs rampant in the recaps friends give friends of conversations they have had with others. I’m not going to lie; I’ve found myself under its control at times. I understand that repeating the words ‘so he said’ and ‘so I said’ can become irritating, but that does not excuse the fact that if someone says something, they say it. They do not resemble it.

3. Never misuse the word ‘literally’

This is not literally the worst thing someone can do, but it’s up there. In my imagination, a group of philosophers and wordsmiths came together at some point in the distant past to have an important meeting about the growing problem of exaggeration in speech.

“Oh no!” an unnamed member of the council proclaimed. “Everyone is using too many superlatives when what they are describing is clearly not as remarkable as they make it seem. What shall we do?”

After much deliberation, this council decided to create a new word. A virgin word, indestructible by its very definition. “Literally”. The rule was simple: use it only when you mean the very words you say. If something literally scares you to death, then you should be dead. If your heart has literally exploded with happiness, good luck with life insurance. I can’t imagine what the elders of the word council would have said had they heard that the word they created to solve the problem of exaggeration was instead encouraging it! They are literally rolling in their graves.

4. Once you’re done. Shut up.

So you’ve told your story. You’ve set it up, introduced the complication and then BAM. You punchlined the crap out of your audience. They are laughing so hard they look like they just can’t get enough. They have. The worst thing you can do at this point is keep talking.

Imagine you’re telling some friends the story of the time that you told a teacher he reminded you of watermelon.

“And then I said, ‘Hey teach, you remind me of a watermelon’,” you say, and laughter ensues from the friends you are telling. This is the end of the story. If you continue with such lines as “and then I laughed” or “and then he walked off and I sat down and ate a vegemite sandwich for 15 minutes until the bell rang and then I went to geography”, you’ve ruined the whole story, and everyone hates you.

Hopefully this advice helps some offenders of the above crimes, but I can’t say I’m hopeful. Perhaps it’s time to jump ship completely and become a mime.


Wordal Vomit


I am sad and alone. It could be either that I am sad because I’m alone, or that I’m alone because I’m sad. Nobody likes those who are sad, and people tend to leave them alone. The lonelier you are, the sadder you become, and eventually you find that you are a loner, and that makes you the saddest. Then you become a sadist, and then people really leave you alone. Either way, I am sad. I am alone. Two mutually inclusive facts that encompass my current state of being. My current state of living is New York: the loneliest state in this sad, bitter skin of a world.

Bitter skin. I haven’t yet. Though, I’m sure she would taste like grapes. She’s from Greece, after all. Found her at a penny arcade with old joystick grease covering her wine drenched soles. The sun was bright. And shot solar rays down to ripen her skin, so tender that I’d just have to bite soft to get to through her oily thin. To get to her souls.

Penny. Her name or some sort of word. I’d penned it into my diary of accumulated letters that sound like currency. AC DC. Here there are four. And one. Hear? From the look of her forehead she’d obviously won. A band of brothers, and brothels filled with slots for coins. Not loins, you fucker. Get your mind off the gutter. I haven’t yet. Though, I’m sure she would spill like lakes. Penny.

The queen of the race. Disgrace. A first for her fists. A fighter. I’d rather not though. I’d rather take her home. I checked her name in my book. Fame from a look. Just above Nickelback. I’d want my nickel back. She’s just worth a Penny. Penny. Not a Rocky. Though built like a stone. Stoner? No, but I’d like to own her. Another band of brothers.. Another fist in her face. There’s blood on her brow. I bow. Like a ship. Shit.

She’s down for the count. And out. I go to her side. Aside: I’ve banned all brothers from my lovers ride. My lovers stride. Lover? I haven’t yet. Though I’m sure we would mate like apes. Grapes. I kiss her hand to test the taste. What a waste. Another problem faced. Her face is bright like solar, polar. No. Not this time. Though I’m sure she’d….what was I saying again? I seem to have lost my train


I am sad and alone.



For the past few years, Soundwave has had some ass-kicking acts, but the ticket prices keep rising and the quality of the line-up keeps falling, so the one factor that stops me from going back is the ridiculousness that is the Soundwave Experience. Putting festivals outside was a good idea. At a festival, everyone can just go do their own thing. No more claustrophobic fits of suffocation, no more fire hazards and small venues. It’s just comfortable.


That being said, if the guy who put festivals outside is a genius, then the Soundwave organisers are brain-dead hamsters who get their kicks out of lighting themselves on fire. Their first mistake was putting Soundwave in the middle of summer. Their second mistake was everything else they attempted. They didn’t even attempt to solve the problem that is skin cancer. I can’t complain about overpriced water. It’s a rite of passage at public events. But I can complain when that water sells out within an hour or two.

It’s simple maths. A scorching summer sun, minus any form of shade, plus a crap-load of dust, minus the ability to leave until sundown, plus incredibly long lines for water, minus the availability of said water, all multiplied by the tens of thousands of sweaty, dehydrated and brutally sunburnt bodies screaming for the sweet, sweet relief of death. OK, that’s fairly complex maths, but it all equals hell on earth.

Add to that the fact that, at the start of the day, it took two hours to get into the place. They didn’t even delay the first few acts, meaning that if you weren’t in there by 9 on the dot (which was everyone), then your favourite band was playing to no-one. Add to that the fact that it took over three hours to get out at the end, as only one exit was open, and you’ve got yourself a disaster. Thank God it only happens one day a year.


Semiotics and The Apple Homepage

Apple Inc. (previously Apple Computers Inc.) is a well-known company that creates, markets and sells technological products. Like all companies, Apple has worked to create a recognizable image, or brand, for itself. Unlike other corporations, however, Apple is arguably known more for its brand recognition than its products. In the late 1980s John Sculley, a marketing executive from Pepsi, turned Apple into the biggest computer company in the world, increasing the advertising budget from the then $15 million to $100 million (Kahney 2002). This revolutionized the company and shifted the focus from the products to the company brand. Marc Gobé (2001), an advertising analyst, said ‘Without the brand, Apple would be dead. Absolutely. Completely. The brand is all they’ve got. The power of their branding is all that keeps them alive. It’s got nothing to do with products.’ Apple creates this brand to imply certain ideas about its products to the audience. Certain connotations are branded onto its products simply because they are made by Apple. But what are these connotations, and how are they being communicated? Apple, like all companies, uses signs and codes to communicate meaning to its audience, in a bid to convince the audience that they need Apple products. Apple, then, provides an excellent case study of how commodity signs are used through advertising to create connotations about an idea, and to create a unified image for a company.

A large part of Apple’s branding strategy focuses on emotion. Many associations spring to mind. These include ‘…lifestyle; imagination; liberty…innovation; passion; hopes, dreams and aspirations, and also power-to-the-people through technology’ (Marketing Minds 2008). These are interesting ideas, but how they came to be, and how they are communicated to the audience through commodity signs, is equally interesting. If we look at a facet of Apple advertising, these signs become apparent.

Take, for example, the Apple webpage. In the top left-hand corner, we see the recently redesigned Apple logo. This logo is monochromatic but has a glossy filter over it, and is in the shape of an apple with a bite taken out of it. This sign, like all signs, is made up of a signifier and a signified. On the simplest level, the signifier is the pixels and colours that create the image of the logo, and the signified is the concept of an apple. On this level, the relationship between signifier and signified is iconic. The shape is a mimetic representation.

However, on another level, the sign this duo creates becomes a signifier for its own signified concept. This, in turn, creates an additional symbolic sign. As Sandra E. Moriarty (2002, p. 20) reminds us, ‘only in…the symbolic sign…is the meaning arbitrary’. This means that our understanding of culture, and our presupposed ideas of the company, give us the meanings created by the sign. This sign, then, creates two sets of meaning. The first is the denotation of the Apple company, but the second is the connotation of the cultural identity of the company and its product. The shape of an apple may connote ideas of health, simplicity and sweetness, and the minimalist aspects of the logo complement those ideas of simplicity in the product. The logo is strategically placed in the top-left corner of the screen, as websites usually use this space for a ‘home’ button. Rather than a button labelled ‘home’, the Apple logo is used instead, giving the sense that the entire website was purposely designed to immerse the viewer in the ‘Apple experience’.

There are many other symbolic signs throughout the website’s home page. The use of negative space in the borders connotes a sense of cleanliness, as well as control and ease of use. In the middle of the screen is a slideshow of images depicting Apple’s various products, which change each time you click the Apple logo and refresh the page. Most of these items have touch screens, which connote a sense of freedom for the user, who is able to do anything they wish with the device. This idea is propagated through the use of text such as “Multi-Touch” and a variety of differently coloured devices. The heavy use of iconic signs representing human faces connotes connection between people, communication and, to a degree, popularity, with Apple trying to tell its viewer that if you buy the product, you will have higher status, and therefore more people to talk to. Some of the items are photographed from a high angle, making the product look small. This connotes the device as unobtrusive and within the control of the user. It also makes the product look masterable, simple and undaunting to those who would describe themselves as computer illiterate. It is interesting that one of the connotations of the Apple logo is power-to-the-people through technology, when it is arguably Apple that controls its products very heavily. Still, this shows that signs and connotations aren’t necessarily influenced by reality so much as by what Apple thinks will help move its products.

Apple knows the power of the brand, and changes its advertising accordingly. An example of this is the iPad-themed homepage. Rather than telling us what “The iPad is…”, it simply uses the words “iPad is…”. Albeit grammatically incorrect, this instantly changes the iPad from being simply a product to being a lifestyle.

An article in Marketing Minds (2008) said that ‘The Apple brand personality is also about simplicity and the removal of complexity from people’s lives; people-driven product design; and about being a really humanistic company with a heartfelt connection with its customers.’ This is very explicitly communicated through the multiple commodity signs, codes and visual themes present on the Apple website. Apple wants you to think that it sells high status, independence, communication, indie credibility, simplicity, style and, above all, happiness. Apple uses these commodity signs to convince viewers that their lives are cluttered and unmanaged, and that if they bought its products, they could ultimately be happier people. But ultimately, as Apple is dealing with symbolic signs, the power of the connotation lies with culture, and with what the public read into Apple’s products. As Joleen Deatherage (2009) puts it, ‘Apple is a great example of how people’s connections with brands transcend commerce. Less than two decades ago, the company was near death, but people said we’ve got to support it. By establishing an emotional connection with its customers, Apple proved a brand is built by its audience and lives in its audience’s mind.’

In this way, users become loyal to a brand. However, it is not the products that keep customers loyal (there are alternative products out there) but the connotations of the signs, codes and visual themes that together give Apple a memorable identity people can attach to.




Word Count: 1027


‘Apple’s Branding Strategy’ 2008, Marketing Minds, viewed 12 September 2010. <>

Deatherage, J 2009, ‘The importance of nonprofit branding’, Philanthropy Journal, viewed 13 September 2010.

Gobé, M 2001, Emotional Branding: The New Paradigm for Connecting Brands to People, Allworth Press, New York.

Kahney, L 2002, ‘Apple: It’s all about the brand’, Condé Nast Digital, viewed 10 September 2010. <>

Moriarty, S 2002, ‘The Symbiotics of Semiotics and Visual Communication’, Journal of Visual Literacy, vol. 22, no. 1, p. 20.





The Karate Kid

There’s nothing I love more than a good old-fashioned mosh pit. I love to invade the personal space of, and inadvertently sexually harass, any person squished up against me. I can’t argue with the fact that music is made to be moved to, and that everyone has their own way of expressing this. That being said, there is a certain type of gig-goer who takes dancing to its extreme; a regular concert crasher whom I will never understand. The karate kid.

You know this person. It’s the guy who loves to meet new friends by kicking them in the face. It’s the half naked guy who forms a little circle for him to punch himself out in. It’s the guy with three teeth left.

A wise old man by the name of Eminem once said, ‘Lose yourself to the music, the moment. You own it, you better never let it go’. Sure, Eminem has a valid point about the overwhelming power of music, but look closer at his words. Not once does Eminem encourage his audience to lose other important things, such as their tempers, clothing and all-round dignity. They kick, they scream, they run around in little circles doing tae kwon dork moves well beyond their belt colours. They seem to do everything BUT listen to the music. It’s as if all these guys’ daddy issues are manifesting themselves as an urge to beat down anyone within a metre radius. Now, I understand bottled-up emotions. I’m writing this, after all. But why can’t they do what I do? Cry into a pillow and hug my stuffed monkey till all the hate is gone. At least I’m not bothering anyone else. The general population does not hate me. Nevertheless, these kids keep coming back, forming more zombie pits, punching their way to happiness and causing more head traumas in the process. It turns out assault is all the rage these days.


Criticism Vs The World

So there I am, staring down the barrel of a loaded customer, asking myself the question, ‘What should I recommend?’ It’s the same story every night. I sit, separated from the public only by a window and a computer, with complete strangers asking me what films are worth watching.

In a way, working at the cinema makes me a makeshift critic. I see a film, think about its strengths and weaknesses, and weigh it up against other movies I’ve seen. Then I use that opinion either to persuade the general public to invest their time in an experience or to avoid it like the plague. But why do they ask my opinion, rather than, say, another member of the crowd waiting to buy tickets? My only guess is that they think of me as an expert. In their mind, if I work at the megaplex it only follows that I have seen a large array of films, and thus can tell them what to risk their money on.

But it almost never works the way it should. I supposedly know what’s hot and what’s not. I consider myself a film geek, and spend most of my time reading up on movie news, previews and reviews. I feel that I get a pretty fair understanding of the general consensus film critics share about a particular movie. One thing about this communal opinion amongst critics stands out, though. It almost never meshes with a film’s box office revenue.

It seems formulaic. The overall acclaim given to a film is quite often inversely proportional to the money it makes at the box office, and vice versa. If a film is panned, it seems to top the charts, whereas if a film is received well by professionals, it quickly fades into obscurity within the mainstream market. The recent Scott Pilgrim vs. the World is an excellent example of this. I know that what I truthfully endorse, the average western-suburbs moviegoer will be bored to tears with. I know that what I wouldn’t be caught dead watching will undoubtedly win the Teen Choice Award for best summer flick (à la Transformers 2).

So then what’s the point of film criticism at all? Who is it for?

In the lecture on film criticism, Noel King asked the audience a question: why is it that, in the past few years, people have become so interested in the top 10 movies, and in the box office takings of said films? My answer is that people have realized they can trust the generalized opinion of the public more than that of some artsy-fartsy culture man sitting behind a desk, typing to no-one about the finer points of the application of auteur theory in the recent mumblecore movement. They can trust the public more because they are part of the public. The quickest way for the busy type to judge a film’s quality without wasting time reading a review or, God forbid, actually watching the movie, is to trust the hundreds of thousands of people that saw the film. They voted with their money, in the same way the reader will vote with their own. This makes every Average Joe a critic, in the same way that I, as an employee of a cinema, am a critic, and anyone with a keyboard, Internet access and something to say is a critic. The old lamentation of ‘Everyone’s a critic’ has ceremoniously passed into the land of cliché. However, it has never rung so true. Democracy has infected criticism.

But classical critics will not take this sitting down. They will fight to the death.

Meaghan Morris, in her article ‘The Practice of Reviewing’, calls into question the separation between the critic and the ‘Average Man’. She believes that critics have historically held their opinions to be more important than the Average Man’s because they were intelligent, trained and experienced. They believed that their training had somehow refined their gut reactions to films, and thus made their impressions of a film superior to those of the untrained. In the critic’s mind, the general public are unfit to analyse a film, as they are too caught up in the advertising, hype and fandom surrounding it. In the lecture on interviewing, the lecturer mentioned that she thought it was important to be a fan of someone before interviewing them. Film critics, in the past, have posited the opposite. They believe that to be truly objective (or as close to it as humanly possible) one must go into a movie experience without excitement or preconceived expectations of the film. However, with the rise of Internet culture and web-based criticism, this has changed. The role of the professional critic is in a state of flux. If the author is dead, then so, most certainly, is the analyst. And critics know this. In my opinion, to keep their jobs alive, they must begin to ask where their intentions lie.

In the mind of some critics, their role is to inject some ‘culture’ into the cold lives of the public, who are ever so content with cut-and-paste chick-flicks and Michael Bay movies. It’s the critic taking on the Hollywood system in a poetic battle between them and the world. The critical world is correct, and the general public is foolish and wrong.

Sure, reviewers love to say ‘There’s no such thing as objectivity. We purely express our opinion’, but in their writing, and by extension the general air around them, this belief is not well translated. Critics, in the past, have attacked and turned against their own when their opinions differ. Take the recent example of Armond White. A contrarian at heart, Armond White has increased his reader base by generally opposing the opinions of the general populace of critics in his writing. But instead of taking on board Armond White’s opinions and participating in intelligent discussions with him, critics have generally discredited him, questioning how he earned his qualifications. However (and of course with the rare exception), Armond White seems to agree much more often with the general public.

So then whom should critics aim for? Should they talk ‘down’ to the Average Man, telling them that what they enjoy is wrong and that they should invest in higher-brow entertainment, or should they write without the notions of ‘good’ and ‘bad’ cinema, and with the knowledge that there are different strokes for different folks? I believe critics should stop relying on the five-point scale to determine whether a movie is good or bad; worth seeing, DVD-ing or leaving. Rather, they should write about whom a certain movie would and wouldn’t appeal to. Perhaps they should start catering for the infinite number of possible opinions out there, rather than the niche market of people who already agree with everything they say, purely because they’re professionals.


Bioshock and Call of Duty 4: Modern Warfare

Over the past two decades, the first-person shooter (or FPS) has become a staple genre of the video game industry. With its increased popularity, many different variations on the cross-haired ‘point and shoot’ have come about. While some FPSs strive for realism, others delve into the fantastic. Whichever route a game designer decides to take, many elements of any FPS are shared. Yet it is the differences from the norm that allow a game to stand out from the pile of copy-paste games. Two recent examples of the genre are Bioshock, developed by Irrational Games and released in 2007, and Call of Duty 4: Modern Warfare, developed by Infinity Ward and released in 2007. These games share quite a few similarities, but as we will see, they retain their spectacle through their variations on the genre. The two games fall on opposite sides of the coin that is the first-person shooter. Where Call of Duty attempts to be as realistic a war simulator as possible, Bioshock very cleverly merges reality and fantasy. With this in mind, the gameplay mechanics of the two games are, quite fittingly, very different from one another. So too are their story elements. And yet, although Call of Duty 4: Modern Warfare and Bioshock are quintessentially first-person shooters, we can see that they have both been influenced by other genres. To understand both of these games, and how they differ from each other, it is first important to understand the conventions of first-person shooters.

Two things make a game, by definition, a first-person shooter. In both Bioshock and Call of Duty 4, the player takes control of a single protagonist, and the game is viewed, quite literally, through their eyes. The player perceives exactly what their character perceives. Both games are also shooting games, and thus have multiple guns and other weapons available to players. Combining these two prerequisites for the genre, many FPS games share a similar framing aesthetic. In the bottom right-hand corner of the main gameplay screen, the player sees either the character’s hand or the gun they are holding. This, along with a gun reticle often placed in the centre of the screen, creates a sense of place and establishes the point of view of the player, allowing them to know exactly where to shoot.

Gameplay-wise, these products are very different. The stakes created in each game differ. In Call of Duty 4: Modern Warfare, if your character dies, the game is over and you must restart from your last save point. In Bioshock, however, when you die you are transported to a ‘Vita-Chamber’, and all damage dealt to enemies and ammo spent stays constant. A similarity between the two games is how quickly they throw you back into the action after you’ve died. As mentioned, Bioshock doesn’t even pause between the player dying and being transported to the nearest Vita-Chamber to start over. While Call of Duty 4 does pause, it is only for a moment, and then you are unceremoniously plunged back into gameplay. Both of these methods help keep the sense of realism from being broken, but there are two additional reasons why they have been included. Firstly, they encourage the player to keep playing. If retrying a failed level is as quick and easy as waiting five seconds, then fewer people will become frustrated with the game and stop playing. Secondly, if the game is fairly forgiving about failing a mission, it encourages the player to play at a difficulty that challenges them, and to be creative in figuring out the best way to clear a level. Many players, however, have found the simplicity and ease of dying and quickly returning to the game problematic. Their argument is that if dying in-game is just a small hitch, then there is no suspense, or even stress, while playing. The player is not necessarily encouraged to survive, or to play the game through without dying, and thus a true level of realism cannot be reached in either game. More advanced players bemoan the Vita-Chamber in Bioshock because they believe less experienced players use the feature as a crutch, making it difficult to differentiate between above-average players and those who persisted in using Vita-Chambers to progress through the game regardless of skill level.

Both games offer differing degrees of difficulty, which you can select at the start of the game, but they determine what difficulty level a player should play at differently. Bioshock uses a simple menu displaying Easy, Medium and Hard. This can be changed in-game if the player is finding one difficulty too easy or too challenging, though if you are playing on Xbox 360, changing the difficulty level may negate achievements you could otherwise earn. Call of Duty 4, however, has implemented a more dynamic system for recommending a difficulty level. At the beginning of the game, the player’s character is taken through a pseudo-tutorial in the form of a soldier’s boot camp, after which the player is required to complete a short drill. Depending on how fast the player completes this military exercise, the game will recommend the difficulty level it believes suits the player. Of course, the player can choose to disregard this recommendation, but the developers obviously believe that their recommendation should be heeded for the best gameplay experience. If the player decides that their chosen difficulty is too hard or too easy, they must restart the game and play from the beginning.
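Stripped of its boot-camp dressing, Call of Duty 4’s recommendation system amounts to bucketing the player’s drill time into a difficulty tier. Here is a minimal sketch in Python of how such a system could work; the thresholds are made up for illustration (Infinity Ward’s actual formula isn’t public), and the tier names are the game’s difficulty settings as I remember them.

```python
def recommend_difficulty(drill_seconds):
    """Map a boot-camp drill completion time (in seconds) to a
    suggested difficulty tier. Faster times earn harder settings.
    Thresholds are invented for this sketch."""
    if drill_seconds < 20:
        return "Veteran"
    elif drill_seconds < 30:
        return "Hardened"
    elif drill_seconds < 45:
        return "Regular"
    return "Recruit"
```

The appeal of this design is that a single timed exercise stands in for asking the player to self-assess; the player remains free to override the suggestion, just as in the game.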

Damage is also displayed differently in each game. Both games have a HUD (heads-up display), but with different features. In Bioshock, the player has a health bar which, when empty, results in the death of the player. In Call of Duty 4, the only sign of damage you get is a quick jolt of red framing the screen for a second, and a slowing down of time. Healing, thus, can also be contrasted between the games. Without a health bar, you heal in Call of Duty 4 by not getting hurt for a short amount of time. Regenerative health is a popular element in video games of the past few years, but Call of Duty 4 takes this fad to its extreme in that you can’t see the bar on screen at all. While regenerative health decreases the stakes, as the player can always find cover and heal, the fact that the player doesn’t know exactly when they are about to die increases anticipation, tension and suspense. In Bioshock, by contrast, you must pick up health packs and use them appropriately. This sets up another layer of gameplay, as there is a limited number of health packs in each level. Also limited in Bioshock is ammunition for your weapons. Unlike in Call of Duty 4, where for all intents and purposes you never have to worry about running out of bullets, in Bioshock you are constantly running out of ammo.
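The two healing models described above can be sketched in a few lines of Python. This is purely my own illustration, not either game’s actual code; the class names, hit-point numbers and tick-based timing are all invented for the sketch.

```python
class RegeneratingHealth:
    """Call of Duty 4 style: health quietly refills after a short
    spell without taking damage (the bar itself is never shown)."""

    def __init__(self, max_hp=100, regen_delay=3, regen_rate=20):
        self.max_hp = max_hp
        self.hp = max_hp
        self.regen_delay = regen_delay   # ticks without damage before regen kicks in
        self.regen_rate = regen_rate     # hp restored per tick once safe
        self.ticks_since_hit = 0

    def take_damage(self, amount):
        self.hp = max(0, self.hp - amount)
        self.ticks_since_hit = 0         # any hit resets the regen timer

    def tick(self):
        self.ticks_since_hit += 1
        if self.ticks_since_hit >= self.regen_delay:
            self.hp = min(self.max_hp, self.hp + self.regen_rate)


class PackBasedHealth:
    """Bioshock style: health only returns when one of a finite
    supply of health packs is spent."""

    def __init__(self, max_hp=100, packs=3, pack_heal=50):
        self.max_hp = max_hp
        self.hp = max_hp
        self.packs = packs
        self.pack_heal = pack_heal

    def take_damage(self, amount):
        self.hp = max(0, self.hp - amount)

    def use_pack(self):
        if self.packs > 0 and self.hp < self.max_hp:
            self.packs -= 1
            self.hp = min(self.max_hp, self.hp + self.pack_heal)
```

The contrast in stakes falls out of the data: the regenerating model needs only time to recover, while the pack-based model consumes a scarce resource, which is exactly the extra layer of gameplay the essay describes.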

The main way Bioshock differs from Call of Duty 4, however, is its use of supernatural elements. The main character in Bioshock essentially has superpowers, which allows for an interesting combat dynamic. Rather than relying solely on spending bullets on enemies, the player can create combination attacks, or combos, to find creative ways to defeat an enemy, such as stunning a foe with an electric attack and then beating them with a crowbar. Call of Duty 4 is much more conservative and realistic in that manner. Rather than using combos and variable supernatural attacks, the game relies on the tried and true formula of cover-based combat, where the player is expected to run and gun down enemies, find cover to hide behind, and then run and gun down more enemies. Without supernatural features, Call of Duty 4 instead creates variety by differentiating its weapons to a large degree.

On a more stylistic level, both games use visuals and audio to create a virtual atmosphere. Call of Duty 4 places the player in the middle of a war; dully coloured assets create a sense of grittiness and realism, making the player feel lost at the centre of a battle much bigger than themselves. Bioshock likewise builds atmosphere with textures and sounds, though its use of colour is different. Rather than muted tones, Bioshock uses bright, primary colours and plays with lighting to evoke emotion in the audience.

Both games use sound to differing extents and for different purposes. Call of Duty 4 uses both diegetic and non-diegetic sound to create atmosphere. Orchestral music plays through the levels, lending the game a sense of sweeping grandeur, and this score is juxtaposed with the constant diegetic sounds of gunshots, explosions and screaming. This lets the player feel the tension of war, between heroism and honour on one side and horror and sorrow on the other. Bioshock, set in an underwater city named Rapture, is able to utilise diegetic sound in ways impossible in Call of Duty 4. One example is the circus music played at vending machines scattered through the game. Sweet music plays through the PA system in certain levels, which, contrasted with the often disturbing things happening on screen, emphasises the horrors of Rapture and establishes it as a utopia gone wrong. Bioshock also uses sound in the form of tape recordings found in the levels to further the story.

Both games have a story attached to them, but to differing extents. We see Bioshock through the eyes of a protagonist named Jack, and back-story is revealed to the player through exploration of the game's levels. Through each literal 'level' of Rapture, we are exposed to the history of the place: what started as a utopia for the world's most gifted people quickly turned into a dystopia when Plasmids (chemicals that give people superpowers) were invented. Through the tape recordings found in the game, as well as such in-game techniques as graffiti written on the walls, the player learns that these Plasmids gave the people of Rapture god complexes and led to the demise of their society. In Call of Duty 4 the story takes more of a backseat to the gameplay, feeling like a late addition. The player fills the shoes of a new recruit nicknamed Soap, who joins the British S.A.S. and must fight battles against an array of terrorist groups. Whereas in Bioshock the story feels instrumental to the success of the game, Call of Duty 4 leans much more on an 'arcade' style of gameplay, where you can pick up and play any level regardless of what you know about the character's back-story and motivation. Bioshock also includes the gamer in an interesting way, allowing them to affect the direction of the game through certain moral choices, whereas Call of Duty 4 is mostly linear, with the gamer feeling swept along by a world over which they have no control.

While excellent examples of the first-person shooter genre, both games incorporate elements of other genres, including RPGs, puzzle games and platformers, to increase their complexity. Bioshock, for example, has an upgrade feature allowing players to improve both their weapons and their Plasmids (superpowers). Bioshock also lets players hack machinery: doing so triggers a classic puzzle game in which the player must connect tubes leading from the source of a fluid to its destination. If the player does not succeed in the time allocated, the machine is not hacked properly and either explodes or sounds an alarm. This raises another element featured in Bioshock that is not typical of the FPS genre: stealth, in which players must sneak around searchlights and other security measures, a feature more common in platformers and third-person shooters. Call of Duty 4 also includes RPG elements, such as experience points and ranking. Players gain experience by completing levels, and as their rank rises, so too do their options for diversifying their character class. Alongside its RPG elements, Call of Duty 4 has touches of the platformer built in. One example is an early level where players find themselves on a sinking ship and must run through the level, avoiding obstacles while following an increasingly difficult path to safety.

As can be seen, even though these games share many staples of the FPS genre, they strive for different goals and for two different audiences. On a gameplay level, as well as on a story level, they occupy opposite ends of the first-person shooter spectrum: where Call of Duty 4: Modern Warfare aims for as much realism as possible, Bioshock is more concerned with the fantastic. And yet, even with these differences, both games care deeply about the overall experience. Both use elements such as speedy recovery, diegetic sound, varying difficulty levels and features borrowed from other genres to give the player an immersive experience, unbroken by the game calling attention to itself as a game.


Choice and the Core of Characterization

Convincingly representing realistic characters on screen is an accomplishment both highly regarded and sought after by filmmakers. There are three ways a creator may approach this: through exposition, dialogue or action. It is widely accepted that explicit and substantial use of exposition is discouraged; this is where the classic mantra 'show, don't tell' rears its clichéd, yet accurate, head. Relying on dialogue is equally flawed. Characters, like humans, are complex, and it is unwise to represent them as one-dimensional beings who exist only to service and continue the narrative. As such, characters must not always say how they feel. Like humans, they have hidden agendas, lie to others, lie to themselves, and should never be overly articulate in describing their emotions. Relying on a character's dialogue to understand them is therefore not good enough; speech and truth do not necessarily mesh. If neither exposition nor dialogue can stand on its own to convey realistic characters, only one option remains. What better way to accurately communicate the nuances of a character's personality and growth than to show the way he or she interacts with the settings and events that surround them?

There is a strong relationship between truth and choice. If we accept that characters will not make decisions against their own beliefs regarding the events that surround them, then a close examination of a character's choices can lead us to their core. This explains why choice, in both fiction and non-fiction, is a theme that has recurred through the ages of literature. Looking at texts through this lens reveals insight into the concept of choice itself.

Many texts focus on the trauma ingrained in making difficult choices. Complex decisions create drama and suspense, and reveal the true nature of character. This trauma is two-pronged: it is both personal and communal. It is personal because, while everyone may have an opinion with which they try to sway a decision maker, very often a single person must decide alone. In many ways this separates the decision maker from those affected by the decision, and tension is created when the people a choice affects disagree with the person making it. This is a common theme in prose, where the main character must make decisions for what they believe is the greater good. The trauma then spreads to others around them, because decisions made by an individual, especially one in power, affect a multitude of people. Caution must therefore be taken in making decisions because, as many texts tell us, a bad choice is very often impossible to fully correct. Directly opposing the importance of choice, however, is another common theme in writing: the debate between free will and fate. Some texts flirt with the idea that, no matter what decisions you think you are making, you ultimately cannot escape your own destiny. Ultimately, though, choices matter because, even if they do not change the outcome of a plot, they can reveal insight into, and develop, character.

Making choices is difficult. Because of this, we often delegate decision making to others, which leads to societies with people in charge of other people. And while positions of power, like king or leader, seem to have their advantages, the responsibility that comes with making decisions for a large number of people is as daunting as it is imperative. One reason is that, while nobody wants the responsibility of making the decisions, everybody wants their opinions to be heard. In no text does this ring truer than in Kenneth Branagh's adaptation of William Shakespeare's play Henry V (1989). Branagh's Henry is constantly assailed by advisors and friends offering opinions on what they feel are good choices for the country. Yet when Henry finally decides, it is on Henry's head that the consequences lie. For example, in the many advisory meetings between the King and his council, the council members plead with Henry to go to war with France. Yet when Henry does make this decision and deploys his troops, Williams, one of his soldiers, exclaims:

 ‘…I am afeard there are few die
well that die in a battle; for how can they
charitably dispose of any thing, when blood is their
argument? Now, if these men do not die well, it
will be a black matter for the king that led them to
it; whom to disobey were against all proportion of subjection.'

Here, Williams places the blame for any unwanted consequences of the decision onto Henry and Henry alone. Branagh's camera lingers on this soldier and uses the same angle to capture him as it does the King, implying that the man's opinion is of equal importance to the King's. His status as decision maker separates Henry from the common man, and the film makes a point of showing this through depth in the shot containing both the King and his soldiers. Henry, clad in an identity-hiding cloak, is placed at the back of the frame, looking towards the group of soldiers in the foreground, clearly separated from the rest of the group. This motif is repeated a few scenes later, where Henry sits upright in the midst of sleeping men, unable to share in the restfulness of those who slumber but inches from him. Henry's separation from the subjects he must make decisions for creates a paradox: finally having the power to make choices for his people is precisely what leaves him ill-equipped to make them well.

Yet the film sets this aside and paints Henry in a capable light. In the first of the scenes described above, the light on Henry comes from the moon, drenching him in hues of blue, while the light on the soldiers comes from a campfire, illuminating them in red. This contrasts the cool head of the King with the fiery tempers of the everyday soldier. In the second scene, Henry is the only conscious figure among many unconscious ones, showing him to be both physically awake and mentally focused. Similarly, the costuming presents Henry as dignified, his garb consisting of chainmail and royal emblems. Again Henry is seen amongst the soldiers, but while they are on foot, he sits on the back of a white horse, both looking down on the multitudes and effectively raising their spirits, implying that Henry is not only in a position of leadership, but rightfully so. It is here that a tension arises between choosing what will make those a decision affects happy, and doing what is believed to be right. Henry faces this complex decision multiple times in the film. First, and most importantly to the plot, he decides to go to war with France, knowing full well that many of his English subjects will die. Henry must, as he does at the beginning of the film, weigh the positives and negatives of invading France, and he chooses to invade not out of anger (as he may appear to in the original play) but out of logic. He faces the dilemma again when he comes across an old friend who has been caught stealing from a church. Rather than doing what is easy, and what would probably please more people, Henry allows the man to be hanged.
The tears Branagh sheds in the scene show how visibly this upsets the King, and a cut to a flashback of the thief and the King laughing together as younger men reinforces both the difficulty of the decision and how much it alienates the King from his own humanity. But this separation is almost a defence mechanism, as it somewhat softens the blow of sending his friend to the gallows. It allows Henry to decide on the basis of justice rather than emotion, making him, arguably, an adept decision maker.

Dealing with similar issues of decision making is the character Sal in Danny Boyle's film The Beach (2000). As founder of a small utopian society on a beach in Thailand, she has been delegated the role of 'leader' and thus of decision maker. Whenever a member of the society causes trauma to themselves or any other member, and especially when they jeopardise the wellbeing of the community at large, it is Sal whom everyone relies on to solve the problem. However, just like King Henry V, Sal finds that she must make decisions others find unethical. Rather than revealing their secret beach to the public by seeking help for a member of the society bitten by a shark, she decides to let that person die to save the community.

Also like Henry, Sal is separated from those she has come to represent. She is alienated from the utopia she herself worked to create, purely because she will do anything to protect it. This, unlike with Henry V, weighs heavily on Sal. Through the film she becomes so disconnected from other humans that she begins to lose her basic human and cultural ideals. One example is when Sal takes advantage of Richard, sleeping with him and cheating on her boyfriend. She does this, presumably, to feel connected to someone, something the film makes a point of showing she has not felt in some time. And yet the audience sees this scene mostly through a veil of cloth, through which only shadows are visible, conveying that even as Sal and Richard have a sexual encounter, they are not connected, and Sal remains separate. Either Sal has been dehumanised by the choices she has had to make, or she has purposefully made herself this way, knowing that when the society turns on her, she will have the strength to make the correct decisions for the greater good. As Barsam and Monahan tell us in their book Looking at Movies, low-angle shots 'tap into our instinctive association of figures who we must literally "look up to" with figurative or literal power' (p. 7). Until the last scenes with Sal, Boyle almost always shoots her from a low angle, symbolising her power, superiority and prowess. She is also often shot lying down, and in long flowing dresses, conveying peace, calm and acceptance. This is contrasted with the final shots of Sal, after she has made the decision to kill Richard and save the utopia. Because this is the active act of killing, rather than the passive 'letting die', Sal cracks under the pressure of the choice and never recovers. From then on she is shot from a high angle, making her seem small, scared and insignificant. Sal found that, although she thought she could choose to avoid the outside world, she was merely postponing reality.

In Athol Fugard's novel Tsotsi (1980), the title character experiences the same dilemma. Tsotsi finds that no matter what choices he believed he was making, in the end nature had to run its course. Tsotsi, a serial killer living in the slums of South Africa, has a child thrust upon him by a series of circumstances. When he chooses not to kill the child but to care for it, Tsotsi is confused, so used is he to murder as the only option. And yet both the baby and Tsotsi seem destined to die. It could be said that, from the moment Tsotsi took a life, his own life was forfeit as punishment. Although Tsotsi chooses to care for the child, both characters expire by the novel's conclusion. But while Tsotsi's choices do not change the overall outcome, they do develop him into a more complex character than the cold-blooded killer he first appeared to be. As Greg Lowe says in his review of the novel, 'This [choice] catalyses a shower of fragments of memory from the past which pierce the cold, hermetically sealed darkness in which he resides, sending him into a psychological turmoil.' Tsotsi's choices do not only affect him, however. The novel shifts focus rapidly from Tsotsi to other characters, both to show how Tsotsi affects them and to show us his development from an outside perspective. For example, the novel follows Gumboot for a time, letting us see how Tsotsi's actions affect him, ultimately leading to his death, while also letting the audience begin to see the drastic change that comes over Tsotsi when he chooses to spare the child rather than kill him too.


Ultimately, these texts are just a few of the many that bring to light thematic aspects of choice. Many, if not all, texts rely heavily on their characters' decisions both to further plot and to reveal insight into their core. We can see how choice is both inclusive and exclusive: a character must make choices that affect both themselves and many people around them, yet often they must make these choices alone. This separates the character from those affected by the choice, and may ultimately lead to the character's downfall. While on occasion we may find that the choices made have no strong bearing on the circumstances that further the plot, choice nevertheless remains important for characters, as it shows who they really are, to themselves, to those around them, and to the audience. Unlike exposition or dialogue, then, we can rely on choice as an invitation into the soul of the character we are visiting.




The Beach. Dir. Danny Boyle. Adapted from the novel by Alex Garland. Perf. Leonardo DiCaprio. Twentieth Century Fox, 2000. Film.

Barsam, Richard, and Dave Monahan. Looking at Movies: An Introduction to Film. 3rd ed. New York: Norton, 2010. Print.

Donaldson, Peter S. 'Taking on Shakespeare: Kenneth Branagh's "Henry V"'. Shakespeare Quarterly, Vol. 42, No. 1 (Spring 1991), pp. 60-71. Folger Shakespeare Library.

Fugard, Athol. Tsotsi. New York: Grove Press, 2006 (originally published 1980). Print.

Henry V. Dir. Kenneth Branagh. Adapted from the play by William Shakespeare. Perf. Kenneth Branagh. BBC, 1989. Film.

Lowe, Greg. 'Athol Fugard – Tsotsi.' Rev. of Tsotsi, by Athol Fugard. Spike Magazine. Accessed 19 September 2010.

Meizer, Paul, and Kenneth Branagh. 'Kenneth Branagh: With Utter Clarity. An Interview'. TDR, Vol. 41, No. 2 (Summer 1997), pp. 82-89. The MIT Press.

Novel Analysis: Henry V. Oakwood Publishing Company. Accessed 19 September 2010.



Structuralism and The Goose Girl

French Structuralism, like Russian Formalism, is concerned with the science of literature and literary works. Ferdinand de Saussure, an influential figure in early Structuralist thought, held that, just as science breaks its subjects down into smaller, more definable parts, it is possible to find meaning in a text by breaking it down into smaller parts and analysing them with reference to other texts (Schleifer & Rupp 2005). This breakdown of larger stories into smaller fragments turns what we have come to accept as normal into something we question, something that appears strange, and we begin to see relationships between texts through their common use of these elements. On this basis, Saussure fit individual works into larger structures by distinguishing the system of rules governing a structure, or genre, known as the langue, from the individual representation of a story element, known as the parole (Schleifer & Rupp 2005). Structuralists are interested in how stories fit into the overarching structure of a group of texts created by the langue of a form; they propose that a text can be analysed entirely in terms of how it has been constructed to conform to this series of elements. After breaking down a story and making the familiar strange, relationships between these literary ideas can be seen and analysed. Examples of these concepts can be drawn from the German fairy tale The Goose-Girl. The tale was originally collected by the Brothers Grimm in 1815 and released in a second edition in 1819 (Lang 1965). It is an interesting fairy tale because many of its themes, characters and plot points share similarities with other fairy tales, and as such it fits into a larger structure of literary texts in general.

Looking at how The Goose-Girl fits into the langue of fairy tales, we can see that many stereotypical elements have been carried into this tale that would be out of place in any other literary structure. The inclusion of magical beings, including a fairy and a talking horse, is an example of this conformity to the langue of the structure. These specific elements are shared by many of the Grimms' collected tales and thus do not feel out of place. The text does nothing to break with tradition or to include elements that fail to conform to the langue of the fairy tale.

Building on the method of analysis of the Russian Formalist Vladimir Propp, French Structuralists consider how a literary text conforms to a certain number of plot points, known as functions, or narratemes, which are seemingly present in all folktales (Barry 2002). While not all of them must be fulfilled in one story, the structure of, and movement between, narratemes is vital. The Goose-Girl follows many of Propp's thirty-one proposed functions.

In The Goose-Girl, the Princess fills the role of the hero. She leaves her home, fulfilling function 1. She is then told of the charm held by the lock of hair given to her by her mother, and is inadvertently warned not to lose it: function 2. Not only does the plot continue to fit these conventions, but so do the characters. As mentioned, the hero of this tale is the Princess, who fulfils the role simply by going on a journey. She is helped along by a character known, aptly, as the Helper; in this story it is her horse, Falada. The complication of the story is triggered by the villain. In The Goose-Girl, the villain is the Princess's maid, who, like the Princess, is not given a name. It could also be argued that Curdken is the villain of the story, but the fate of his character does not fit well with what Propp's thirty-one functions say it should be. The Maid could also fill the role of the 'false hero'; the story, then, merges the two characters into one persona, whose ultimate fate is acceptable to the few functions Propp sets up concerning them. As such, the folktale can be placed in the genre because it is structured with the necessary narratemes of the group. These plot elements, however, can only create meaning when pitted against their binary opposites.

Binary opposites are two opposing concepts that are defined by each other (Fogarty 2005). Just as a green traffic light can mean nothing without its opposite red, no plot element can mean anything without its contrary. There is no hero without a villain, because the villain is defined by the hero, and vice versa. When looking for binary opposites in The Goose-Girl, many are instantly apparent. There is of course a hero and a villain, but this is furthered by the opposing ideas of wealth and poverty, and of freedom and slavery. These ideas are complicated by another binary opposition, that of power and weakness: the powerful character is the poor and enslaved maid, whereas the weak character is the hero, the wealthy and free Princess. This inversion of ideas leads to the complication in the story and thus creates meaning in the narrative. Nor are these the only binary opposites present in the text.

The safety of the Princess's home is paralleled with the danger of the unknown as she leaves it. The helper, Falada, opposes the hinderer, and the King opposes the lowly Curdken. These binary opposites define one another. A helper cannot help without a hinderer hindering. A king cannot be powerful without subjects. A place cannot be safe if there is nothing from which to be saved.

As can be seen, The Goose-Girl is part of a larger structure of literature and thus conforms to a set of rules with which all similar texts also comply. The Goose-Girl cannot be analysed on its own, for it cannot exist apart from the structure under which it was created. The only possible way to find meaning in a literary text such as The Goose-Girl, then, is to study how it conforms to the langue of its structure. This sheds new light on the text, as we can view it in terms of what we know about other tales that both influence it and are influenced by it, and so understand its place in the history of literature and of society.


Barry, P 2002, 'Structuralism', Beginning Theory: An Introduction to Literary and Cultural Theory, Manchester University Press, Manchester, pp. 39-60.

Fogarty, S 2005, 'Binary Oppositions', The Literary Encyclopedia, accessed 11 September 2010.

Lang, A (ed.) 1965 (originally published 1889), 'The Goose-Girl', The Blue Fairy Book, Dover, New York.

Schleifer, R & Rupp, G 2005a, 'Structuralism', The Johns Hopkins Guide to Literary Theory and Criticism, 2nd edn, Johns Hopkins University Press.

Schleifer, R & Rupp, G 2005b, 'Saussure, Ferdinand de', The Johns Hopkins Guide to Literary Theory and Criticism, 2nd edn, Johns Hopkins University Press.


The Webisode (An Essay)

Since the beginning of cinema, production budgets have increased exponentially; every year brings a new 'most expensive movie ever', and the same can be said for television. However, the assumption that money poured into production value correlates directly with viewer interest has begun to be challenged, and an alternative view of visual entertainment has been offered. This view recasts the receiver of media from a pure consumer into a content generator and participant, and with it comes a new take on the business of media.

The rise of user-generated content, facilitated by websites such as YouTube and Vimeo, has altered how the consumer, and in turn the producer, sees media. These sites allow anyone, professional or not, to upload and display their homemade videos to the world free of charge. For some this was the chance for their art to be seen: 'The promise that talented but undiscovered YouTubers can make their leap from their "ordinary worlds" to the "bona fide" media world is firmly embedded in YouTube itself' [Burgess & Green, 2009].

Simply by offering a place where the average consumer could try their hand at generating content, the Internet swelled with millions of short, do-it-yourself clips whose aesthetic ranged from the semi-professional to the downright mediocre.

The rising popularity of no-budget productions made with webcams and computer microphones inspired amateurs and professionals alike to explore the world of web-based media. These creators, rather than relying on a big budget, innovated and found creative new ways to tell a story.

One of the earliest amateur web series was 'Red vs. Blue', an Internet phenomenon that took in-game footage from Microsoft's 'Halo' and redubbed it to make a story. Another derivative web series was 'Yu-Gi-Oh: The Abridged Series': its creator, LittleKuriboh, recut footage from a popular Japanese anime series and redubbed the voices to create a parody of the original show.

As these two examples show, the Internet allows series with new takes on how entertainment is delivered to draw in audiences just as large as those of big-budget productions. One reason for this is the capacity for viewer feedback, contribution and imitation: Web 2.0 is built around a community of people communicating across the globe in various ways.

'The meaning of user-generated media extends beyond the media that users create … to incorporate elements such as blogs, posts and ratings that users contribute to media that they access' [Barkat, Hart & Salazar, 2009].

These include posting comments and reviews on individual videos, linking to videos through blogs, and posting video responses. Because of this constant communication, and because of the short production time of a small-scale webisode, the creator of a series can be influenced by their viewers' opinions in near real-time, unlike television productions, which are planned out well in advance and so leave no time for public feedback. This can be seen in Yu-Gi-Oh: The Abridged Series: in many episodes LittleKuriboh responds to things his viewers have asked, as well as to important cultural events of the moment. He releases special question-and-answer segments with his fans, bringing him closer to the people who watch his show. So close, in fact, that many people try to emulate his success with their own abridged series, such as 'Pokemon Abridged' and 'Naruto Abridged'.

However, any prolonged web series needs both advertising and revenue, and while those who create online media may be talented at creation, they are not necessarily talented at the effective distribution of their product. Without a traditional television network, webisode creators must find new ways to market their work. One interesting solution is the Internet network. An example is 'Revision3', which holds the rights to many different web series, such as 'Film Riot', 'Scam School' and 'The Totally Rad Show'.

The Internet-based network is only one solution, though; there are many other ways a webisode can find funding, including through its audience. The audience contributes not only creatively but also financially. The webisode thus cuts out not only the distribution middleman (such as a network) but also the producer. With far fewer middlemen able to bend the art form to what they individually want, the piece stays far truer to what its creator intended, a luxury unknown to writers and creators in other mediums such as TV or film.

In a podcast interview with IGN, Felicia Day, creator of the popular web series 'The Guild', says: 'That's what's cool about the Internet. We definitely have a unique perspective, and we have a passionate fan base … and they'd donate $5 to $10 to $20 to keep us going. You don't really have to ask permission if you have a cool idea, whether it be a … tv show or movie. The playing field is open.' [Day, 2010]

Of course, that doesn’t mean the only people trying their hand at webisodes are amateurs. Inspired by the popularity of user-generated media, many professionals already in the media business began to experiment with the webisode, with differing levels of success. During the writers’ strike of 2007-2008, Joss Whedon (Firefly, Buffy the Vampire Slayer, Angel) began work on a small-budget production known as ‘Dr. Horrible’s Sing-Along Blog’. In a letter to his fans, Whedon announced his wish that the series would ‘show the world there is another way’. He also asked his audience to ‘spread the word. Rock some banners, widgets, diggs…let people know who wouldn’t ordinarily know’ [Whedon, 2008]. In this way he, a professional, included his audience in his creation, changing the viewer into a participant and continuing the shift in media.

Interestingly, some professionals go a step further than Whedon when creating their webisodes: regardless of budget, they adopt an amateur aesthetic for the piece. This is known as proteur production (a portmanteau of professional and amateur). An interesting example is the web series ‘LonelyGirl15’. The style can even be seen spreading into professional film, in works such as ‘The Blair Witch Project’, ‘Cloverfield’ and, more recently, ‘Paranormal Activity’.

Also of note is that webisodes do not always live and die on the web. Some are connected to network TV series, such as ‘Heroes: Webisodes’ and ‘Lost: The Missing Links’, acting as viral marketing and additional content. This is known as convergent media, as it crosses different mediums, in this case the Internet and television.

As we have seen, the webisode began its life as a small alternative to large-scale production but has quickly grown into a noteworthy medium for new and interesting ways to entertain. It breaks down the boundary between consumer and creator, allowing anyone with a camera to be a storyteller, whether professional or amateur. Using the Internet, the webisode has been able to grow, shift and find its place, not as a replacement for traditional media, but as a completely new medium, with new problems but also new opportunities for those involved.


Barak, I., Hart, C. & Salazar, J.F. (2009) Screen Media Arts: An Introduction to Concepts & Practices, Oxford University Press, Australia & New Zealand, pp. 396-397

Burgess, J. & Green, J. (2009) “YouTube and the Mainstream Media” in YouTube (Digital Media and Society Series), Polity Press, Cambridge, pp. 15-37

IGN Staff (2010) “Keeping it Reel: Episode 77”, podcast [online], accessed March 26, 2010

Whedon, J. (2008) “A Letter from Joss Whedon”, accessed March 30, 2010

Charlie Bit My Finger – Again!, 2007, online video, accessed March 31, 2010, <>

Yu-gi-oh: The Abridged Series, Episode 1, 2006, online video, accessed March 31, 2010, <>

Episode 46: Jack Jackerson Saves Lives in a Sketch Shot with DSLR, 2009, online video, accessed March 31, 2010, <>

Paranormal Activity – Trailer 2, 2009, online video, accessed March 31, 2010, <>