Playing Music and Playing Games: Simulation vs. Gameplay in Music-based Games

Fares Kayali, Institute of Design and Assessment of Technology, TU Vienna
Martin Pichlmair, Institute of Design and Assessment of Technology, TU Vienna

Introduction: Simulating a Rock Concert

When hearing the term 'simulation', most players instinctively think of driving, flying, or sports games. In advertisements, this class of games boasts about its realism. Steven Poole highlights a different quality of sports video games when he asserts that the "... modern sports game is no longer a re-creation of an actual sport so much as it is a re-creation of viewing that sport on television." (Poole, 2000). He argues that sports video games primarily simulate the presentational layer of sports. While they simulate many aspects of a sport with dedicated game mechanics, those always trail the audiovisual realism. Analogue, athleticism-, skill- and experience-based bodily movements are mapped to a digitally controllable, accessible representation in a game-world (see Kayali & Purgathofer, 2008). This mapping necessarily results in a reduction of complexity. At the same time, the visual representation successfully emulates the similarly 'flat' and reduced televised reality.

The same holds true for music-based games. There, simulating the setting, style and presentation of music is much more common than authentically modelling a specific instrument in software. In other words: Guitar Hero (Harmonix Music Systems, 2005) simulates a live recording of a rock concert rather than simulating the playing of a guitar.

The Nature of Simulations

Technically, a simulation can be defined as a set of rules. In game studies a concentrated effort has been made to define games as rule-sets (see Salen & Zimmerman, 2004). Simulating reality means that those rules are derived (or 'abstracted') from the real world. Most games are set in fictional worlds.
Jesper Juul remarks that "To play a video game is therefore to interact with real rules while imagining a fictional world, and a video game is a set of rules as well as a fictional world." (Juul, 2005). While these two halves of play influence each other, it can safely be asserted that game rules fall into two groups: those that allow for gratifying gameplay and are thus arbitrarily introduced, and those that maintain the simulation of the fictional world. In order to present a believable world it is not necessary to simulate its principles. Sports games base their physics simulation on statistics and probability as well as on Newton's formulas. Game mechanics can thus be read as abstractions of fictional or real mechanics.

Abstraction in games is understood as the process of mapping reality to a constrained, understandable and thereby playable experience (see also Wolf, 2003). It is a means to mediate between reality (or believable fiction) and gameplay. Many apparent examples of abstraction can be found in control schemata. Complex actions available in the real world are transferred to constrained actions on a game controller. Abstraction is about defining constraints upon player actions to spare the player decisions that are deemed unnecessary in the context of the game. In his paper "A Certain Level of Abstraction", Jesper Juul (2007) writes the following about abstraction: "The level of abstraction is the level on which the player can act: The actions that are available to the player". Gonzalo Frasca outlines a similar metaphor when he describes the "simulatógrafo" (Frasca, 2008), a fictional machine for recording aspects of reality into a simulation.

While playing, the player decodes the simulation and acts according to the rules of the game as well as to the rules of the simulated fiction. The ability to decode metaphors is the reason players can interpret reduced models of reality (see Lakoff & Johnson, 2003, for a detailed account of the importance of metaphors in all aspects of cognition). Metaphors attach meaning to something by referring to a super-ordinate semantic level. Actions as well as objects can embody metaphors (Rusch, 2007). Tetsuya Mizuguchi, the Japanese designer of music-based games, even called Guitar Hero players 'wannabes' (Tetsuya Mizuguchi on Rock Band in an interview with Kohler, 2008). Playing with the guitar peripheral in Guitar Hero can be regarded as pretending to play a guitar in a metaphorical sense rather than playing for real. The player submits herself to the "willing suspension of disbelief" (Coleridge, 1817). To further this state of immersion a fictional world has to provide a consistent setting. Tolkien (1966) describes the importance of believable worlds that, through their own rules and laws, preserve an "inner consistency of reality". The same argument that Coleridge and Tolkien make for literature can be made for drama as well: "It is key to the success of a dramatic representation that all of the materials that are formulated into action are drawn from the circumscribed potential of the particular dramatic world".
(Laurel, 1991). Consistency is more important than accuracy, just as in video games. To support the willing suspension of disbelief, video games have to present believable and consistent worlds and experiences. This inner consistency is important to game worlds not only as narrative structure, but also to establish a playground for interaction. Simulating and consistently mapping reality to a game-world is as valid a way to achieve this goal as is creating new worlds with stable and understandable settings. Believable game-worlds shape a player's expectations towards the game. Unconscious expectations of the game-world, as well as conscious reflection on it, are part of what makes games emotionally engaging (Kayali & Pichlmair, 2008).

Music-based Games

'Music-based games' is a term used to categorise a class of computer games where music is the centre of gameplay. Some music-based games present themselves as musical interfaces, as instruments, to the player. For example, Guitar Hero comes with a plastic guitar, and Donkey Konga (Namco, 2003) with a set of bongos. On screen, music-based games present themselves as worlds driven by music. Our paper "Principles of Interactivity in Music Video Games" (Pichlmair & Kayali, 2007) presented a separation of music-based games into two distinct classes: instrument games and rhythm-action games. The paper further elaborates on the different principles those two classes embody.

Musical Instruments

The Sachs-Hornbostel system (Sachs & Hornbostel, 1914) divides musical instruments into different classes: idiophones, membranophones, chordophones, aerophones and the later-added electrophones. Each class is characterised by its method of sound production (and manipulation). Prior to electronic instruments, the physical form of an instrument determined its musical output as well as the way it could be interacted with. The paradigm shift that came with the introduction of electric, electronic and digital instruments is the separation of form and function, as well as a radical increase in the degrees of freedom an instrument can offer:

"A violin is less restrictive than the piano because it has no fixed keyboard; the violin can play many more notes than a piano. Yet, both piano and violin are more restrictive than a synthesizer, because they each have a distinctive sound, while the synthesizer can produce the sounds of most traditional instruments and many non-traditional ones, like sirens or wind effects. The synthesizer, more than an instrument, is a "sound processor." The synthesizer player has control over an enormous palette of sound sources, in addition to the infinite range of combinations." (Kurtz, 1998)

A violin produces sound by physical means and its form is determined by acoustics and human physiology. An electronic instrument can take an arbitrary physical form and its interface can be designed largely independently of the produced sound. In other words: a musical instrument's possibilities are shaped by two aspects, the interface and the instrument's acoustic capabilities. Both shape the space of possible musical expressions. Both are rendered independent by digital and electronic means.
Following the Sachs-Hornbostel system, music-producing games are electrophonic instruments. Music-based games have all the acoustic possibilities of a synthesiser. Their interfaces are diverse, ranging from gamepads to multi-touch screens and instrument peripherals. The challenge in the design of music-based games is mapping a part of a gaming device's acoustic possibilities to the interface. This mapping is explained to the player by a game's setting and its game-world. Depending on whether the gameplay of a music-based game is goal-oriented or free-form, the game either has to present a reduced and predictable depth of expression or exhibit more freedom. In the latter case, games have to establish a setting - a sandbox - that allows the player to experiment and to set her own goals. Games that offer freedom come closer to instruments than those that present goals. They call for a balance between ease of use and depth of expression (see Levin, 2000). Making instrumental play accessible to non-musicians who do not want to climb the steep learning curve of a real instrument is one of the prime design goals of music-based games. This means music-based games focussed on instrumental play have to implement some of the attributes of real instruments. Most importantly, they need malleability and diversity of musical results, and an increasing degree of mastery that can be achieved by longer, focussed play. Balancing accessibility on the one hand and freedom of expression on the other is a prime challenge in the design of music-based games. From a player's perspective, these attributes shape how the game is perceived at first sight and how long it can keep up interest over time.

Rhythm Action

Rhythm-action games are the second group of music-based games. At the core of rhythm-based games is the interpretation or reproduction of rhythmical patterns. By acting in rhythm, the player advances in the game and in the song. Many rhythm-based games have licensed soundtracks. Replaying known songs enhances the player's identification with the game and makes the gameplay more predictable. Rhythm-action games are goal-oriented: hit or miss the beat - progress or stagnate. In that sense rhythm-based games are very traditional. They pose clearly formulated challenges to the player and reward her with points and progression. This strong goal-orientation is what separates rhythm-based games from instrument games. The latter are less confined by rules and usually have no game-over conditions. Improvisation is rarely found in rhythm-based games.

Rhythm-based games can be clustered into several distinct groups depending on the gameplay they offer to the player:

- Linear rhythm-action: Music in linear rhythm action not only conveys movement but literally takes up space in the game-world. This first category of rhythm-based games is also the biggest. It includes games like FreQuency (Harmonix Music Systems, 2001), Guitar Hero, Vib Ribbon (NanaOn-Sha, 1999) and Rock Band (Harmonix Music Systems, 2007). Linear rhythm-based games are focused on replaying commercial music. Songs are mapped to the game-space in the form of progressing rhythmical patterns that must be matched by correct sequences of button presses (or their equivalents, like touches, strums, shakes or waggles).

- Mimicking: The exact reproduction of a given pattern advances the player to the next, longer and more complex pattern. The first rhythm-based video game, Parappa the Rapper (NanaOn-Sha, 1996), is a game of this sort: it presents short sequences of rhythmical button presses to the player, who has to reproduce them as exactly as possible to advance. Mimicking can be regarded as a rhythmic version of Quick Time Events. This mechanic is also used in other rhythm-based games like Space Channel 5 (United Game Artists, 1999) and many of the mini-games in Rhythm Tengoku (Nintendo R&D1, 2006), as well as in music game sequences in games of other genres, like the Zelda series (Nintendo, 1986-2007) and Loom (Lucasfilm Games, 1990).

- Non-linear rhythm-action: The use of rhythmically executed combos at arbitrary times is the core mechanic of non-linear rhythm-action games. The player decides which pattern she wants to use at which moment in time. Patapon (Pyramid, 2007) uses rhythmical combos to control a small army that progresses through a hostile 2D environment. Actions like movement, retreat and offensive patterns are all triggered by rhythmical sequences of four successive button presses. In Electroplankton (Indies Zero, 2005), a collection of musical toys for the Nintendo DS by Toshio Iwai, the mini-game Nanocarp puts the player in control of a formation of plankton that she can direct by clapping her hands in a rhythmical pattern.
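The non-linear rhythm-action mechanic described above can be sketched in a few lines of Python. This is a minimal illustration, not code from any actual game; the combo names, the four-press patterns and the timing tolerance are invented for the example:

```python
# Hypothetical sketch of a Patapon-style combo system: a combo fires only
# if the right four-button pattern is entered, and every press lands close
# enough to a beat. All names and values here are illustrative.

COMBOS = {
    ("pata", "pata", "pata", "pon"): "advance",
    ("pon", "pon", "pata", "pon"): "attack",
}

def match_combo(presses, beat_times, tolerance=0.15):
    """Return the action for a four-press combo entered in rhythm, else None."""
    if len(presses) != 4 or len(beat_times) != 4:
        return None
    # every press must land within `tolerance` beats of an integer beat
    if any(abs(t - round(t)) > tolerance for t in beat_times):
        return None
    return COMBOS.get(tuple(presses))

# A slightly sloppy but still acceptable performance:
print(match_combo(["pata", "pata", "pata", "pon"], [0.0, 1.05, 2.1, 2.95]))
# -> advance
```

The point of the sketch is the separation of the two checks: pattern recognition decides *which* action fires, while the timing window decides *whether* it fires at all, which is what makes the mechanic rhythm-action rather than a plain command menu.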

The Unity of the Senses

In our paper "Levels of Sound: Principles of Interactivity in Music Video Games" (Pichlmair & Kayali, 2007) we described several features of music-based games. Some of the principles described there focus on the psychology of playing music-based games: synaesthesia, kinaesthesia, and performance.

- Synaesthesia is a neurological condition where "sensory perception of any kind may manifest itself as sensory experience of another" (Jeremy Strick in "Visual Music", Brougher et al., 2005). Children are commonly synaesthetes, yet adults rarely are (van Campen, 2008). Synaesthesia has influenced music as well as the visual arts, most famously through Wassily Kandinsky. These influences can be, and have been, transferred to games. This is best illustrated by the game Rez (United Game Artists, 2001). Through tight synchronisation of acoustic and visual traits and the addition of haptic feedback, simulated synaesthesia connects with gameplay, player immersion and game flow (see Csikszentmihalyi, 1990 and Chen, 2007). Tetsuya Mizuguchi claims to further the reception of music by interconnecting music with visuals, but most importantly by involving the recipient (the player) through adding interactivity to the experience (T. Mizuguchi in an interview with Fahey, 2008).

- Kinaesthesia is the sense of moving a body part. Games that provoke bodily reactions in the player further a kinaesthetic experience. In this way, the rumble feature of Rez increases the perception of physical movement. According to Westecott (2008), the feedback loop between kinaesthetic reception and physical input intensifies the experience of a game and even improves reaction times. Henry Jenkins (2005) notes that games in general have a far greater potential to trigger kinetic energies in players and the audience than other media.
By using peripherals, force feedback and gestural input, music-based games have the potential to provide an experience that not only encompasses the visual and auditory senses but also includes kinetic, bodily perception. Finally, kinaesthetic play also adds an important spatial dimension to games. Like synaesthesia, kinaesthesia highlights the unity of the senses.

- The act of performance is closely linked to kinaesthetic play. Some music-based games lend themselves particularly well to being played during a live performance. Henry Jenkins notes on FreQuency and Rez that "Such games start to blur the line between play and performance, creating a context where even novice musicians can start to jam and advanced players can create complex and original musical compositions." (Jenkins, 2005) Many music-based games cater to this desire. Through bodily engagement and a spectacular way to show off skills, they increase the involvement of the player. These games replace, or complement, the challenges of playing a musical instrument with the challenges of playing a game. In Guitar Hero and in karaoke games the player is actively encouraged to take part in a performance. When playing in front of a real audience, the player transcends the boundary of the game's simulation and becomes a traditional performer. Electroplankton was even used in live performances by its creator Toshio Iwai¹.

Guitar Hero uses the concept of performance on several levels. Firstly, the game is set in a performance environment, featuring the player character as the performer. Secondly, the game allows the player herself to perform using a physical musical instrument. Thirdly, the game communicates gameplay with performance-related metaphors: under-performing is accompanied by increasing boos from the crowd, and Star Power, an enhanced score multiplier, is activated by abruptly raising the guitar. In Guitar Hero: On Tour (Vicarious Visions, 2008), Star Power is activated by shouting into the DS's microphone. Performance is always kinaesthetic; the correlation between bodily movement, visual impressions and music can be read as synaesthetic.

Playing Music and Playing Games

All music-based games use a form of notation that makes music accessible to the player. Rhythm-based games always use an abstract form of notation that the player translates into a series of rhythmical button presses. Music is thereby abstracted to rhythm. The number of notes played is often reduced. Pitch is also abstracted to roughly correspond to the different buttons that are to be pressed in succession. The following graphic illustrates this mapping of notation across several degrees of difficulty, using a song from Guitar Hero:

¹ Toshio Iwai's performance at the Futuresonic 2006 festival is documented at http://10.futuresonic.com/urban_play/instrument/ [Accessed 10/02/08]

Fig.1: A transcription of the Guitar Hero song "Less Talk More Rokk" on different difficulty levels. The last line is the original song (Shultz, 2008).

The sheet of music above illustrates the representational concessions to gameplay that are made in rhythm games. As the player traverses the passive score of the game, the mapping of the notes appears arbitrary. Indeed, there are notes that have a different tone on each difficulty level. One retained feature is that notable changes in pitch (especially those that occur at rhythmically significant instants) are transferred to pitch changes in the reduced notation. The basic rhythm of the song is maintained across all difficulty levels. Depending on the degree of difficulty, the rhythm is reduced to quarter, eighth, or sixteenth notes.

If a user interface element is an interactive, arbitrarily abstract visual representation of sound², we refer to it as a sound agent. Sound agents describe their associated sound through their colour, depicted image, shape, position, size, or any other visual factor. Often, sound agents are partly autonomous but allow for user intervention. The plankton in Electroplankton are examples of sound agents, and so are the musical insects in Sim Tunes. Sound agents often build the surface of active scores. While most rhythm games present a linear notation of music, active scores favour a spatial representation. An active score is a score that can be manipulated by the player and by the rules of the game. In fact, many active-score games are abstract simulations of dynamic systems. The different ponds of living plankton in Electroplankton are a good example of the representation of a simple ecosystem that serves as a playground. In active scores, quantisation is used to let the player generate a continuous harmonic stream of music that avoids the dissonant tones and irregular rhythms her unpredictable behaviour would otherwise create.
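The rhythmic reduction across difficulty levels discussed above can be sketched as a simple filter over note onsets. This is an illustrative reconstruction, not the actual charting method of any game; the note times (in beats) and grid steps are assumptions for the example:

```python
# Hypothetical sketch of thinning a note chart for lower difficulties:
# keep only the notes whose onsets fall on a coarser rhythmic grid.
# Grid steps: 1.0 = quarter notes, 0.5 = eighths, 0.25 = sixteenths.

def reduce_chart(note_times, grid_step):
    """Keep only notes whose onset (in beats) lies on the given grid."""
    eps = 1e-9
    return [t for t in note_times
            if abs(t / grid_step - round(t / grid_step)) < eps]

# A one-bar riff with sixteenth-note onsets (illustrative data):
riff = [0.0, 0.25, 0.5, 1.0, 1.5, 1.75, 2.0, 3.0, 3.5]

easy = reduce_chart(riff, 1.0)     # -> [0.0, 1.0, 2.0, 3.0]
medium = reduce_chart(riff, 0.5)   # eighths: 0.5, 1.5 and 3.5 survive too
hard = reduce_chart(riff, 0.25)    # sixteenths: the full riff survives
```

Note that this sketch only preserves the basic pulse; as the transcription in Fig.1 shows, real charts additionally preserve salient pitch changes, which a pure grid filter cannot capture.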
The musical instrument that comes to mind in the context of quantisation is the music box. This instrument is 'programmed' with punch cards and operated by turning a crank. The punch cards ensure a quantised, and thus rhythmic, output. Music boxes served as an inspiration for Toshio Iwai's musical game Sim Tunes (Debatty, 2006).

² The symbolic use of sound in the form of auditory icons and earcons is described in "Audio Games: New Perspectives on Game Audio" (Friberg & Gärdenfors, 2004)

For music-based games, quantisation means that sounds are mapped to the music's underlying rhythm. Similarly, tones may be mapped to a 'safe' scale like the pentatonic. These constraints limit creative freedom, yet they render music accessible. Quantisation is done by the player in rhythm-action games and by the software in instrument games. Music-based games employ quantisation either implicitly through the environment or explicitly as a gameplay element. Tetsuya Mizuguchi's Rez (United Game Artists, 2001) and Toshio Iwai's Otocky (Sedic, 1987) quantise all gameplay events to a single rhythm.

The Interface

Rhythm-based games have spawned a huge number of instrument peripherals. In a way all peripherals serve the same purpose: they replace button presses on a game controller with a device used for rhythmical input. The devices themselves are diverse. They all use the original purpose of the instrument as a metaphor, yet they are greatly abstracted. The maracas in Samba de Amigo (Sonic Team, 1999) need to be shaken to the beat of the music, but the force of the movement is irrelevant as long as a certain threshold is reached. The guitar controller of Guitar Hero abstracts the strings of a guitar to five buttons and strumming to a rocker switch.

Gestural interfaces in music-based games can be based on real dynamic systems and their rules. The results of the interaction with the system are then transferred to musical parameters according to the rules of the game. Musical output is only controlled indirectly, mediated through the interaction with a game's environment. For example, the plankton in Electroplankton mediate between player gestures and musical output. In Hanenbow the player controls the angle of the leaves of a plant and the emission of plankton that bounce between these leaves, triggering instrumental samples. Alternatively, gestural interfaces can be abstracted from the gestures used to interact with musical instruments.
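The software-side quantisation described at the start of this section, snapping onsets to the underlying rhythm and tones to a 'safe' scale, is what turns such indirect gestural input into harmonic output. It can be sketched as follows; the grid step, the scale and the MIDI-style pitch numbers are illustrative assumptions, not drawn from any of the games discussed:

```python
# A minimal sketch of musical quantisation: snap a note's onset to the
# rhythmic grid and its pitch to the nearest degree of a 'safe' scale
# (here C major pentatonic). Values are illustrative, not from any game.

PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets above C

def quantise_time(onset, grid_step=0.25):
    """Snap an onset (in beats) to the nearest grid position."""
    return round(onset / grid_step) * grid_step

def quantise_pitch(midi_note):
    """Snap a MIDI note to the nearest pentatonic degree (octave wrap ignored)."""
    octave, semitone = divmod(midi_note, 12)
    nearest = min(PENTATONIC, key=lambda s: abs(s - semitone))
    return octave * 12 + nearest

# A sloppy player gesture, slightly off the beat and off the scale:
print(quantise_time(1.13))   # -> 1.25
print(quantise_pitch(61))    # C#4 (61) snaps to C4 -> 60
```

Whatever the player does, the output lands on the grid and in the scale, which is exactly the trade the text describes: less creative freedom in exchange for guaranteed consonance.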
The interaction in Electroplankton's Marine-Snow (see the picture below) is inspired by stringed instruments, which allow sweeping across several strings as well as picking each one individually. In a similar way, spinning the Lumiloop plankton to trigger continuous sounds is reminiscent of singing bowls or of rubbing wine glasses to produce resonant sounds.

Fig.2: The interaction with the Marine-Snow and Lumiloop plankton as detailed in the manual of Electroplankton

In these examples, the plankton act as sound agents that bridge active scores with instruments via their associated gestures. This way, they establish a physical, kinaesthetic link between musical interaction and the game-world.

Both Kinds of Games

Building on Dutch historian Johan Huizinga's (1938) study of play, Roger Caillois described two different attitudes towards playing: paidia and ludus. Whereas paidia is expressive, free-form play, ludus is rigid and goal-oriented. The two kinds of games presented in this paper - musical instrument games and rhythm-action games - focus their gameplay on these two modes. Yet the differentiation between playing music (as in paidia) and playing games (as in ludus) is non-exclusive. FreQuency features free-form passages as a reward for success in the highly structured rhythm game sections. The following table gives an overview of the two prevalent modes of playing this paper is about: playing music and playing games.

playing music | playing games
primary form of play is paidia | primary form of play is ludus
freedom of musical expression | mostly constrained to linear reproduction
play is unstructured and open-ended | play is structured and goal-oriented
ostensible in instrument games and non-games / toys | ostensible in sonification games & rhythm-based games
appeals to a niche demographic | appeals to a mainstream demographic
player can act as a real performer | player acts as a "wannabe" performer
music is simulation-oriented | music is gameplay-oriented
interface is gameplay-oriented | interface is simulation-oriented

Fig.3: A table detailing the differentiation between playing music and playing games.

The claim that playing games appeals more to a mainstream demographic than playing music is supported by the much larger number of rhythm-based games sold than other music-based games. Ten of the 100 top-selling video games³ of 2007 are music-based games. Nine of them are rhythm-based games and the remaining one is a version of SingStar (Sony London Studio, 2004-2008).

The Game Designer's Perspective

The above table shows a crosswise relation between playing music and playing games regarding gameplay and simulation. Playing music means interacting with a playful interface that simulates a fictional musical instrument. Playing a goal-oriented music-based game mostly means using a simulated interface to access gameplay-oriented music. These different attitudes towards simulation are the game designer's choice. She can freely choose where to put her emphasis within the boundaries of the game setting. Pure simulations are devoid of designed gameplay, yet games demand concessions to make them accessible and enjoyable. In the graphic below, the simulation space represents the traits of the simulated, and the design space contains attributes shaped by the intuition, good practice and culture of traditional game design. The design space is the grammar of the medium called video games, whereas the simulation space is informed by the real world. Game-worlds, rules, and mechanics are created by assembling elements - actions, objects, conceptual relationships - from both spaces. A designer fades between those two spaces as with a crossfader (see figure 4).

³ http://www.edge-online.com/features/the-top-100-selling-games-last-12-months?page=0%2C0 [Accessed 10/02/08]

[Figure: the 'Simulation Space' and the 'Designed Space' feed through a crossfader into the 'Game Space']

Fig.4: A visualisation of the balance between decisions driven by design and simulation.

In some music-based games the simulation is scaled back to support spectacle. Games gloss over scruffy aspects of reality while emphasising spectacular moments. The prime example is the reduction of sex, drugs and rock 'n' roll to plain rock in Guitar Hero. While the former two are omitted, the game highlights difficult guitar solos that are frenetically cheered or relentlessly booed by a virtual crowd while the player spectacularly raises the guitar to activate Star Power. In games with intense sonification like Rez, Wipeout HD's (SCE Studio Liverpool, 2008) zone mode and FreQuency, player immersion even reaches a level where it simulates, or even substitutes for, drug-induced or meditative mental states.

In other games the simulation is scaled back to support gameplay. Examples are frequency-scale quantisation (no disharmonious notes can be played), time-scale quantisation (no arrhythmic notes can be played) and abstract instrument peripherals. From this perspective, operating the crossfader between simulation and artificiality is the prime role of the game designer.

Summary

By analysing music-based games we found two different kinds of playing: playing music and playing games. Most rhythm-action games have a strict game structure, while musical instrument games offer free-form play. Rhythm-based games present strong simulations of the visual and haptic traits of performing and playing instruments, yet their gameplay is a strongly abstracted version of handling a real instrument. Games that feature musical instrument play provide more freedom of expression. They achieve this through simulating dynamic systems that provide for lively experiences and emergent gameplay. The act of designing music-based games can be seen as an act of balancing between allowing freedom of expression and constraining the game space to render a game accessible, playable and enjoyable. The best music-based games excel in both categories: they offer a believable simulation of a particular aspect of music while giving the player the freedom to explore her own capabilities.

References:

Brougher, K., Strick, J., Wiseman, A. & Zilczer, J. 2005, Visual Music: Synaesthesia in Art and Music Since 1900, Thames & Hudson, London.
Campen, C.v. 2008, The Hidden Sense: Synesthesia in Art and Science, The MIT Press, Cambridge, Massachusetts.
Chen, J. 2007, 'Flow in Games (and Everything Else)', Communications of the ACM, vol. 50, no. 4, pp. 31-34.
Coleridge, S. 1817, 'Biographia Literaria', in H.J. Jackson (ed.), Samuel Taylor Coleridge, Oxford (1985).
Csikszentmihalyi, M. 1990, Flow: The Psychology of Optimal Experience, Harper, New York.
Debatty, R. 2006, 'Insects and pianos: Toshio Iwai's talk at Futuresonic', we make money not art, [Accessed 10/02/08].
Fahey, R. 2008, 'Q Entertainment's Tetsuya Mizuguchi', eurogamer.net, [Accessed 10/02/08].
Frasca, G. 2008, 'El simulatógrafo: una herramienta para crear juegos serios', paper presented to the Homo Ludens Ludens Symposium, Gijón, Spain.
Friberg, J. & Gärdenfors, D. 2004, 'Audio games: new perspectives on game audio', in Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology, Singapore.
Huizinga, J. 1938, Homo Ludens: A Study of the Play Element in Culture, Routledge, London.
Jenkins, H. 2005, 'Games, the New Lively Art', in J. Raessens & J. Goldstein (eds), Handbook of Computer Game Studies, MIT Press, Massachusetts.
Juul, J. 2005, Half-Real: Video Games between Real Rules and Fictional Worlds, MIT Press, Massachusetts.

Juul, J. 2007, 'A Certain Level of Abstraction', paper presented to the DiGRA 2007 International Conference - Situated Play, The University of Tokyo, Tokyo.
Kayali, F. & Pichlmair, M. 2008, 'Intentions, Expectations and the Player', paper presented to the [player] conference, 2008, Copenhagen.
Kayali, F. & Purgathofer, P. 2008, 'Two Halves of Play - Simulation versus Abstraction and Transformation in Sports Videogames Design', Eludamos. Journal for Computer Game Culture, vol. 2, no. 1.
Kohler, C. 2008, 'Interview: Mizuguchi and Matsuura, Music Gaming Geniuses', Wired, [Accessed 10/02/08].
Kurtz, G. 1998, 'Games and Instruments: Two Ways to Play', Gamasutra, [Accessed 10/02/08].
Lakoff, G. & Johnson, M. 2003, Metaphors We Live By, The University of Chicago Press, Chicago.
Laurel, B. 1991, Computers as Theatre, Addison-Wesley, London.
Levin, G. 2000, 'Painterly Interfaces for Audiovisual Performance', master's thesis, Massachusetts Institute of Technology.
Pichlmair, M. & Kayali, F. 2007, 'Levels of Sound: On the Principles of Interactivity in Music Video Games', paper presented to the Digital Games Research Association 2007 Conference - Situated Play, Tokyo, Japan, 09/24/07-09/28/07.
Poole, S. 2004, Trigger Happy: Videogames and the Entertainment Revolution, Arcade Publishing, New York.
Rusch, D.C. 2007, 'Think Smooth! Challenges, Pleasures and Pitfalls of WarioWare: Smooth Moves', Eludamos, vol. 1, no. 1.
Sachs, C. & Hornbostel, E.M.v. 1914, 'Hornbostel-Sachs-Systematik', Zeitschrift für Ethnologie, vol. 46, no. 4-5, pp. 553-590.
Salen, K. & Zimmerman, E. 2004, Rules of Play, MIT Press, Cambridge, Massachusetts.
Shultz, P. 2008, 'Music theory in music games', in K. Collins (ed.), From Pac-Man to Pop Music: Interactive Audio in Games and New Media, Ashgate Publishing, Burlington, USA.
Tolkien, J.R.R. 1966, 'On Fairy-Stories', in The Tolkien Reader, Ballantine Books, New York.
Westecott, E. 2008, 'Bringing the Body back into Play', paper presented to the [player] conference, 2008, Copenhagen.
Wolf, M. 2003, 'Abstraction: An Untapped Potential', IGDA online, [Accessed 10/02/08].

Ludography:

Harmonix Music Systems 2001, FreQuency, Sony Computer Entertainment (PS2)
Harmonix Music Systems 2005, Guitar Hero, RedOctane (PS2)
Harmonix Music Systems 2007, Rock Band, MTV Games (multi-platform)
Indies Zero 2005, Electroplankton, Nintendo (DS)
Lucasfilm Games 1990, Loom, Lucasfilm Games (PC)
Namco 2003, Donkey Konga, Nintendo (GC)
NanaOn-Sha 1996, Parappa the Rapper, Sony Computer Entertainment (PS)
NanaOn-Sha 1999, Vib Ribbon, Sony Computer Entertainment (PS)
Nintendo 1986-2007, The Legend of Zelda (series), Nintendo (multi-platform)
Nintendo R&D1 2006, Rhythm Tengoku, Nintendo (DS)
Pyramid / Sony Computer Entertainment Japan Studios 2007, Patapon, Sony Computer Entertainment (PSP)
SCE Studio Liverpool 2008, Wipeout HD, Sony Computer Entertainment Europe (PS3)
Sedic 1987, Otocky, ASCII Corporation (NES)

Sonic Team 1999, Samba de Amigo, Sega (DC)
Sony London Studio 2004-2008, SingStar (series), Sony Computer Entertainment (PS2, PS3)
United Game Artists 1999, Space Channel 5, Sega (DC)
United Game Artists 2001, Rez, Sega (DC)
Vicarious Visions 2008, Guitar Hero: On Tour, Activision (DS)

