Monday, March 22, 2010

Our machinima.
Hello everyone!

Though much later than the original deadline, here comes our machinima project. It's sad to admit that we have unfortunately failed to complete the project, even though Ricardo kindly gave us a deadline extension. Despite all the effort we put into this exercise, we could not accomplish everything we had planned. Nevertheless, our failure gave us good experience in machinima production, which we would like to share with you here. First of all, let us explain the idea behind our project. After our last meeting, following your suggestions, we came up with a script inspired by a little story called "Box of Kisses". We could not find a way to attach the whole script to this post, but here is a brief overview of it:
Scene 1:
Morning. A small, dark, narrow room without much furniture. There's a man inside. He looks insane. The darkness outside the window is slowly being replaced by faint morning light. This weak light enters the room, which nevertheless stays very dark.
Scene 2:
Noon. The same room. A bit more light. The man sees, hears and experiences strange things. Life tries to get into the room and reach the man. He resists. He waits.
Scene 3:
Night. It's getting late. The room gets darker. The man opens the box. Light comes out. The room is filled with kind and warm light. The man is happy. His daughter is playing with her favourite toy.

This was our initial idea for the script. It looks very simple; neither the decorations nor the actors are very complicated. Unfortunately, even the simplicity of the script did not help us finish the project in time. The source of all our problems was the Unreal Tournament editor.
One of the reasons we failed to finish our machinima piece was simply our lack of experience, together with the limitations of the Unreal Editor. We had made several small animations using the editor, but we didn't realize how simple they were compared with what we wanted to achieve. At first it seemed quite easy, because we were using whatever features we could find and making animations out of them. But once we started developing a story and then tried to adapt the editor to that story, it wasn't as flexible as we had imagined.
Because of our lack of experience in modelling in the editor, we ended up modelling and remodelling the same things over and over again. At first we couldn't import models from other formats, which would have saved us from modelling things ourselves. Later we found a way by converting them to formats the Unreal Editor supports, but on import we ended up with weird models instead: the editor structured the vertices incorrectly, so we got a strange shape instead of a chair. Therefore, we started modelling everything on our own, which took a lot of time.
Later, we had problems animating the characters. Sometimes they would freeze like statues, and other times they would move on their own rather than how we scripted them. One limitation of the editor was that all the characters had predefined movements and animations, so it was impossible for us to animate them according to the story. For example, we couldn't make a character sit on a chair, so we made him crouch as if he were sitting. We tried to make the character walk slowly, but instead he walked very fast.
Another problem was animating the camera. Maybe we didn't have the skills to make the camera move the way we wanted, or maybe camera movement is simply hard to animate in this editor.
Nevertheless, the most annoying problem was the frequent crashing of the editor. Many times we had to redo the same work because we had not saved our results every ten seconds.
At some point we decided to switch to a newer version of the editor, the Unreal 3 Editor, but it was more complicated than we expected. Although modelling and animating were easier, we couldn't use our characters because we couldn't import them into the new editor. So after wasting a lot of time on the new editor, we decided to move back to the old one.

Though we have not yet finished capturing, here are some shots from scene 1:



We are very sad to upload such raw material, but we wanted to share our results with you. As we discussed in class, even a failure is a result we can all learn from. This also does not mean that we have given up on finishing the movie, which we really like. We will try to finish it even if it won't look the way we expected.

Back to filming,
Yarik & Vahe.

Monday, March 15, 2010

The First Interactive Movies in the Theatre

Hey guys, we talked about interactive movies before with Ricardo, so you might want to check this out: new software by PowerFlasher made this possible.

Machinima

The video tells the story of the main character, who thinks her day is a normal one but then finds that a war has broken out everywhere, destroying and killing everything in the town. The video is mixed with pictures and video from real life.
The video was made using iClone4, Windows Movie Maker, CrazyTalk, and Fraps.


the MMIRROR

The MMIRROR is a recursive acronym for MMIRROR Mixed Reality of Horror, a prototype of an e-shopping virtual mirror for Halloween masks. The application is built in Adobe Flash and uses your computer's webcam to augment the mask onto the real world, so that customers can try on and view masks before purchasing them, without any hesitation. It is only a prototype demonstrating augmented reality with ActionScript, Adobe Flex, PaperVision, and the FLARToolkit library. Thus, further implementation on a website is also possible.

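The prototype itself is written in ActionScript with the FLARToolkit, so we can't paste the Flash code here, but the basic loop it performs is easy to sketch. Below is a minimal, purely illustrative Python/OpenCV stand-in (not the actual MMIRROR code): grab webcam frames, detect a printed marker, and warp a mask image onto it. The file name mask.png and the ArUco marker type are assumptions made only for this sketch.

    import cv2
    import numpy as np

    # Hypothetical 2D mask image to overlay; the real prototype augments a Noh/Halloween mask.
    mask_img = cv2.imread("mask.png")
    h, w = mask_img.shape[:2]
    mask_corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]])

    # FLARToolkit uses its own printed patterns; an ArUco marker is a stand-in here.
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

    cap = cv2.VideoCapture(0)                      # the webcam, as in the Flash version
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Functional API of OpenCV < 4.7; newer versions use cv2.aruco.ArucoDetector.
        corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict)
        if ids is not None:
            # Warp the mask image onto the four corners of the detected marker.
            H, _ = cv2.findHomography(mask_corners, corners[0][0])
            warped = cv2.warpPerspective(mask_img, H, (frame.shape[1], frame.shape[0]))
            visible = warped.sum(axis=2) > 0       # pixels covered by the warped mask
            frame[visible] = warped[visible]
        cv2.imshow("virtual mirror", frame)
        if cv2.waitKey(1) & 0xFF == 27:            # Esc to quit
            break
    cap.release()
    cv2.destroyAllWindows()

The real application goes further, of course: it tracks the marker in 3D and renders the mask with PaperVision, but a detect-then-overlay loop like this is the core of any such virtual mirror.
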
You can also watch the "video demo noh.swf" file to see how the application works. First, print the marker from the PDF file and cut it out, leaving some white border, and don't cover the marker with your finger in front of the camera, otherwise it won't be recognized. Then download the latest Flash Player from the Adobe website, play one of the .swf files, hold your marker up in front of the screen, and have fun.

print the marker first:
http://www.filefactory.com/file/b0ab83e/n/ARmarker.pdf

download swf's here:
http://www.filefactory.com/file/b0ab84f/n/the_MMIRROR_samples.zip


„The Organ“-Prototype

A two-component DIY media piece made of video game gameplay videos and real videos. This Max/MSP/Jitter application was made to show allegories across two different screens, one fed with gameplay recordings and the other with the equivalent real video.
This creates relations between the two stories and associations in the plot, which can be chosen randomly at runtime, like on an organ. Any topic could be blended here. In this case the topic is American troops, because all the gameplay shots are taken from America's Army, a free game sponsored by the US Army. This is intended as a critical view of this opinion-influencing simulation.



1: The left part of the 'code' shows the real video section. A click on the button chooses one of four videos at random; the number of videos is extendable. Clicking the read ... button loads the predefined film from the hard disk and starts it immediately. A tiny screen shows the output of the untouched film.
2: Shows the operating part. A transparency of 50% for each video and a module for brightness, saturation and contrast are integrated here. The transparency can also be chosen randomly by pressing the button on top; the default is 50%.
3: Shows the right video, with the same controllers as the left side. These are the gameplay videos.
4: A small unit to record the main screen to the hard disk.
5: Via a fader the two videos are blended on the big screen in b/w. This is the output where the allegories can be seen. For better comparison and better integration, the colours are reduced to black and white. A rough code sketch of this signal flow follows below.
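The patch itself is Max/MSP/Jitter, so it can't be pasted as text, but roughly the same signal flow can be sketched in Python with OpenCV. This is only an illustrative stand-in with hypothetical file names; the random clip pick, the 50% fader and the black-and-white main screen correspond to points 1, 2 and 5 above.

    import random
    import cv2

    # Hypothetical clip lists: "real" footage on the left, America's Army gameplay on the right.
    real_clips = ["real_1.avi", "real_2.avi", "real_3.avi", "real_4.avi"]
    game_clips = ["aa_1.avi", "aa_2.avi", "aa_3.avi", "aa_4.avi"]

    # Points 1/3: a button press picks one clip from each side at random.
    left = cv2.VideoCapture(random.choice(real_clips))
    right = cv2.VideoCapture(random.choice(game_clips))

    fader = 0.5                        # point 2: 50% transparency per side by default
    size = (640, 480)

    while True:
        ok_l, frame_l = left.read()
        ok_r, frame_r = right.read()
        if not (ok_l and ok_r):
            break
        frame_l = cv2.resize(frame_l, size)
        frame_r = cv2.resize(frame_r, size)
        # Point 5: blend the two videos via the fader...
        blended = cv2.addWeighted(frame_l, fader, frame_r, 1.0 - fader, 0)
        # ...and reduce the main screen to black and white for better comparison.
        main_screen = cv2.cvtColor(blended, cv2.COLOR_BGR2GRAY)
        cv2.imshow("main screen", main_screen)
        if cv2.waitKey(33) & 0xFF == 27:   # Esc to quit; roughly 30 fps playback
            break

    left.release()
    right.release()
    cv2.destroyAllWindows()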


Here are two output examples:



Sunday, March 14, 2010

Machinima for M106 - Vicious Monster

This is a machinima piece that was made using Grand Theft Auto: Vice City (Rockstar Games) and Unreal Tournament 2004 (Epic Games). The story is about a "flashback" of the main character of GTA: Vice City (Tommy Vercetti) and how he became a monster thanks to the offer of a new life.

I used:
  • For recording: Camtasia Studio and Fraps.
  • For production (Audio + Video): Camtasia Studio, Windows Movie Maker and VideoLAN.

Hope you enjoy it.



Monday, March 1, 2010

Performance art in an online game: dead-in-iraq

Hi,
here is a reference to a project being carried out by the artist Joseph DeLappe. The title is dead-in-iraq. I recommend you take some time to read about this and other pieces of DeLappe's work, as many of his works are related to video games.

Web site:
Dead-in-Iraq



DeLappe, Joseph. Dead-in-iraq 2006-ongoing

Friday, February 26, 2010

Language of Video Games in Cinema - Is the language of videogames just a first-person point of view (POV)?

ABSTRACT
The goal of this paper is to discuss the language of video games and the particularity of its first-person point of view. It will also answer the question of whether the language of video games is just a first-person point of view (POV) or not, and whether it relates to other elements such as narrative and interactivity. With this, it will discuss this particular aspect of the language of video games in cinema, analyzing three elements of that language, namely perspective or point of view, narrative, and interactivity, in pieces of cinema such as Doom: The Movie (2005), an adaptation of a video game, and The Blair Witch Project (1999), a movie that was filmed entirely in first-person shots or first-person perspective.

Paper and presentation available on:

Monday, February 22, 2010

More Augmented Reality

Hello everyone...
I found this while digging through some friends' info. I know that Felix wanted to work with this, so...
Here you are.




All of them taken from: Vimeo: Inspiration Channel by kopfkribbeln

Wednesday, February 17, 2010

MAX/MSP workshop

Hi,

This week David Black was with us and introduced MAX/MSP to our lecture. We basically made a drum machine that slices up and reorganizes a beat. The result of our work can be seen below.



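The drum machine itself is a Max/MSP patch, but its core idea (cut a loop into equal slices and play them back in a new order) can be sketched outside Max as well. Here is a rough, purely illustrative Python stand-in using NumPy and SciPy; the file name amen.wav and the slice count are assumptions.

    import numpy as np
    from scipy.io import wavfile

    SLICES = 16                                   # cut the bar into 16 equal chunks

    rate, data = wavfile.read("amen.wav")         # hypothetical loop file (the Amen break)
    usable = len(data) - (len(data) % SLICES)     # trim so the loop divides evenly
    chunks = np.split(data[:usable], SLICES)      # equal-length slices

    order = np.random.permutation(SLICES)         # re-organise: pick a new slice order
    resliced = np.concatenate([chunks[i] for i in order])

    wavfile.write("amen_resliced.wav", rate, resliced)   # write the rearranged beat

Run it a few times and you get a new permutation of the same break on each pass, which is essentially what the Max patch does in real time.
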
Clearly, you can and should use any beat you want! That's what makes it attractive. I connect this with Burroughs's cut-ups and machinic art.

Augmented Reality ads

The augmented reality advertising link: http://funkadelicadvertising.blogspot.com/2009/06/top-10-augmented-reality-adveritising.html
Felix, enjoy and good luck! -Denise

Tuesday, February 16, 2010

Amen Beat

The loop we just used today in our short workshop. Take a look.

Monday, February 15, 2010

ReConstitution 2008

Source: http://www.reconstitution2008.com/
ReConstitution is a live audiovisual remix of the 2008 Presidential debates. There will be three performances in three cities, each coinciding with a live broadcast of the debates.
We’ve designed software that allows us to sample and analyze the video, audio, and closed captioned text of the television broadcast. Through a series of visual and sonic transformations we reconstitute the material, revealing linguistic patterns, exposing content and structures, and fundamentally altering the way in which you watch the debates.
The transformed broadcast is projected onto a movie screen for a seated audience.
Join us in witnessing these historical television broadcasts and in reshaping the medium that has reshaped politics for the last half century.
The legibility of the underlying debate is maintained throughout the performance—we don’t want you to miss a word of it.
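Sosolimited's software itself isn't published, but to give a sense of the simplest kind of "linguistic pattern" such a system can expose, here is a tiny, purely illustrative Python sketch of our own (with a hypothetical caption file) that counts the most frequent words in the closed-caption text of a debate.

    import re
    from collections import Counter

    # Hypothetical closed-caption text captured from the television broadcast.
    captions = open("debate_captions.txt").read().lower()
    words = re.findall(r"[a-z']+", captions)

    # One very small "linguistic pattern": the words the candidates lean on most.
    for word, count in Counter(words).most_common(20):
        print(f"{word:>15}  {count}")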


The video:

ReConstitution 2008 Reel from Sosolimited on Vimeo.

Sunday, February 14, 2010

Presentation of the course

This course tackles DIY, machinima, and hybrid media as a means to engage us in daring and challenging research into grass-roots media practices; from this we will (hopefully) emerge with a good foundation for understanding contemporary digital media.
  1. Do-it-yourself (DIY)/ We will be dealing with DIY, the low-budget practice with high expectations, as a form of both media resistance and participation. Nowadays, DIY has become a de facto attitude towards media: everyone is a media producer and an operator of machinima, game mods, mobile apps, blogs, wikis, and online radios. All these expressions have steadily challenged the traditional mass media structures of content control. From this standpoint we will address the cultural impact that concepts such as appropriation, repurposing, openness, remix, and playgiarism have on the nature of digital media.
  2. Hybrid media/ What has made the appearance of hybrid media possible? What role does the automation of several media languages and techniques in the computer play in the appearance of such hybrids? How can the aesthetics of these hybrids be described? Should we be afraid of them? From this stream we will be looking at the machinic combination of media on rhizomatic surfaces.
  3. Machinima/ This term describes a relatively recent species of moving image that results from mixing the playing of video games with the production of movies, a hybrid that, quietly within the underground but massive realm of video games, has lured and inspired thousands to enter its territory. In the late 1990s, devoted video game players started to use video game software for experimental movie production. It is a marginal practice that uses recordings of gameplay to make short, simple narrative movies.