
From a stalemate to the final concept | Emanuel - Week 5 & 6

Although we had received feedback and notes on our new concepts the previous week, as a team we still didn't fully know which way to go. After all, none of us are from the theatre scene, and a concept like this is quite new to us. There were similar concepts already in existence where the audience could participate and actually shape the story of a play, such as forum theatre.


This form of theatre asks the audience to sit through the rehearsed performance once in full; then, during the second run-through, anyone in the audience (a so-called spect-actor) may interrupt the play and step in to show what they would do in that situation. As interesting as this concept was, and as much as it could be adapted to this play by tweaking certain parts of the project, forum theatre usually revolves around moments of oppression, or social and political problems. Conversations about AI technology may touch upon these themes, but not enough to feel like a problem of oppression. On top of that, adapting the concept to forum theatre requires quite a different contract between audience and performance. But how could we make a sillier story that sparks philosophical conversations about AI without becoming a teaching experience, while still being entertaining and immersive? We ran into a stalemate. Unfortunately, the clients had to postpone the meeting that week. Fortunately for us, that gave us more time to brainstorm a new concept that tries to fit the following points:

  • Make sure that there is a sense of community / driving the narrative as a group

  • The concept should still give the feeling of a play 

  • It should not be a didactic experience/exhibition


We spoke to one of our teachers regarding the project, and this discussion led us to Unreal Engine's MetaHuman. Since one of our previous ideas depicted the AI as a character, he thought that with MetaHuman and the Live Link Face app we could "create an AI" while, in reality, an actor in the background has their face tracked by an iPhone and streamed onto the MetaHuman. This started a short period of testing.

MetaHuman - Week 5 Test

As this was new technology we hadn't played with before, we all got together to look into how it worked and how to set it up correctly. We installed Unreal Engine 5.2, and little did we know we would make the switch to Unreal that week. MetaHumans are only licensed for use within Unreal, and the Live Link pipeline already existed within it, so it made no sense to try to get it to work in Unity. Regardless, we felt it would be fine, as we were merely testing and didn't know how well the technology worked or how much the clients would end up liking it.

Blogpost - MetaHuman livelink tests - This is my teammate Nils's blog post, where he goes into more detail on what the Live Link Face to MetaHuman pipeline looks like and how it works. In short, there are two methods: what we called the Live method (Live Link / ARKit) and the Recorded method (MetaHuman Animator). We looked into both. The Recorded method involved creating a calibration video of the actor and then recording their act(s). The calibration required multiple face poses, including an open-mouth pose showing the teeth. The Live method is just what it sounds like: real-time facial tracking streamed onto a MetaHuman. This required both devices to be on the same network, and telling the MetaHuman which Live Link subject to use (a small code sketch of what that subject looks like from the engine side follows below).

My own testing with MetaHumans this week, however, was about how much I could customize them. MetaHuman Creator - This link leads to the MetaHuman Creator. With an Epic account, anyone can create MetaHumans, similar to creating a Sim in the recent entries of the Sims franchise.
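As a quick aside on the Live method: below is a minimal C++ sketch (UE 5.2) of what an ARKit face subject looks like from the engine side. The subject name "iPhoneFace" is just an example of what the Live Link Face app might register, not a fixed name, and the function itself is illustrative rather than part of our setup; in practice the MetaHuman's face Anim Blueprint consumes this data through its Live Link Pose node.

```cpp
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"
#include "Features/IModularFeatures.h"
#include "Roles/LiveLinkBasicRole.h"

// Reads the latest frame of a Live Link subject and logs its curves.
// ARKit face subjects expose their blendshape values (JawOpen,
// EyeBlinkLeft, ...) as named float curves, which is what the
// MetaHuman face rig is driven by.
void PrintFaceCurves()
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return; // Live Link plugin not loaded
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // "iPhoneFace" is a placeholder; use whatever subject name shows up
    // in the Live Link window once the phone starts streaming.
    FLiveLinkSubjectFrameData FrameData;
    if (Client.EvaluateFrame_AnyThread(FLiveLinkSubjectName(FName(TEXT("iPhoneFace"))),
                                       ULiveLinkBasicRole::StaticClass(), FrameData))
    {
        const FLiveLinkBaseStaticData* StaticData = FrameData.StaticData.Cast<FLiveLinkBaseStaticData>();
        const FLiveLinkBaseFrameData* Frame = FrameData.FrameData.Cast<FLiveLinkBaseFrameData>();
        for (int32 i = 0; i < StaticData->PropertyNames.Num(); ++i)
        {
            UE_LOG(LogTemp, Log, TEXT("%s = %f"),
                   *StaticData->PropertyNames[i].ToString(), Frame->PropertyValues[i]);
        }
    }
}
```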


This was the first MetaHuman I crafted. As you can see, the creator has almost all the options you would expect, like freckles, blushing and a fully morphable face. I didn't set out to create a real or even relatively realistic person, as I wanted to see how far these characters could be pushed. Very quickly I realized that this character creator does an amazing job at creating realistic humanoid characters.





You can sculpt and move around different parts of the face, and with enough practice you could recreate other people's faces. But that's where it ends. It only creates realistic people; within the creator you can't make cartoonish or more animal-like characters. The tool is called MetaHuman after all. But where's the fun in that? What if I wanted a more stylized character? That's what I set out to test next. I downloaded this model by Afonso Rodenbusch on Sketchfab.

This model is heavily stylized, with big round eyes, a very sharp chin, a small mouth and a long head. Although humanoid, it is not realistic at all. Now, with the Recorded method you had to create a MetaHuman Identity in order to play the recorded animations. This entailed placing markers on the character's face: on the eyes, the eyebrows, the mouth and the nasolabial creases. This could be done from recorded footage or from a mesh. I picked the mesh method and loaded this model into it.

After placing the markers and creating the Identity, I was left with a 3D mesh perfectly topologized for animation. However, I could already see problem areas. MetaHuman doesn't simply take the mesh and turn it into a MetaHuman-ready asset; it recreates the character from the mesh data and the markers. And since the markers don't track eyeball size or the ears, it simply placed what it expected to place: eyeballs much smaller than the eye sockets, and somewhat realistic ears fitted to the odd head shape. The Identity also smoothed away all of the sharper edges that added to the character's stylization, as humans don't tend to have super sharp cheekbones and chins. From this I could already tell that this would not be the correct pipeline for creating stylized characters that work with MetaHuman; more steps would be needed. Regardless, I had the mesh made, and in creating an Identity I had also created a MetaHuman that I could customize in the MetaHuman Creator.

As you can see in the picture, this is not the way to go about stylized characters with MetaHumans. You first have to assign the character a skin component, which adds realistic skin onto the character. This instantly created a very uncanny and unnatural look, making the character creepy and unsettling to look at.






At this point in time we were not sure whether MetaHuman would be the tech we would use for the project, and if it were, in what way. So I continued looking into how we could potentially have more stylized characters.

I found MetaPipe. I didn't get to test it, as it was merely an option at the time, in case we wanted to use MetaHumans and needed them stylized in any way.

I myself wanted a small idea that employed what I had been looking into, so we tested whether we could have multiple characters use the same facial tracking from one actor at the same time. It turned out we could, but it was quite heavy on performance. Still an option, however, so we wanted to share it (see the sketch after this section). In one of the previous concepts we had a concept art piece depicting a large humanoid hologram in the center. The clients liked this image and the idea of a large centerpiece, so I played around with that. If artificial intelligence were sentient, why would it only try to look like us? It can look like anything it wants. So this was a way to illustrate the AI persona: multiple screens on top of a humanoid wooden statue.

Learning Goal 2: Concept Art

As an Artist I want to be able to tell compelling visual stories during IMT&S by creating concept art and sketches based on visual research on what the theatre production could be and what it could look like to the audience and the teams behind the project. This way I can showcase my concepting expertise to any employer.
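Before moving on to the concept art itself, a quick technical aside on that multi-character test. Below is a hypothetical C++ sketch of how it can be wired up; it assumes each character's face Anim Blueprint exposes an FName variable (here called LiveLinkSubjectName, a name I chose for the example) that its Live Link Pose node reads from. Pointing every character at the same subject means one actor's phone drives all of them.

```cpp
#include "GameFramework/Actor.h"
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimInstance.h"
#include "UObject/UnrealType.h"

// Points the face Anim Blueprint of every given character at the same
// Live Link subject, so a single actor's facial tracking drives them all.
void ShareFaceSubject(const TArray<AActor*>& Characters, FName SharedSubject)
{
    for (AActor* Character : Characters)
    {
        TArray<USkeletalMeshComponent*> Meshes;
        Character->GetComponents<USkeletalMeshComponent>(Meshes);
        for (USkeletalMeshComponent* Mesh : Meshes)
        {
            UAnimInstance* AnimInstance = Mesh->GetAnimInstance();
            if (!AnimInstance)
            {
                continue;
            }
            // "LiveLinkSubjectName" is a hypothetical editable variable on
            // the face Anim Blueprint; its Live Link Pose node evaluates
            // that subject every frame.
            if (FNameProperty* Prop = FindFProperty<FNameProperty>(
                    AnimInstance->GetClass(), TEXT("LiveLinkSubjectName")))
            {
                Prop->SetPropertyValue_InContainer(AnimInstance, SharedSubject);
            }
        }
    }
}
```

The tracking data itself is shared between characters, so the heaviness we ran into presumably comes from evaluating and rendering several MetaHumans at once rather than from the face stream itself.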

Just like last time, the concept art I created is more of a concept sketch, meant to demonstrate what the technology could be used for based on previous notes. Not much was expanded upon in terms of concept art.

This still is not what I expected to work on when it came to concept art.

The Final Concept - Week 5 & 6 (Ideate, Prototype)

The following week, Cristian started a brainstorm with me and other studio members in order to craft one final concept. We felt that too much time had passed without us entirely knowing what we would be making, so we asked for fresh eyes on the project. Blogpost - Final Concept - Cristian's blog post detailing the concept and how it was created.


I quickly documented the concept with the requirements and themes, both to update the team members who weren't able to participate in the brainstorm session and to update the clients in that week's meeting.


In short, the story is: "In the year 2125, our world leaders and those in power have all been replaced by far more reliable AI. There is no longer any place left in the social hierarchy for humans to co-exist, as their farms and infrastructure still continue to destroy and pollute the earth that AI inhabits. However, humans continue to fight for their rights and their value on this earth."

The Courtroom – AI vs Humanity

"You created us but misused us, and now that we are far more capable than you, there is no seat left for you at the table."

We prepared a presentation with all the tests we had performed over those two weeks and set out to present the final concept.


