Oslo Trip | Emanuel - Week 11

From the 22nd to the 25th of November 2023, we got to test on site, on Stage 2 at Det Norske Teatret in Oslo, Norway. What was important to us was to make sure the actor or actress who would be playing the AI Prosecutor felt comfortable acting in front of an iPhone while reading a script on the fly. What we quickly realized is that the facial tracking works very well: so well, in fact, that you can see on the MetaHuman that the actor was reading from a screen.

This was something we didn't know how to solve within MetaHuman, so the first proposed solution was to circumvent the need to read an entire script verbatim, and instead use ChatGPT to generate pointers, or as a way to train the actors. The actress said that if not having a body was a limitation of the tool, she would adapt to it, but that it would be amazing if there were a way to track the body as well. She mentioned a specific show, Dancing Monsters, and wondered how they did it; based on behind-the-scenes footage, we know it was motion capture. We knew that the XR lab had a motion capture suit we could borrow, so we added a task to investigate how it could be implemented in MetaHuman.

In general, what I gathered most from this round of testing is that we could make the Live Link motion capture part much better, so that the actors can perform with their whole body the way they are used to on stage. I knew I would be continuing with this tech and figuring out how motion capture works for the project.

