
Project AI Theatre - December | nDisplay Research

Tags: Empathize, Define, Ideate, Prototype

Foreword: This post covers my activity in the first two and a half weeks of December. After we came back from Oslo and had a meeting, we realized that the project was "done". So we just worked on either adding small new features or polishing the project. Overall, productivity this month was pretty low.


December, for me, was mostly spent researching and trying to create a prototype with a new technology inside Unreal Engine called nDisplay. What nDisplay does is render multiple different cameras inside a single Unreal scene and output each camera's view to a different monitor.

In our concept, there are two main displays: one shows the virtual courtroom with the judge, and the other shows the prosecutor. Both the judge and the prosecutor are in the same virtual space, but at different locations. That's why we needed two separate monitors; the alternative was a single display with a very wide FOV (Field of View), which might not look good on every screen, especially big ones.


Multi-Display Rendering, in the context of Unreal Engine nDisplay, refers to the capability of rendering a single Unreal Engine scene across multiple physical displays, projectors, or virtual reality (VR) headsets simultaneously. Under the hood, nDisplay still runs one engine instance per cluster node, but it keeps those instances synchronized, so all the connected displays can be treated as part of a unified rendering system rather than as separate, unrelated applications.

Here's how Multi-Display Rendering works with nDisplay:

  1. Single Scene: You create a single 3D scene within Unreal Engine, which contains all the assets, objects, and elements you want to display across your multiple screens or devices. This scene can be as complex as needed, with detailed 3D models, textures, animations, and interactive components.

  2. Multiple Displays: You have a setup with multiple physical displays, such as monitors or projectors, or multiple VR headsets. These displays can be arranged in various configurations, such as a curved screen setup for a simulation or a large video wall for an interactive installation.

  3. nDisplay Configuration: Within Unreal Engine, you configure nDisplay to recognize and manage these multiple displays. This involves specifying the number of displays, their positions, orientations, and other display-specific settings (see the config sketch after this list).

  4. Synchronization: Unreal Engine nDisplay ensures that all connected displays render the same scene simultaneously and in sync. This synchronization is crucial to provide a consistent and immersive experience, as it ensures that all viewers see the same content at the same time, regardless of the number of displays or their physical locations.

  5. Real-Time Interaction: You can also implement real-time interactivity within your Unreal Engine scene. For example, users can interact with objects or navigate within the scene using input devices like controllers or touchscreens. This interactivity is maintained across all displays, enhancing the user experience.
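
To make step 3 less abstract, here is a rough sketch of what a two-node setup like ours could look like. This uses the legacy plain-text config format from nDisplay in UE 4.x (newer engine versions replace it with a JSON-based .ndisplay config asset), and all IDs, addresses, and transforms below are made-up placeholders for a judge/prosecutor layout, not an actual working configuration:

    # Hypothetical two-node cluster: one node per monitor, both running on one PC.
    [info] version=23
    [cluster_node] id=node_judge  addr=127.0.0.1 window=wnd_judge  master=true
    [cluster_node] id=node_prosec addr=127.0.0.1 window=wnd_prosec
    # One window per node, placed side by side on the desktop.
    [window] id=wnd_judge  viewports=vp_judge  fullscreen=false WinX=0    WinY=0 ResX=1920 ResY=1080
    [window] id=wnd_prosec viewports=vp_prosec fullscreen=false WinX=1920 WinY=0 ResX=1920 ResY=1080
    [viewport] id=vp_judge  x=0 y=0 width=1920 height=1080 projection=proj_judge
    [viewport] id=vp_prosec x=0 y=0 width=1920 height=1080 projection=proj_prosec
    [projection] id=proj_judge  type=simple screen=scr_judge
    [projection] id=proj_prosec type=simple screen=scr_prosec
    # Two virtual screens at different spots in the same scene, one angled
    # toward the judge and one toward the prosecutor (sizes are in meters).
    [screen] id=scr_judge  loc="X=1,Y=-0.7,Z=0" rot="P=0,Y=-30,R=0" size="X=1.77,Y=1"
    [screen] id=scr_prosec loc="X=1,Y=0.7,Z=0"  rot="P=0,Y=30,R=0"  size="X=1.77,Y=1"
    [camera] id=camera_static loc="X=0,Y=0,Z=0"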

In summary, Multi-Display Rendering with Unreal Engine nDisplay allows you to create large-scale, immersive, and synchronized visual experiences by rendering a single Unreal Engine scene across multiple displays or VR headsets. It simplifies the management of complex multi-screen setups and ensures that the content remains coherent and consistent for all viewers or participants. This capability is valuable for various applications, including theme park attractions, architectural visualizations, training simulations, and virtual production.
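
As a usage note tied to the sketch above: each cluster node is a separate instance of the packaged game, launched with flags telling it which node it is and which config file to load. With the legacy format, local testing on one machine would have looked roughly like the lines below (the project name and config path are placeholders, and in recent engine versions the Switchboard tool generates these launch commands for you):

    MyProject.exe -dc_cluster -dc_node=node_judge  -dc_cfg="courtroom.cfg" -messaging -windowed
    MyProject.exe -dc_cluster -dc_node=node_prosec -dc_cfg="courtroom.cfg" -messaging -windowed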


I started the research and prototyping of this system in the Netherlands, on my laptop, using a single display, and of course it did not work properly. On the 15th of December I went back to Romania for the holidays, and while there I tried to remake the prototype using the multiple displays I have in my home office; again, it did not work properly. Basically, I ran into the same issues as my engineer colleague. In the end we decided not to use it anymore, since trying to make it work would take too much time, and we now have a "different final product."


This is related to what I wrote in the foreword. On the Oslo trip we were told that the theatre does not want a final product. What they actually want is a proof of concept. Our project was never supposed to be a final product or anything close to that. They wanted us to come up with an idea and prototype it. After IMT&S they can either choose to keep and continue the idea, or ignore it completely and do whatever they want. I know this might be wrong to say, but that part both demoralized and relaxed me at the same time. It demoralized me that my project was never intended to be finished and displayed, but it also relaxed me, knowing how many issues the project has; issues that cannot be solved by us, since we are not experts. By issues I mean tracking issues with LiveLink, MetaHuman issues, and so on; things that are beyond our control.


In the end, December for me was about researching a new idea and trying to develop it, and ultimately failing.
