XR Studio Information and Terms

Overview

This article broadly outlines the resources and tools associated with the XR Studio when it is used for Virtual Production, and defines the terminology you will encounter along the way. If you have any questions or need clarification, feel free to reach out to postproduction@emerson.edu.

Facility Information & Terms

  • EML: Emerging Media Lab
  • Ansin 312 Lab: This is the classroom area housing the majority of the EML's Windows-based workstations. Treat this space as a media creation space, whether for digital art & design, 3D assets, animations, games & interactive media, 3D environments, etc.
  • VR Suites (Ansin 310 and 311): Individual rooms with their own workstations, carrying a software build similar to the Ansin 312 workstations. These rooms are built to accommodate a room-scale setup for PCVR headsets (currently we are using HTC Vive HMDs).
  • Studio (Ansin 309): Also known as the XR Studio. This is a multi-purpose production space which can spatially extend Ansin 312 or be split off using the foldable partition in between. This space can be configured as a larger VR space, a filmable backdrop/chromakey screen, a volumetric capture space, or a classroom-scale virtual production stage.
  • The EML workstations are individual Windows computers that can be logged into with your Emerson credentials. Files are saved locally to the drive of the particular computer you are working from. User file directories are cleared out semi-regularly, so keep the most recent copy of your project(s) on a self-provided external hard drive.
  • The workstations’ build contains software geared for 3D authoring, game engines, media creation, virtual reality, photogrammetry, 360 video processing, etc.
  • The workstations are connected to the Emerson network via ethernet connection. If a workstation has lost network connectivity with no immediately apparent reason, please let the student staff and/or the EML manager know.
  • The workstations have network access to the Bin, an Emerson-accessible network of directories that can be used for accessing/distributing course materials and available editing/SFX resources. It can be accessed by entering the address ‘\\bin.emerson.edu’ into either the Run command (Windows + R) or directly into the Taskbar/Start Menu search bar. The EML uses the Bin for the render farm, specifically the ‘Courses/Render’ directory. (A quick reachability check is sketched below.)
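    For example, a minimal Python sketch for confirming the Bin is reachable from a workstation and listing the render farm directory. This assumes ‘Courses’ is a top-level share on the Bin; adjust the path if your setup differs:

      from pathlib import Path

      # UNC path to the render farm directory on the Bin (address from above)
      render_dir = Path(r"\\bin.emerson.edu\Courses\Render")

      if render_dir.exists():
          for entry in render_dir.iterdir():  # list the top-level contents
              print(entry.name)
      else:
          print("Bin is unreachable -- check the network connection or tell EML staff.")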
  • EML-XR-NAS: Similar to the Bin, the EML & XR Studio have a locally hosted NAS (Network Attached Storage), installed in Ansin 307 (the XR Studio Control room). The EML-XR-NAS serves as a “middle-man” or hand-off point between the EML workstations and the disguise servers when transferring projects into the Studio. It also serves as the output destination for recordings made with the XR camera. The NAS can be accessed through the address “\\eml-xr-nas.emerson.edu” or by using the shortcut that should be present on the desktop of every EML workstation.
  • Those enrolled in associated classes/projects should have automatic access to the EML-XR-NAS, just as with the Bin; no further credentials are needed. Currently, the primary folders of note are:
    • D3_General: This is both a backup and a mirror of the d3 projects directory that lives on the two Media Servers (Director and Actor). These folders contain the disguise Designer (also known as ‘d3’) projects.
    • If you are bringing in media such as video and images, they need to be placed into the correct folder inside of the disguise project, and your disguise project folders need to match across all media servers (two Media servers, in this case).
    • Project_Dropoff: This is where you drop off your Unreal Engine (or other real-time rendering engine) project. Real-time rendering engines use a layer/protocol known as RenderStream, which allows you, in disguise Designer, to map the actively rendered scene/output from an engine such as Unreal (or Unity, Notch, etc.) accordingly. Your Unreal Engine (or other engine) project needs to be placed into the RenderStream Projects folder, which lives on each of the RXII Render Servers.
    • XR Studio Recordings: When you are recording “using the camera” in Ansin 309/the XR Studio, this is where your video file is output by default. Technically, you are not recording directly on the camera itself. Instead, “camera recording” is triggered from the Teaching Station/XR Operator station using the Stream Deck, and the actual recording is done by the Blackmagic HyperDeck recording/playback device configured inside of the XR Control Room (Ansin 307), which has access to both the raw camera feed (non-composited video) and the XR camera feed (live-composited video). The Stream Deck’s lower shortcuts allow you to choose which resolution (1080p vs. 4K) and format (H.265 vs. ProRes) the HyperDeck records with. (A sketch of how this kind of remote trigger works is shown below.)
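    Under the hood, the Stream Deck (via Bitfocus Companion) controls the HyperDeck over Blackmagic’s text-based Ethernet protocol on TCP port 9993. Purely as an illustration of how such a remote trigger works (the hostname below is a placeholder, and the Companion shortcuts remain the supported way to record), a start/stop trigger looks roughly like this in Python:

      import socket

      HYPERDECK = ("hyperdeck.example.local", 9993)  # placeholder host; 9993 is the HyperDeck control port

      def send_command(command: str) -> str:
          """Send one command to the HyperDeck and return its text response."""
          with socket.create_connection(HYPERDECK, timeout=5) as sock:
              sock.recv(1024)  # consume the connection banner the HyperDeck sends first
              sock.sendall((command + "\r\n").encode("ascii"))
              return sock.recv(1024).decode("ascii")

      print(send_command("record"))  # start recording; expect a "200 ok"-style response
      print(send_command("stop"))    # stop recording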
  • When you drop files into D3_General and Project_Dropoff, they should be automatically copied/synced over to where they need to go. D3_General is a mirror of the d3 projects folder(s) that live on the Director and Actor servers. If you place an image/video file into the corresponding folder in D3_General, it should sync into the corresponding folders on the Director and Actor servers and become available to use in your disguise project.
  • Project_Dropoff is for real-time-rendered projects, such as 3D environments/virtual set extensions rendered from Unreal Engine. These are synced into the RenderStream Projects directories on both of the Render servers (RXII-A and RXII-B). (A quick way to sanity-check that a drop-off has finished syncing is sketched below.)
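    If you want to verify that a drop-off has finished syncing, comparing the two directory trees is enough. A minimal sketch, assuming your project lives in Project_Dropoff on the NAS; the destination path is a placeholder, since the real RenderStream Projects directory is reached via the desktop shortcuts on the RXII servers:

      from pathlib import Path

      # Source: your project folder on the NAS (hypothetical project name)
      src = Path(r"\\eml-xr-nas.emerson.edu\Project_Dropoff\MyUnrealProject")
      # Destination: the synced copy (placeholder path -- locate the real one on the RXII servers)
      dst = Path(r"\\rxii-a.example\RenderStream Projects\MyUnrealProject")

      src_files = {p.relative_to(src) for p in src.rglob("*") if p.is_file()}
      dst_files = {p.relative_to(dst) for p in dst.rglob("*") if p.is_file()}

      missing = src_files - dst_files
      if missing:
          print(f"Still waiting on {len(missing)} file(s), e.g. {sorted(missing)[:5]}")
      else:
          print("Fully synced!")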
    [Diagram: XR Studio network overview (XR-Studio_Network.png)]
  • Director? Actor? Media Servers? Render Servers?: The various functionalities of the XR Studio (Ansin 309) are essentially powered by a collection of servers and video gear installed in Ansin 307, also known as the XR Studio Control room.
    • The primary servers in charge of the studio’s control/configuration, media output, and real-time rendering are the disguise servers. These comprise the Media servers (model VX2+) and the Render servers (model RXII).
    • The Media Servers are in charge of controlling/configuring the XR Studio, as well as providing video/media output to the LED screen and live-compositing virtual elements onto the XR camera’s feed. There are two VX2+ servers, named Director and Actor.
      • The (dedicated) Director server provides a GUI for controlling and configuring the different outputs/mapping of content within the XR Studio, whether you are mapping videos/images to the Quasar LED lights or an actively rendered Unreal Engine scene to the LED wall and virtual set extension.
      • The Actor server, using both of its HDMI-based outputs, drives content to the LED wall as well as the live-composited elements, such as the virtual set extension that allows the environment to continue past the bounds of the LED wall, and any augmented reality (AR) elements on the frontplate.
    • The Render servers are for distributing the computational workload of real-time rendering based engines such as Unreal. For workflows utilizing an inner/outer frustum or an (AR) frontplate and backplate, multiple render servers may be involved/work in tandem. There are two model RXII Render servers, named RXII-A and RXII-B.
    • Genlock refers to a generated reference signal that “pulses” throughout a system of separate equipment so that the devices can time their operations with one another. Think of it as a “heartbeat” or “metronome” for the interconnected equipment. When filming, be mindful of what frame rate you intend to film at (23.976, 29.97, 59.94, etc.), as that same rate needs to be set on the Genlock provider, on the camera, and in the disguise project settings (see the sketch after this list for why the rates must match exactly).
    • The currently active Genlock provider is the Tascam CG-1800. This is the source/generator of the reference signal sent throughout the connected XR Studio gear, to the devices that are able to receive it.
    • The Blackmagic Videohub 20x20 12G is a collection of 20 SDI inputs and 20 outputs, and serves as a nexus/centerpoint for patching those signals, ensuring data goes where it’s needed.
    • The Blackmagic HyperDeck serves as a recording and playback box and is arguably the device that actually records both what the camera sees and the live-composited feed that includes virtual elements such as the set extension and AR elements. Normally, you would need to physically start this recording at the HyperDeck's touchscreen; however, the Stream Deck on Ansin 312’s Teaching Station/XR Operator station provides shortcuts to remotely set the active recording resolution/format and to start/stop the recording.
    • The Synology NAS (Network Attached Storage) provides an intermediate storage, backup, and transfer point between the workstation computers and the disguise servers. Currently, it is configured to automatically sync files from D3_General and Project_Dropoff to the Media servers’ d3 projects directories and the Render servers’ RenderStream Projects directories, respectively.
    • The stYpe RedSpy optical camera tracking system has an infrared-emitting, upward-facing camera lens that relies on a collection of many retro-reflective stickers serving as positional reference points, so that the Studio can accurately track the camera’s location. The Ansin 307 XR Studio Control room has a corresponding server that receives/processes this tracking data.
    • The RedSpy camera tracker also actively reads the camera’s zoom and focus using two encoder gears. Whenever the corresponding rings on the camera’s lens move, the encoder gears turn and “capture”/quantify those movements. Tracking the zoom allows the XR Studio to adjust the size of the frustum, i.e. the bounds of what the camera sees and what is rendered on the wall. Tracking the focus allows the virtual Unreal Engine camera to “focus” on elements in the 3D environment. This comes in handy when rolling focus between composited AR elements and physical sets/actors as if they were in one environment. A downside is that the physical camera’s focus and the focus of the displayed Unreal Engine scene on the LED wall may either “double up” or suffer some noticeable latency/delay when adjusting.
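    To make the genlock point above concrete: the common “NTSC” frame rates are exact rational numbers, and every device must be set to the same one. The rate values below are standard; the rest of the sketch is purely illustrative:

      from fractions import Fraction

      # Exact values behind the common "NTSC" frame rates
      RATES = {
          "23.976": Fraction(24000, 1001),
          "29.97":  Fraction(30000, 1001),
          "59.94":  Fraction(60000, 1001),
      }

      genlock = RATES["23.976"]  # set on the genlock provider (Tascam CG-1800)
      camera  = RATES["23.976"]  # set on the camera
      d3      = RATES["23.976"]  # set in the disguise project settings

      assert genlock == camera == d3, "Rates must match exactly, or frames will drift"

      # Why "close enough" is not enough: 24.0 vs 23.976 drifts by ~86 frames per hour
      drift_per_hour = (Fraction(24) - RATES["23.976"]) * 3600
      print(f"Mismatch drift: about {float(drift_per_hour):.1f} frames per hour")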
  • The Teaching Station/XR Operator’s Station is a dedicated computer that the professor runs the class from. When it's not being used for instruction, it becomes an operator's station where one may remote onto the disguise servers, monitor the XR camera feed, trigger shortcuts for operating the XR Studio, and control elements of the actively rendered virtual environment/set.
    • Teaching Station computer: This workstation has the same specs and software build as the other Precision 5860 machines in Ansin 312, and is connected to an AV rack which includes a projector-connected VS-611DT HDMI switcher. The switcher can have the projector show output from the Teaching Station, the Elmo document camera, a Blu-ray disc player, or a guest HDMI cable.
    • KVM: Short for ‘Keyboard, Video, and Mouse’; this is an Ethernet-powered & -connected unit providing a switchable output from several machines to a single monitor. Using the shortcut Ctrl + Alt + C, you can switch the active output and controls between six computers: the Teaching Station computer, the Director and Actor media servers, the two Render servers, and the stYpe RedSpy camera tracking server. As it currently stands, you cannot project what you see on the KVM onto the projector screen; the projector will continue to show only what is output by the Teaching Station computer.
    • Parsec: This is essentially a software alternative to the KVM functionality. It is installed throughout the EML’s workstations, including the Teaching station. If you need to project/demonstrate remote operations on another machine, it’s recommended to use Parsec on the Teaching Station computer as students will be able to see it projected up on the screen. Alternatively, using the screen share functionality of Faronics Insight may be another option.
    • Elgato Stream Deck: This is a configurable collection of backlit buttons providing one-button shortcuts for operating the XR Studio, including switching the active KVM mode, controlling the active disguise timeline, recalling DMX/lighting console cues, and configuring/recording with the HyperDeck. The Stream Deck is powered via USB connection to the Teaching Station computer. Shortcuts only appear when you are logged onto the Teaching Station and have Bitfocus Companion running. For information on loading/configuring the Stream Deck’s shortcuts, please see the corresponding section/guide.
    • Atomos Neon 17” HDR monitor: This provides an ongoing color-calibrated feed of what the camera currently sees. It can provide a preview covering the Rec. 709 as well as the DCI-P3 color gamut.
    • Monogram Modular Control Surfaces: These are a modular collection of buttons, dials, and sliders that can be mapped/configured in disguise to control attributes/settings, as well as to control elements of an Unreal Engine scene through exposed parameters. For example, if you want to set off an animation or fade a virtual light, then as long as your level blueprint exposes a parameter/variable and defines how changes to it are processed, you can control your UE scene in real time from within disguise Designer (d3), much like a theatre stage technician, except that part of the stage is a 3D virtual environment. (A conceptual sketch of this pattern follows below.)
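    Conceptually, the chain works like a parameter registry: the control surface emits a value, disguise maps it onto a parameter the engine has exposed, and the engine applies it to the scene. The Python sketch below illustrates only that pattern; it is not the disguise or Unreal API:

      # Conceptual illustration of "exposed parameters" -- not real d3/Unreal code.
      exposed_params = {}

      def expose(name, apply_fn):
          """The engine registers a named parameter and how changes to it are applied."""
          exposed_params[name] = apply_fn

      # e.g. the level exposes the intensity of a virtual light
      expose("KeyLight.Intensity", lambda v: print(f"Virtual light intensity set to {v:.2f}"))

      def on_control_change(name, value):
          """disguise forwards a dial/slider movement to the exposed parameter."""
          exposed_params[name](value)

      on_control_change("KeyLight.Intensity", 0.75)  # e.g. turning a Monogram dial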
  • Why isn’t my file or project showing up in disguise?: Remote onto the Render Server (RXII) and access the d3 projects folders using both desktop shortcuts (d3 Projects 101 & 102, which correspond to Director and Actor). Check where your media files are supposed to be and make sure they’ve synced over correctly. Manually copy them from the respective NAS folder if needed. If you are looking for your Unreal Engine projects, open the RenderStream Projects directory on the RXII Render servers and ensure they have copied over.
  • We’re remoting onto the Render servers?: Yes. When troubleshooting while a disguise project is actively running on the Media servers, remoting onto the RXII servers lets you access the project directories without having to close out of the active project running on the Media Server(s).
