Pixel Streaming Integration in Unreal

by Graham @Target3D

Most of this material is covered in the article here, but this document is specific to the hardware used for the Dazzle showcase (01.12.21).

The goal: stream positional data of a handheld tablet from Motive into an Unreal project running on a PC acting as a Pixel Streaming server, which in turn streams video back to the handheld tablet.

  1. Setting up Pixel Streaming
  2. Setting up Rigid Body Tracking of Tablet
  3. Unreal Scene Setup

Setting up Pixel Streaming

The workflow for Pixel Streaming in the Dazzle demo was almost completely by the book, as shown in the Unreal guide here. The only real difference was that compile errors meant I couldn’t build the project for Windows as shown in step 10 of the guide.

The way I got round this was to build a basic project (I used Unreal’s Pixel Streaming Demo) and follow the guide using that. This gives you the batch script that can be run to start the server.

As long as the Pixel Streaming plugin was enabled in the Dazzle project and its parameters were set up as shown in the guide, the project would connect to the Pixel Streaming server even though it would not compile and build properly.
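For reference, starting a session usually boils down to two commands: run the signalling/web server from the demo project, then launch the Unreal app with the Pixel Streaming flags. This is a sketch based on the UE 4.2x-era layout; the paths and executable name are placeholders for your own project:

```shell
# 1. Start the signalling/web server shipped with the Pixel Streaming demo
#    (folder name from the demo project layout; adjust to your setup)
cd SignallingWebServer
run.bat

# 2. In a second prompt, launch the packaged app with Pixel Streaming flags
#    (flag names per the Unreal Pixel Streaming documentation)
MyProject.exe -AudioMixer -PixelStreamingIP=127.0.0.1 -PixelStreamingPort=8888
```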

Pixel Streaming lets you use JavaScript, HTML and CSS to build an interactive website incorporating the Pixel Streaming feed. This functionality is built into Epic’s demo Pixel Streaming project, so I just copied these files into the Pixel Streaming folder of the Dazzle project; you can then change the contents of the pages to suit what you’re doing. The web files can be found here and need to be copied into a folder named “WebInterface” in the root directory of the new project’s Unreal project folder:

NB: Pixel Streaming works if you just navigate in your browser to the IP address of the machine acting as the Pixel Streaming server. To use the custom web interface, you need to add the name of the HTML file to the end of the IP address, e.g. http://<server-ip>/custom.html rather than just http://<server-ip> (the file name here is a placeholder; use whichever HTML file you added).
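Once your custom page is being served, the demo player scripts expose an emitUIInteraction() function for sending events from the page back to Unreal. A minimal sketch (the helper names here are my own, not part of Epic’s files; the guard lets the file load outside the browser too):

```javascript
// Build the JSON descriptor string that Unreal receives through its
// Pixel Streaming input handling.
function makeInteractionDescriptor(command, value) {
  return JSON.stringify({ Command: command, Value: value });
}

// Send the descriptor if we are running inside the player page;
// otherwise just return it (handy for testing the page logic).
function sendToUnreal(command, value) {
  const descriptor = makeInteractionDescriptor(command, value);
  if (typeof emitUIInteraction === "function") {
    emitUIInteraction(descriptor);
  }
  return descriptor;
}
```

You would then wire a button in your custom HTML to something like sendToUnreal("ChangeLevel", 1) and parse the descriptor on the Unreal side.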

This tripped me up for a while……

Setting up Rigid Body Tracking of Tablet

This is very similar to setting up tracking of any other rigid body in Motive, streaming that data into Unreal, and using it to drive the position of a camera. I will therefore be shamelessly borrowing bits of Harry’s write-up here to explain the process.

Marker Placement

  1. Put the tablet in a case so that you don’t have to stick anything to the tablet itself.
  2. Stick OptiTrack markers asymmetrically to the case, placing them where they will not be obscured by the user’s hands.
  3. Place the tablet somewhere in the tracking volume so that you can see the markers in Motive.
  4. Select all of the markers attached to the tablet in Motive, making sure you have the correct number, then create a rigid body using Ctrl+T or from the Create menu.
  5. Once created, set the streaming ID to whatever you like, or just keep it at 10000. Because this data needs to stream to a separate machine (from the machine running Motive to the backpack PC), Motive’s streaming settings need to be set up to allow this:

Note here that the Local Interface is set to whichever network is shared by both the mocap PC and the Pixel Streaming PC; in our case this was the tracking network. Transmission Type is set to Unicast, as we found this held a more stable connection than Multicast when multiple sets of mocap data were being streamed.

Unreal Scene Setup

So if everything has gone well, you should now have an Unreal project with Pixel Streaming working, and a tablet being tracked as a rigid body in Motive. All that’s left is to set up a camera in Unreal whose position and rotation are driven by the data coming from Motive. The easiest way to do this is to use the OptiTrack Live Link plugin, following the guide which can be found here.

  1. Delete/turn-off/hide all other cameras from the scene.
  2. Enable the Live Link and OptiTrack Live Link plugins in Edit -> Plugins.
  3. Restart Unreal
  4. Go to Window -> LiveLink to open the Live Link pane and add a new source, ensuring the ServerAddress is the IP address of the machine sending the Mocap data, and that the client address is the IP address of the machine running the Pixel Streaming project.
  5. Add a camera to your scene and make sure its Mobility is set to Movable.
  6. Click on your camera in the outliner and add the component Live Link Controller
  7. In the Live Link Controller properties section, set the Subject Representation field to the correct rigid body being streamed from Motive.
  8. The camera should now be animated via the Live Link plugin and respond to the data that is being streamed from Motive.

