Camera Tracking with the FreeD protocol without a camera calibration file is possible with all licenses. In order to use professional Camera Tracking data, an additional Tracking license option must be purchased and added to your Ventuz Designer or Runtime license.
Among the multiple Video I/O boards supported by Ventuz, there are some limitations, and we have prepared a list of supported cards. Please have a look at the Supported Hardware Vendors page.
With Ventuz 7 it is possible to use multiple camera tracking streams in parallel and to map them to different cameras in Ventuz.
Currently Ventuz supports these Camera Tracking system vendors:
The German vendor Trackmen develops camera tracking systems based on various technologies. All Trackmen systems use the same consistent protocol for tracking data transfer, so any of them can be connected to Ventuz regardless of the tracking technology used. Currently, Trackmen offers these tracking products, all supported by Ventuz:
British manufacturer NCam develops Optical Camera tracking solutions that use a lightweight sensor bar attached to the camera to track natural features in the environment, allowing the camera to move freely in all locations - that makes this system especially well suited for shoulder-held or steady-cam shots, or for augmented reality projects outside of the studio. More info here
Croatian manufacturer Stype offers a sensorizing kit with auto-aim functionality, called Stype Kit, for existing cranes and jibs. The system does not require any additional external sensors or infra-red cameras and there is no need for any physical modifications of the camera crane. More info here
British manufacturer Mo-Sys is a traditional vendor of solutions for remote heads & motion control, broadcast robotics, mechanical and optical camera tracking for AR and VR, and on-set visualization. One of the latest additions to their portfolio is an optical camera tracking system called StarTracker, which features a small sensor camera tracking a cloud of markers placed on the ceiling - that makes it mostly usable for permanent in-studio setups. More info here
Introducing the FreeD protocol as a vendor-independent tracking input allows Ventuz to receive tracking data from a variety of devices. The input implements the D1 data packet, which carries Camera ID, Pan, Tilt and Roll angles, X, Y and Z position (height), Zoom and Focus. To correct lens distortion, a lens calibration file can be loaded into the FreeD input. In contrast to the vendor-specific tracking inputs, receiving just the D1 data via the FreeD input does not require an additional license option. Lens correction, however, does require the corresponding license option.
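For reference, the D1 packet is a compact binary message. The sketch below shows how such a packet could be decoded; it assumes the commonly published 29-byte D1 layout with big-endian 24-bit fixed-point fields, so verify the exact scaling against your tracking vendor's documentation. Ventuz performs this decoding internally; the sketch is only meant to make the packet contents tangible.

```python
def _signed24(data: bytes) -> int:
    """Interpret 3 bytes as a big-endian signed 24-bit integer."""
    value = int.from_bytes(data, "big")
    return value - (1 << 24) if value & 0x800000 else value

def parse_freed_d1(packet: bytes) -> dict:
    """Decode a FreeD D1 packet (assumed 29-byte layout, see lead-in above)."""
    if len(packet) != 29 or packet[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    return {
        "camera_id": packet[1],
        # Angles: signed 24-bit fixed point, 15 fractional bits, in degrees.
        "pan":  _signed24(packet[2:5])  / 32768.0,
        "tilt": _signed24(packet[5:8])  / 32768.0,
        "roll": _signed24(packet[8:11]) / 32768.0,
        # Positions: signed 24-bit fixed point, 6 fractional bits, in millimetres.
        "x":      _signed24(packet[11:14]) / 64.0,
        "y":      _signed24(packet[14:17]) / 64.0,
        "height": _signed24(packet[17:20]) / 64.0,
        # Zoom and focus are raw encoder counts; their meaning depends on the lens.
        "zoom":  int.from_bytes(packet[20:23], "big"),
        "focus": int.from_bytes(packet[23:26], "big"),
        # Bytes 26-27 (spare) and byte 28 (checksum) are ignored in this sketch.
    }
```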
If you have the Tracking option enabled in your license, all the supported tracking devices will be displayed in the Tracking section of the Device Configuration. Please bear in mind that Ventuz does not check whether Tracking systems are actually connected; this list only shows the supported Tracking systems.
To configure the video device and tracking source, set up the Video Input and Output in the Video section as shown and add the Tracking device in the Tracking section. For tracking you will normally need one Tracking source and one Video source, both placed in the Inputs pane of the Tracking or Video section respectively, and normally one Video Output device, which must be placed in the Video Outputs pane.
Depending on the Camera Tracking system of your choice, tracking data will be transmitted via Ethernet (Trackmen/NCam/Stype/FreeD) or via a Serial connection (Mo-Sys). When you add a tracking system in the Configuration Editor, these are the options available for setting up tracking data transmission to the Ventuz system.
In order to set up the tracking data communication, expand the settings of the Tracking device (A in the figure above).
Depending on the Tracking device, different parameters will be shown:
For tracking systems using Ethernet infrastructure, two parameters will be available. Currently, the supported tracking systems that use IP communication for the tracking data are Trackmen, NCam, Stype and FreeD:
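If you need to confirm that the tracking system is actually sending packets to the Ventuz machine, a small UDP listener such as the sketch below can help; the port number is a placeholder and must match the port configured for your tracking system. For FreeD sources, the received payload can be decoded with the parse_freed_d1() sketch shown earlier.

```python
import socket

TRACKING_PORT = 40000  # hypothetical port; use the port configured for your system

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", TRACKING_PORT))

while True:
    packet, sender = sock.recvfrom(2048)
    print(f"{sender[0]}: {len(packet)} bytes, first byte 0x{packet[0]:02X}")
```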
For tracking systems using Serial Communication infrastructure, two parameters will be available. Currently, the only supported tracking system that uses Serial Communication for the tracking data is Mo-Sys StarTracker:
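Analogously, a quick way to confirm that serial tracking data reaches the machine is to dump the raw bytes from the COM port, for example with the third-party pyserial package. The port name and baud rate below are placeholders and must match your Mo-Sys installation; the sketch only verifies that data is flowing and does not decode the Mo-Sys protocol.

```python
import serial  # third-party "pyserial" package

# Hypothetical settings; match them to your Mo-Sys StarTracker configuration.
with serial.Serial("COM3", baudrate=38400, timeout=1.0) as port:
    for _ in range(100):
        chunk = port.read(64)
        print(f"received {len(chunk)} bytes: {chunk.hex()}")
```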
Apart from the above settings, which are common to most Tracking systems, some of them feature specific parameters due to unique capabilities.
Ventuz can be used in 2 basic modes (not only for tracking):
In most cases Video and tracking data delays won't be equal. These delays are also dependent on cable lengths and intervening equipment in the signal pathways, so each installation will feature different delays.
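As a rough worked example (with made-up numbers), the offset to compensate is the difference between the two delays, expressed in frames of your video standard:

```python
# Made-up example values; measure these for your own installation.
video_delay_ms    = 80.0   # camera and video I/O path
tracking_delay_ms = 20.0   # tracking system and network path
frame_rate        = 50.0   # e.g. 1080p50

frame_period_ms = 1000.0 / frame_rate                                  # 20 ms per frame
offset_frames = (video_delay_ms - tracking_delay_ms) / frame_period_ms
print(f"delay the tracking data by about {offset_frames:.1f} frames")  # ~3.0 frames
```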
Genlock Source settings for Deltacast board
Deltacast Video input for Timecode operation with extra video input buffers:
Even when using external keying, an input SDI signal synchronized to the house clock is needed. The image content of this signal does not matter; it is only used to obtain timing information to correctly align the tracking data with the house clock.
You will need to adjust the frame delay in the external keyer / video mixer.
Once you have set up the tracking data sources and the correct synchronization (see above), you are ready to use the external Camera Tracking and Lens Distortion data to drive the virtual camera in your Ventuz scene.
To do so, just create a 3D Layer in the Ventuz Layer Editor. As you can see in the figure above, clicking on the 3D Scene Root (A in the figure) opens the 3D scene Properties: the contextual Properties appear in the Properties Editor to the right.
If you expand the 3D Camera Properties, you will get two fields. The first is a Camera dropdown menu (B in the figure above) that displays the camera in use; the (default) camera is the start option, and Ventuz will always map this Camera to the Tracking Device. If you have only one Tracking device configured, this mapping will be correct and the (default) camera can be used. You can change this setting to another, non-tracked Ventuz camera, so that your camera is no longer tracked. When multiple tracking devices are configured, the (default) Camera may not be mapped to the first tracking device. For full control, it is recommended to assign the cameras manually to a Tracking Ordinal.
The other control available, Lens Distortion (C in the figure above), is a checkbox that controls whether Lens Distortion data is applied.
A different way of using the incoming tracking data stream is to create a Camera, change its View property to Tracked and assign the correct tracking data stream. Don't forget to Activate your camera, or to change the Camera property (B in the figure above) to your desired camera. This is necessary if you use multiple camera tracking inputs.
If you have a Tracking Device configured, it is always mapped to the (default) Camera of each 3D Layer. To avoid using the tracking data on a layer, create an extra Camera and activate it so that the (default) Camera is not used.
From Ventuz 7 onwards, Ventuz supports multiple camera tracking devices simultaneously. To add another device, simply add an input in the Tracking section of the Config Editor. The tracking devices are mapped internally to the Ventuz Ordinals. This order is not necessarily the same as the order of configuration in the Tracking config.
By changing the Tracking device in the dropdown menu, a different device can be mapped to Ordinal #0. The Ordinals increment from top to bottom, starting at 0 (0, 1, 2, ...). For more information see the Pipe Mapping section in the Device Configuration.
In Ventuz Designer, the tracking devices only need to be assigned to the Cameras. This can be done as shown above by setting the View to Tracked and selecting the Ordinal of the tracking device. The cameras in the Compositions can be controlled independently by their tracking devices. When playing out to different outputs, one machine is capable of rendering multiple streams, each controlled by its own tracking data.
To render the same scene with different camera positions, it is best to use the Screen Render Options node together with a basic Previsualization Scene. With the Screen Render Options node, Ventuz can perform a camera override on the Screen node: the existing camera is overwritten with the camera data from the tracking device. This can be done for each Screen independently, while they are bound to the same Previs Canvas and Composition. The Composition Layer is independent of the multi rendering.
Every Screen node needs its own Screen Render Options node, set to Camera Override: TrackedCamera and the desired tracking ordinal (see no. 1). The Render Options property of each Screen node is bound to its Screen Render Options node (see no. 2), for each Screen independently. The Screen will then render the Previs Canvas with the camera override. In order to have the right Composition(s) connected, a Previs Canvas node needs to be connected (no. 3). In this example the canvas consists of only one Composition Layer (no. 4).
To use this Previs Scene, import it into your Render Setup and activate it. Now only the output mapping needs to be done.
With this simple Previs Scene, one Composition can be rendered multiple times from different camera positions. This can also be used for rendering multiple times with non-tracked cameras.
For a Multi Pipe scenario, the tracking devices are assigned for each pipe independently, and therefore the second pipe can use a different device than the first one. For more information refer to the Pipe Mapping section in the Device Configuration.