Camera

Camera Sensor Output

Overview

The camera sensor provides image data from customizable perspectives and resolutions. It functions similarly to its real-life analogue: the video camera. Users can attach multiple cameras at arbitrary locations to vehicles or the scene. Besides color information, these simulated cameras can also be configured to provide semantic annotation, instance annotation, and depth information.

Usage

Cameras are exposed from C++ to Lua via two functions, which differ in their method of data exchange: one writes data to a shared memory handle, while the other returns data in a table of Base64-encoded strings. The functions are:


Engine.renderCameraBase64Blocking(pos, rot, size, fov, nearFarPlanes)

Renders an image at any point in the map and returns it Base64 encoded.

Args:

pos(Point3F): camera position in the map

rot(QuatF): camera orientation (quaternion)

size(Point2F): image size

fov(float): horizontal field of view

nearFarPlanes(Point2F): near and far clipping planes, defining the closest and furthest points to be rendered

Returns: table

colorRGB8(string): RGB image, Base64 encoded

annotationRGB8(string): pixel-wise ground truth annotation, Base64 encoded

depth32F(string): depth image, Base64 encoded

width(int): image width

height(int): image height
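A minimal Lua sketch of a call follows. The Point3F/QuatF/Point2F constructor syntax and the concrete values (position, identity orientation, a 60-degree field of view) are illustrative assumptions, not values taken from the engine documentation:

```lua
-- Render a 512x512 frame from an assumed vantage point.
-- Constructor syntax for Point3F/QuatF/Point2F is assumed here.
local pos = Point3F(0, 0, 2)        -- camera position in the map (assumed)
local rot = QuatF(0, 0, 0, 1)       -- identity orientation
local size = Point2F(512, 512)      -- image resolution
local fov = 60                      -- horizontal field of view (degrees, assumed)
local nearFar = Point2F(0.1, 1000)  -- near and far clipping planes

local frame = Engine.renderCameraBase64Blocking(pos, rot, size, fov, nearFar)
-- frame.colorRGB8, frame.annotationRGB8, and frame.depth32F hold the
-- Base64-encoded image buffers; frame.width and frame.height give the size.
```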


Engine.renderCameraShmem(colorShmem, depthShmem, annotationShmem, pos, rot, size, fov, nearFarPlanes)

Renders an image and writes the data into the given shared memory handles. Every shared memory handle argument is optional and can be replaced with nil.

Args:

colorShmem(String, optional): name/id of shared memory handle for RGB image

depthShmem(String, optional): name/id of shared memory handle for depth image

annotationShmem(String, optional): name/id of shared memory handle for annotation image

pos(Point3F): camera position in the map

rot(QuatF): camera orientation (quaternion)

size(Point2F): image size

fov(float): horizontal field of view

nearFarPlanes(Point2F): near and far clipping planes, defining the closest and furthest points to be rendered

Returns: table

color(string): name/id of the shared memory handle for the color

depth(string): name/id of the shared memory handle for the depth

annotation(string): name/id of the shared memory handle for the annotation

width(int): image width

height(int): image height


Note: Functions relying on shared memory for exchange expect the shared memory to have been previously opened with Engine.openShmem.
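As a sketch, the shared memory path might look like the following. The exact signature of Engine.openShmem (here assumed to take a handle name and a byte size) and the constructor syntax are assumptions; only the color and depth channels are requested, so annotationShmem is passed as nil:

```lua
-- Open shared memory handles before rendering into them.
-- Engine.openShmem's (name, size) signature is an assumption.
local w, h = 512, 512
Engine.openShmem("camColor", w * h * 4)  -- 4 bytes per RGBA pixel (assumed)
Engine.openShmem("camDepth", w * h * 4)  -- 4 bytes per 32-bit float pixel

local result = Engine.renderCameraShmem(
  "camColor", "camDepth", nil,           -- annotation skipped via nil
  Point3F(0, 0, 2), QuatF(0, 0, 0, 1),
  Point2F(w, h), 60, Point2F(0.1, 1000))
-- result.color and result.depth echo the handle names that were written to.
```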

Annotations

The result of a camera’s render includes additional ground truth information beyond the color data traditionally captured by cameras. For each render, the camera can also provide pixel-perfect annotations encoding the class of the object each pixel belongs to. This data is written to the given annotationShmem when using renderCameraShmem, or to the annotationRGB8 entry in the table returned by renderCameraBase64Blocking. Annotation classes can be viewed and configured in the annotations.json file found in BeamNG.tech’s installation folder. An example pair of color and semantic annotation images looks like this; each pixel in the right image encodes the class of the object that produced the corresponding pixel in the left image:

Semantic Annotation

Annotation information is included by default when using renderCameraBase64Blocking and can be found in the annotationRGB8 entry of the resulting table. When using renderCameraShmem, the data is written to annotationShmem only if it is provided.

Depth Information

Cameras can also be used to extract depth information from the perspective of the camera. In this case, an additional image of 32-bit floats is returned, where each pixel contains the distance from the camera to the point that produced the pixel. One example, taken at the toll booth on West Coast USA, looks like this:

Depth Information

Depth information is included by default when using renderCameraBase64Blocking and can be found in the depth32F entry of the resulting table. When using renderCameraShmem, the data is written to depthShmem only if it is provided.
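To work with the depth buffer from renderCameraBase64Blocking, the Base64 string has to be decoded and reinterpreted as floats. A sketch, assuming a Base64 decode helper is available (base64decode below is hypothetical), that the floats are little-endian (an assumption), and that the Lua runtime provides string.unpack (Lua 5.3+):

```lua
-- frame is a table returned by Engine.renderCameraBase64Blocking.
local raw = base64decode(frame.depth32F)  -- base64decode is a hypothetical helper
local depths = {}
for i = 1, #raw, 4 do
  -- "<f" reads one little-endian 32-bit float; endianness is an assumption.
  depths[#depths + 1] = string.unpack("<f", raw, i)
end
-- depths[y * frame.width + x + 1] is then the distance of pixel (x, y)
-- from the camera, assuming row-major pixel order.
```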

Page created: 29 December 2018, at 11:02
Last modified: 14 October 2021, at 19:57
