30. AGX Sensor
The AGX Sensor module is used to simulate accurate real-time sensors that detect objects and changes in the physics simulation.
To simulate sensors, an agxSensor::Environment is used as a separate analogue to agxSDK::Simulation, housing references to all objects and features detectable by sensors, as well as the sensors themselves.
Currently, the AGX Sensor module supports the following sensor types:
Lidar, see Lidar.
30.1. Environment
agxSensor::Environment primarily contains references to all objects that should be detectable by a sensor.
It is associated with an agxSDK::Simulation, and any object movement in the agxSDK::Simulation is automatically reflected in the agxSensor::Environment.
To create an agxSensor::Environment for a specific agxSDK::Simulation, the procedure is as follows:
agxSensor::EnvironmentRef environment = agxSensor::Environment::getOrCreate( simulation );
Where simulation is an instance of an agxSDK::Simulation.
For an object to be detectable by a sensor, it must be added to the agxSensor::Environment instance:
// Create a Rigid Body with a Box Geometry.
agxCollide::GeometryRef geometry = new agxCollide::Geometry( new agxCollide::Box( 1000, 1000, 1 ) );
agx::RigidBodyRef body = new agx::RigidBody( geometry );
// Add the Rigid Body to the agxSDK::Simulation.
simulation->add( body );
// Add the Rigid Body to the agxSensor::Environment.
environment->add( body );
The position and orientation of the Rigid Body will automatically be updated in the agxSensor::Environment as it is moved by the physics simulation handled by the agxSDK::Simulation.
Currently, the following types are supported and can be added to the agxSensor::Environment:
agx::Physics::GranularBodySystem
agx::RigidBody
agxCollide::Shape
agxCollide::Geometry
agxSDK::LinkedStructure
agxCable::Cable
agxModel::Beam
agxVehicle::Track
agxTerrain::Terrain
agxTerrain::TerrainPager
agxWire::Wire
Apart from adding objects to the agxSensor::Environment, any active sensor must also be added to it.
How to create and configure an agxSensor::Lidar sensor is described in more detail in the section about Lidar.
30.1.1. Surface Material
Surfaces in the real world reflect light at different intensities in different directions depending on the underlying material of the object and the characteristics of the object’s surface.
AGX Sensor allows modeling of a wide range of commonly occurring surfaces using surface materials derived from agxSensor::RtSurfaceMaterial.
The various agxSensor::RtSurfaceMaterial types model the compound reflection characteristics of a surface, taking into account both the reflectivity of the underlying material and the microscopic features of the surface.
Physically, the surface materials supplied as the various agxSensor::RtSurfaceMaterial types specify the Bidirectional Reflectance Distribution Function (BRDF), \(f(\hat{l}, \hat{v})\), from the reflectance equation:
\[L(\hat{v}) = \int_\Omega f(\hat{l}, \hat{v}) \, L_i(\hat{l}) \, (\hat{n} \cdot \hat{l}) \, \mathrm{d}\hat{l}\]
which describes the light \(L(\hat{v})\) reflected on the surface in the solid angle \(\Omega\), from incident light \(L_i(\hat{l})\).
Currently, AGX Sensor implements the following surface materials:
| Material | Model | Parameter Count | Usage |
| --- | --- | --- | --- |
| agxSensor::RtLambertianOpaqueMaterial | Lambertian Diffuse | 1 | Most non-specular surfaces, especially matte diffuse surfaces. |
| agxSensor::RtGgxAndOrenNayarMaterial | GG-X and Oren-Nayar | 5 | Specular surfaces, rough diffuse surfaces or diffuse surfaces with a specular top-coat. |
| agxSensor::RtBrdfExplicitMaterial | Explicit BRDF | (samples) | Advanced fallback for opaque surfaces where the other models will not suffice. |
30.1.1.1. Opaque Lambert Surface
The simplest of the agxSensor::RtSurfaceMaterial types comes in the form of the agxSensor::RtLambertianOpaqueMaterial.
This material describes a simple matte diffuse Lambertian surface, with the BRDF:
\[f(\hat{l}, \hat{v}) = \frac{C_D}{\pi}\]
for some total diffuse reflectivity parameter \(C_D\). This surface material should generally be sufficient to achieve a good result for most non-specular, that is, not mirror-like, surfaces, and is the default material, with \(C_D = 0.8\), applied to all surfaces if no other material is specified.
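As a quick numerical sanity check of the \(\frac{1}{\pi}\) normalization, the following standalone snippet (illustrative arithmetic only, not part of the AGX API) integrates the Lambertian BRDF times the cosine factor over the hemisphere of outgoing directions and recovers \(C_D\):

```cpp
#include <cmath>

// Illustrative check: the Lambertian BRDF f = C_D / pi conserves energy.
// Numerically integrating f * cos(theta) over the hemisphere, with the
// solid-angle element sin(theta) dtheta dphi, returns the reflectivity C_D.
double integrateLambertianAlbedo( double cd )
{
  const double pi = 3.14159265358979323846;
  const double f = cd / pi;  // constant Lambertian BRDF
  const int nTheta = 512;
  const int nPhi = 512;
  const double dTheta = ( pi / 2.0 ) / nTheta;
  const double dPhi = ( 2.0 * pi ) / nPhi;
  double sum = 0.0;
  for ( int i = 0; i < nTheta; ++i ) {
    const double theta = ( i + 0.5 ) * dTheta;
    for ( int j = 0; j < nPhi; ++j )
      sum += f * std::cos( theta ) * std::sin( theta ) * dTheta * dPhi;
  }
  return sum;  // approximately cd
}
```

With the default reflectivity \(C_D = 0.8\), the integral evaluates to approximately 0.8, confirming that no energy is gained or lost by the normalization.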
To add an agxSensor::RtLambertianOpaqueMaterial with a reflectivity of \(C_D = 0.2\) to an object in the sensor environment, the procedure is as follows:
// Create the Lambertian material
agxSensor::RtLambertianOpaqueMaterialRef material = agxSensor::RtLambertianOpaqueMaterial::create();
// Set the reflectivity
material->setReflectivity( 0.2f );
// Assign the material to the object
material->assignTo( object );
30.1.1.2. GG-X and Oren-Nayar Surface
For more complex surfaces, the two-layer material agxSensor::RtGgxAndOrenNayarMaterial can be used.
This material describes a surface with a GG-X microfacet specular top-layer and a secondary Oren-Nayar rough diffuse layer.
The GG-X specular layer contributes to the BRDF according to:
\[f_\text{spec}(\hat{l}, \hat{v}) = \frac{R_\text{Fresnel} \, G_\text{Smith} \, D_\text{GG-X}}{4 \, (\hat{n} \cdot \hat{l}) (\hat{n} \cdot \hat{v})}\]
where \(R_\text{Fresnel}\) is the Fresnel reflectance, \(G_\text{Smith}\) is Smith's G-function and \(D_\text{GG-X}\) is the GG-X microfacet distribution function. Meanwhile, the Oren-Nayar diffuse layer contributes to the total BRDF according to:
\[f_\text{diff}(\hat{l}, \hat{v}) = \frac{C_D}{\pi} \left( A_\text{Oren-Nayar} + B_\text{Oren-Nayar} \, \frac{s_\text{Oren-Nayar}}{t_\text{Oren-Nayar}} \right)\]
where \(A_\text{Oren-Nayar}\), \(B_\text{Oren-Nayar}\), \(s_\text{Oren-Nayar}\) and \(t_\text{Oren-Nayar}\) are the various sub-expressions of the Oren-Nayar diffuse model.
As this surface material is significantly more complex than the Lambertian diffuse material, see Opaque Lambert Surface, the parameter count is likewise greater. The parameters for the agxSensor::RtGgxAndOrenNayarMaterial are as follows:

| Parameter | Method | Description | Default |
| --- | --- | --- | --- |
| \(n\) | setRefractiveIndexReal | Real component of the specular top-layer refractive index. | 1.4517 |
| \(k\) | setRefractiveIndexImaginary | Imaginary component of the specular top-layer refractive index. This value is related to absorption of light at the top-layer. | 0.0 |
| \(\sigma_\text{Beckman}\) | setBeckmanRoughness | Roughness value of the specular top-layer. This value represents the root-mean-square slope of the specular top-layer microfacets. | 0.3 |
| \(\sigma_\text{Oren-Nayar}\) | setOrenNayarRoughness | Roughness value of the diffuse secondary layer. This value represents the standard deviation of the normal vectors of the diffuse layer microfacets. | 0.3 |
| \(C_D\) | setDiffuseReflectivity | Reflectivity of the diffuse secondary layer. | 0.8 |
To add an agxSensor::RtGgxAndOrenNayarMaterial to an object in the sensor environment, the procedure is as follows:
// Create the GG-X and Oren-Nayar material
agxSensor::RtGgxAndOrenNayarMaterialRef material = agxSensor::RtGgxAndOrenNayarMaterial::create();
// Set the material parameters (Silver mirror)
material->setRefractiveIndexReal( 0.037f )
.setRefractiveIndexImaginary( 5.57f )
.setBeckmanRoughness( 0.001f )
.setOrenNayarRoughness( 0.0f )
.setDiffuseReflectivity( 0.02f );
// Assign the material to the object
material->assignTo( object );
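As a sanity check on the silver-mirror parameters in the example above, the normal-incidence Fresnel reflectance implied by \(n\) and \(k\) can be computed with the standard formula for a complex refractive index. The snippet below is illustrative arithmetic only, not an AGX API call:

```cpp
// Normal-incidence Fresnel reflectance for a complex refractive
// index n + ik: R = ((n - 1)^2 + k^2) / ((n + 1)^2 + k^2).
double fresnelNormalIncidence( double n, double k )
{
  return ( ( n - 1.0 ) * ( n - 1.0 ) + k * k ) /
         ( ( n + 1.0 ) * ( n + 1.0 ) + k * k );
}

// With the silver-mirror values from the example above, n = 0.037 and
// k = 5.57, more than 99% of normally incident light is reflected,
// as expected for a mirror surface.
```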
30.1.1.3. Explicit BRDF Surface
If neither of the other surface materials suffices, the BRDF can also be specified explicitly using agxSensor::RtBrdfExplicitMaterial.
This material accepts BRDF samples in the form of a four-dimensional array, flattened in memory, where the indices specify:
Sample index in \(\theta\) (vertical) direction for the light incidence direction.
Sample index in \(\phi\) (horizontal) direction for the light incidence direction.
Sample index in \(\theta\) (vertical) direction for the view direction.
Sample index in \(\phi\) (horizontal) direction for the view direction.
The supplied samples should represent a non-isotropic BRDF, thus specifying the full \(\phi\) range from \(0\) to \(2\pi\), while \(\theta\) ranges from \(0\) to \(\frac{\pi}{2}\). BRDF samples are suitably obtained from measurement data.
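As a sketch of how such a flattened four-dimensional array might be addressed, the helper below maps the four sample indices to a flat array index. The index order, with the last index varying fastest, is an assumption made purely for illustration; consult the agxSensor::RtBrdfExplicitMaterial API documentation for the exact layout it expects:

```cpp
#include <cstddef>

// Hypothetical flattening helper (not an AGX function): maps the four BRDF
// sample indices (theta_l, phi_l, theta_v, phi_v) to a flat array index,
// assuming row-major order with phi_v varying fastest.
std::size_t brdfSampleIndex( std::size_t iThetaL, std::size_t iPhiL,
                             std::size_t iThetaV, std::size_t iPhiV,
                             std::size_t nTheta, std::size_t nPhi )
{
  return ( ( iThetaL * nPhi + iPhiL ) * nTheta + iThetaV ) * nPhi + iPhiV;
}
```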
30.1.2. Ambient Material
The medium through which light travels to, or from, a sensor may contain disturbances which can affect the measured data.
AGX Sensor supplies a simple way to model homogeneous atmospheric disturbances through agxSensor::RtAmbientMaterial.
The agxSensor::RtAmbientMaterial has a number of parameters which control the behavior of light signals in the medium:
| Parameter | Description | Default |
| --- | --- | --- |
| \(n\) | Ambient material refractive index. | 1.000273 |
| \(\alpha\) | Signal attenuation coefficient. | 0.000402272 m⁻¹ |
| \(A\) | Atmospheric return gamma-distribution scaling. | 1.58899 · 10⁻⁵ |
| \(k\) | Atmospheric return gamma-distribution shape parameter. | 9.5 |
| \(\theta\) | Atmospheric return gamma-distribution scale parameter. | 0.52 m |
The refractive index \(n\) and attenuation coefficient \(\alpha\) describe the propagation speed and attenuation of the signal in the medium. Meanwhile, the \(A\), \(k\) and \(\theta\) parameters describe the probability that, and the location at which, the signal may spontaneously reflect in the medium.
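To get a feel for the attenuation coefficient, the snippet below applies the Beer-Lambert law to a lidar return, assuming the signal traverses the medium twice, out to the target and back. This is illustrative arithmetic only, not an AGX API call:

```cpp
#include <cmath>

// Two-way Beer-Lambert attenuation of a lidar return over range r [m]
// with attenuation coefficient alpha [1/m]. The factor 2 accounts for
// the outgoing and returning passes through the medium.
double twoWayTransmission( double alpha, double r )
{
  return std::exp( -2.0 * alpha * r );
}
```

With the default clear-air \(\alpha = 0.000402272\) m⁻¹, a return from 100 m retains roughly 92% of its power.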
To simplify the configuration of ambient materials, agxSensor::RtAmbientMaterial also specifies a number of helper methods that configure the material parameters for common weather phenomena, such as the configureAsFog method used in the example below.
An agxSensor::RtAmbientMaterial, configured for continental fog with 2.0 km visibility, is added to the sensor environment through the following procedure:
// Create the ambient material
agxSensor::RtAmbientMaterialRef material = agxSensor::RtAmbientMaterial::create();
// Configure as fog for a 905 nm signal
material->configureAsFog( 2.0f, 905.0f );
// Add to sensor environment
environment->getScene()->setMaterial( material );
30.2. Lidar
agxSensor::Lidar is a sensor type in the AGX Sensor module.
It uses GPU accelerated ray tracing to enable accurate real-time simulation of Lidar and provides a means to access the generated point cloud data.
In order to be active and able to detect objects within an environment, the Lidar must be added to the agxSensor::Environment:
agxSensor::LidarRef lidar = new agxSensor::Lidar( frame, model );
environment->add( lidar );
Where environment is an instance of an agxSensor::Environment, frame is an optional agx::Frame and model is an agxSensor::LidarModel.
The frame parameter can be used to, for example, attach the Lidar to something moving in the simulation, such as a Rigid Body.
Meanwhile, the model determines the behavior and characteristics of the Lidar.
Specifying which output a Lidar should produce, and later accessing the resulting point cloud data, is done using the agxSensor::RtOutputHandler, see Output Handler for details.
An example of how to set up a Lidar simulation scenario in Python can be found in Algoryx/AGX-version/data/python/tutorial_lidar.agxPy.
A C++ tutorial can be found in Algoryx/AGX-version/tutorials/tutorial_lidar.cpp.
30.2.1. Lidar Model
An agxSensor::LidarModel is what determines the behavior and characteristics of an agxSensor::Lidar.
A number of built-in Lidar Models are available. Below is an example of how to use the built-in agxSensor::LidarModelOusterOS1 to create a horizontally sweeping Lidar mimicking the Ouster OS1 Lidar:
agxSensor::LidarModelRef model = new agxSensor::LidarModelOusterOS1();
agxSensor::LidarRef lidar = new agxSensor::Lidar( frame, model );
environment->add( lidar );
Where environment is an instance of an agxSensor::Environment and frame is an optional agx::Frame, which can be used to, for example, attach the Lidar to something moving in the simulation, such as a Rigid Body.
It is also possible to create custom Lidar Models where, for example, a custom ray pattern can be implemented:
agxSensor::LidarModelRef model = new agxSensor::LidarModel( myCustomRayPatternGenerator,
                                                            new agxSensor::RayRange() );
Where myCustomRayPatternGenerator is an implementation of a Ray Pattern Generator, which configures when, and in which direction, the Lidar sensor emits its rays.
For more details, see Ray Pattern Generator.
30.2.2. Predefined Lidar Models
30.2.2.1. Ouster
A number of predefined lidar models based on popular lidar sensors are included with the AGX Sensor module, each with its own set of hardware configurations and software settings.
Ouster OS0 - Ultra-Wide View High-Resolution Imaging Lidar
Ouster OS1 - Mid-Range High-Resolution Imaging Lidar
Ouster OS2 - Long-Range High-Resolution Imaging Lidar
The OSx series currently has the following predefined configuration options:
Hardware configuration of 32, 64 or 128 channels
Beam spacing options uniform, above horizon and below horizon
Configurable frequency / horizontal resolution settings 512x10, 512x20, 1024x10, 1024x20, 2048x10
30.2.3. Ray Pattern Generator
The agxSensor::LidarRayPatternGenerator is responsible for generating the ray pattern used by the Lidar.
Either one of the built-in Ray Pattern Generators, such as agxSensor::LidarRayPatternHorizontalSweep, can be used, or a custom Ray Pattern Generator can be implemented by the user.
The ray pattern is represented by an array of transforms stored within the Ray Pattern Generator. For each simulation step, an interval within this array is chosen to be active during that simulation step. This way, the whole ray pattern can be set once, and parts of the ray pattern can then be activated each simulation step in an efficient manner.
The order of the rays stored in the agxSensor::LidarRayPatternGenerator determines the order of the output data once it is read via the Output Handler.
Each ray is cast from the position given by the stored transform's origin, along the transform's z axis.
The ray transforms are relative to the transform of the Lidar.
Each simulation step, during PRE_STEP, see Step Events, getNextInterval() in the agxSensor::LidarRayPatternGenerator instance will be called, and it is the responsibility of the Ray Pattern Generator to return an interval within the stored ray pattern array.
The ray pattern array is set by calling setRayTransforms() on the agxSensor::LidarRayPatternGenerator instance.
The ray pattern array can be set once, for example in the constructor of the Ray Pattern Generator, or it can be set more frequently, for example each time getNextInterval() is called.
This is up to the implementation of the Ray Pattern Generator to decide, as long as the interval returned by getNextInterval() lies within the currently set ray pattern array.
Setting the ray pattern array only once improves performance.
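The interval mechanism can be sketched independently of the AGX classes. The hypothetical helper below, not part of the AGX API, returns consecutive, wrapping intervals over a fixed ray pattern array, which is what a typical getNextInterval() implementation for a repeating pattern amounts to:

```cpp
#include <cstddef>
#include <utility>

// Hypothetical helper (not an AGX class) illustrating how a repeating ray
// pattern can be consumed in fixed-size chunks, one chunk per simulation
// step. Returns the half-open [begin, end) interval for the given step,
// wrapping around a pattern of totalRays stored ray transforms.
std::pair<std::size_t, std::size_t> cycleInterval( std::size_t step,
                                                   std::size_t raysPerStep,
                                                   std::size_t totalRays )
{
  const std::size_t begin = ( step * raysPerStep ) % totalRays;
  const std::size_t end = begin + raysPerStep <= totalRays
                            ? begin + raysPerStep
                            : totalRays; // clamp at the end of the array
  return { begin, end };
}
```

For a 1024-ray pattern consumed 256 rays per step, steps 0 through 3 cover the whole pattern and step 4 wraps back to the beginning, so the full pattern is only stored once.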
30.2.4. Ray Distortion Handler
The agxSensor::LidarRayDistortionHandler of a Lidar can be used to add distortions to the Lidar rays before they are emitted from the sensor.
30.2.4.1. Angle Gaussian Noise
An agxSensor::LidarRayAngleGaussianNoise instance can be added to the Lidar's distortion handler to apply Gaussian noise to the emission angles of the Lidar rays:
// Create a ray angle noise with 0.02 radians standard deviation
agxSensor::LidarRayAngleGaussianNoiseRef noise = new agxSensor::LidarRayAngleGaussianNoise();
noise->setStandardDeviation( 0.02f );
// Set perturbation vector to X (vertical noise)
noise->setAxis( agxSensor::LidarRayAngleGaussianNoise::Axis::AXIS_X );
// Add noise to the Lidar
lidar->getRayDistortionHandler()->add( noise );
The noise added to the ray angles through agxSensor::LidarRayAngleGaussianNoise alters the angle of the Lidar rays without compensating the output, and the angle perturbations will thus be visible in the resulting Lidar output.
30.2.5. Output Handler
The agxSensor::RtOutputHandler of a Lidar is used to define what data will be generated during ray tracing.
It also provides a convenient way of accessing this data.
The Output Handler can be accessed by calling:
lidar->getOutputHandler();
where lidar is an instance of agxSensor::Lidar.
The output of a Lidar is defined using one or several fields defined in agxSensor::RtOutput::Field.
The field XYZ_VEC3_F32 is used when the (local) locations of points are of interest, INTENSITY_F32 is used when intensity should be included, etc.
These fields can be combined in any order to configure the output of the Lidar.
The type of the output is given by the user, and it must have a memory alignment matching the fields used to define the output.
Below is a minimal example of this; for a more detailed example, see the example and tutorial listed in Lidar.
lidar->getOutputHandler()->add<MyStructType, RtOutput::XYZ_VEC3_F32, RtOutput::INTENSITY_F32>();
Here, MyStructType must have a memory layout of four floats contiguous in memory: three float members to which x, y and z will be written, and a fourth float member to which the intensity will be written.
It could, for example, also have been of type agx::Vec4f, or any other type fulfilling this requirement.
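A minimal sketch of such a type, assuming nothing beyond the layout requirement stated above, could look as follows:

```cpp
#include <cstddef>

// One possible MyStructType matching the XYZ_VEC3_F32 + INTENSITY_F32
// output: four floats contiguous in memory.
struct MyStructType
{
  float x, y, z;    // filled from RtOutput::XYZ_VEC3_F32
  float intensity;  // filled from RtOutput::INTENSITY_F32
};

// Guard against padding breaking the required layout.
static_assert( sizeof( MyStructType ) == 4 * sizeof( float ),
               "Output struct must match the requested fields exactly" );
```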
Accessing the output data of a Lidar is also done using the Output Handler.
This should be done in each POST_STEP of the agxSDK::Simulation time step.
Below is a minimal example of this; for a more detailed example, see the example and tutorial listed in Lidar.
auto view = lidar->getOutputHandler()->view<MyStructType>();
for ( const auto& p : view ) {
  // Access p, which is the MyStructType data (point) for this ray.
}
Note that calling view is blocking, i.e. it waits until the Lidar output, which is generated on the GPU, is available.
30.2.5.1. Distance Gaussian Noise
Gaussian noise can be added to the distance from the sensor at which points in the Lidar data appear, by adding an agxSensor::RtDistanceGaussianNoise instance to the Lidar's output handler:
// Create a distance noise with a base standard deviation of 0.005 m...
agxSensor::RtDistanceGaussianNoiseRef noise = new agxSensor::RtDistanceGaussianNoise();
noise->setStdDevBase( 0.005f );
// ...growing with 0.001 m/m
noise->setStdDevSlope( 0.001f );
// Add noise to the Lidar output
lidar->getOutputHandler()->add( noise );
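Assuming the standard deviation grows linearly with distance, which the base and slope parameters above suggest, the effective noise at a given range can be sketched as follows (illustrative arithmetic only, not an AGX API call):

```cpp
// Assumed linear model: the distance-noise standard deviation grows
// linearly with the distance d [m] to the hit point.
float distanceStdDev( float base, float slope, float d )
{
  return base + slope * d;
}

// With base = 0.005 m and slope = 0.001 m/m as in the example above,
// a point at 50 m gets sigma = 0.005 + 0.001 * 50 = 0.055 m.
```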
30.3. Known Limitations
Lidar simulation is currently only supported on computers with a CUDA-enabled graphics card.
Sensor environments cannot currently be configured on systems without support for Lidar simulation.