Cyngn has organized the elements of our technology stack into a periodic table. Let's explore the six major groups within this framework.
The autonomous vehicle industry has continued to identify and develop the essential elements that make up a valuable autonomous driving system, along with the patterns among them. To capture this, Cyngn has organized the elements of autonomous driving, and of our own technology stack, into a periodic table framework.
Inspired by the Periodic Table of Elements, this Periodic Table of AV Technology provides a valuable framework for researching, improving, and explaining autonomous vehicle innovation — both internally and externally.
Given the complexity of designing, building, and improving advanced driving systems, numerous elements are essential for creating and deploying a valuable driving system, and each of them appears in this table.
There are notable benefits to organizing this information in this framework.
In general, the periodic table organizes the elements and technology in a flexible way that enables us to deploy AV technology across numerous use cases. Rather than looking for a problem to fit our solution, this approach allows us to lead with the solution and apply it to a variety of applications. For any given deployment, we can ask, “What sensor selection is best for this application?”, which gives us great flexibility.
What Elements Are Present in the Periodic Table of AV Technology?
The Periodic Table is organized into six major technology groups:
1. The Sensor Group
Sensor, driver, calibration, and synchronization elements lie within the sensor group. This group is the starting point of the stack and is essential for the vehicle’s journey.
There are multiple sensors on our vehicles: LiDAR, cameras, RADAR, and an IMU, and each operates in slightly different ways. The calibration elements ensure that these sensors are functioning within their expected parameters, and the synchronization elements align the information received from the different sensors, which arrives at different frame rates and on different timeframes.
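To make synchronization concrete, here is a minimal sketch of nearest-timestamp matching between two sensor streams running at different rates. The names and thresholds are illustrative assumptions, not Cyngn's implementation.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    sensor: str        # e.g. "lidar" or "camera"
    timestamp: float   # seconds on a shared clock
    data: object       # raw measurement payload

def synchronize(lidar_frames, camera_frames, max_skew=0.05):
    """Pair each LiDAR frame with the nearest-in-time camera frame.

    The two streams arrive at different rates (say 10 Hz vs 30 Hz), so
    we match on timestamps and drop pairs whose clocks differ by more
    than max_skew seconds.
    """
    pairs = []
    for lf in lidar_frames:
        if not camera_frames:
            break
        nearest = min(camera_frames,
                      key=lambda cf: abs(cf.timestamp - lf.timestamp))
        if abs(nearest.timestamp - lf.timestamp) <= max_skew:
            pairs.append((lf, nearest))
    return pairs
```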
Here, we can see the benefits of building a system that’s optimized for flexibility. If an application requires a different type of LiDAR with different capabilities, all of the elements in this group would continue to work. In this way, any sensor hardware can slot into our technology stack, including hardware that has yet to be invented.
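One way to picture that pluggability, as a sketch rather than Cyngn's actual code, is a common driver interface that any LiDAR model can implement; the class names here are hypothetical.

```python
from abc import ABC, abstractmethod

class LidarDriver(ABC):
    """The interface the rest of the stack depends on."""

    @abstractmethod
    def read_point_cloud(self) -> list[tuple[float, float, float]]:
        """Return one scan as (x, y, z) points in the sensor frame."""

class HypotheticalNewLidar(LidarDriver):
    # A new sensor model only needs to satisfy the interface;
    # calibration, synchronization, and perception code are unchanged.
    def read_point_cloud(self):
        return []  # a real driver would read from hardware here
```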
2. The Perception, Mapping, & Localization Group
Once information flows to the vehicle via the sensors, it is then sent to the perception, mapping, and localization group. This group helps the vehicle identify where it is and detect other obstacles in the environment. The group creates a map and sets the rules of the road.
Key perception elements within this group include 2D and 3D detection with classification, as well as existence-based 2D and 3D detection of surrounding actors, which picks up objects in the environment that the system hasn’t yet labeled.
Additionally, the 2D and 3D tracking systems match these actors across time to keep track of objects as they move through space. The fusion system then combines multiple sensor channels, both on and off the vehicle, to create a 3D understanding of the environment, while the perception diagnostic system ensures all sensor information and processing is occurring without error.
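To illustrate matching actors across time, here is a greedy nearest-neighbor tracker sketch. Production trackers use motion models and global assignment (for example, the Hungarian algorithm); this toy version only shows the association idea.

```python
import math
from itertools import count

_new_ids = count(1)

def associate(tracks, detections, max_dist=2.0):
    """Match new detections to existing tracks by proximity.

    tracks: dict of track_id -> (x, y) last known centroid
    detections: list of (x, y) centroids from the current frame
    """
    updated = {}
    unmatched = list(detections)
    for tid, pos in tracks.items():
        if not unmatched:
            break
        nearest = min(unmatched, key=lambda d: math.dist(pos, d))
        if math.dist(pos, nearest) <= max_dist:
            updated[tid] = nearest        # same actor, one frame later
            unmatched.remove(nearest)
    for det in unmatched:                 # unexplained detections
        updated[next(_new_ids)] = det     # become new tracks
    return updated
```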
A notable element in this module, particularly among the mapping and localization elements, is the semantic layer. The semantic layer uses the raw data coming in from the sensor group to help the vehicle make advanced driving maneuvers and act safely in its environment. It teaches vehicles how to understand human-created symbols and rules of the road. This may include adhering to posted speed limits or stopping at stations where, in a materials handling application, a pallet may be unloaded from the AV.
While human drivers instinctively understand these sorts of signals, the vehicles require this additional layer of information to comprehend them.
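A semantic layer can be pictured as annotations draped over the geometric map. The sketch below, with made-up zones and rules, shows how a vehicle might look up the speed limit or stopping requirement at its current location.

```python
from dataclasses import dataclass

@dataclass
class SemanticZone:
    name: str
    bounds: tuple[float, float, float, float]  # x_min, y_min, x_max, y_max
    speed_limit_mps: float
    stop_required: bool   # e.g. a pallet loading/unloading station

# Illustrative annotations layered on top of the geometric map.
ZONES = [
    SemanticZone("main_aisle", (0, 0, 50, 5), 2.0, False),
    SemanticZone("loading_station", (50, 0, 55, 5), 0.5, True),
]

def rules_at(x, y, zones=ZONES):
    """Return the driving rules that apply at a map location.

    Rectangular zones keep the sketch simple; a real semantic layer
    would use polygons and many more rule types.
    """
    for z in zones:
        x0, y0, x1, y1 = z.bounds
        if x0 <= x <= x1 and y0 <= y <= y1:
            return z
    return None
```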
3. The Path Planning Group
The path planning group comprises prediction, planning, routing, and decision elements. The vehicle uses elements from the previous group to detect objects and obstacles in the environment; this group must then predict how those objects might move and make safe driving decisions based on that information. This module is about making decisions, or, as Biao Ma, our Head of Engineering, explains: “thinking before taking action.”
The path planning group allows a vehicle to evaluate as many as 1,000 candidate paths in a single second. Ben Landen, our VP of Business Development, adds: “This group is where it comes together to create a system that can do things that humans simply can’t do with vehicles, representing a superhuman type of capability.”
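A simple way to picture candidate evaluation is to sample many paths and score each against the goal and nearby obstacles. The sketch below is a toy version; real planners sample curves and use far richer cost functions.

```python
import math

def candidate_paths(x, y, heading, n=1000, horizon=5.0):
    """Fan out n straight-line candidates from the current pose.

    Straight rays keep the illustration short; production planners
    sample smooth curves such as splines or clothoids.
    """
    paths = []
    for i in range(n):
        angle = heading + math.radians(-45 + 90 * i / (n - 1))
        end = (x + horizon * math.cos(angle), y + horizon * math.sin(angle))
        paths.append([(x, y), end])
    return paths

def best_path(paths, obstacles, goal):
    """Pick the candidate with the lowest combined cost."""
    def cost(path):
        end = path[-1]
        goal_cost = math.dist(end, goal)
        clearance = min((math.dist(end, ob) for ob in obstacles), default=10.0)
        return goal_cost + 5.0 / (clearance + 0.1)  # penalize tight clearance
    return min(paths, key=cost)
```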
The group includes the adaptive prediction framework, a flexible data pipeline that generates accurate predictions and can be adapted to any domain and the objects found there. This framework has two key elements.
The first is relevancy prediction, which helps a vehicle decipher what is relevant in its surroundings. This is important in an environment where many objects are moving. For instance, in a warehouse, an autonomous vehicle might encounter a person moving slowly and forklifts moving quickly. Here, relevancy prediction is critical in deciding that the forklifts are relevant in this scenario, while the person presents a lower risk.
The second is trajectory prediction. Once the vehicle knows whether an object is relevant, it must estimate that actor’s expected trajectory and whether it will affect the vehicle’s own path. Where will the object be in the very near future?
In the previous example, trajectory prediction can determine that the forklift is moving directly toward the vehicle and about to hit it, while the person is far enough away to have no effect on the vehicle’s path.
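The forklift-versus-person example can be sketched as a constant-velocity trajectory predictor feeding a relevancy check. This is a toy model with made-up numbers, not Cyngn's adaptive prediction framework.

```python
import math

def predict_positions(pos, vel, horizon=3.0, dt=0.5):
    """Constant-velocity extrapolation: where will the actor be soon?"""
    steps = int(horizon / dt)
    return [(pos[0] + vel[0] * t * dt, pos[1] + vel[1] * t * dt)
            for t in range(1, steps + 1)]

def is_relevant(actor_pos, actor_vel, ego_path, threshold=1.5):
    """An actor is relevant if its predicted path nears the ego path."""
    for future in predict_positions(actor_pos, actor_vel):
        if any(math.dist(future, p) < threshold for p in ego_path):
            return True
    return False

ego_path = [(float(x), 0.0) for x in range(10)]
# A fast forklift cutting toward the path is relevant; a slow,
# distant pedestrian is not.
print(is_relevant((5.0, 6.0), (0.0, -2.5), ego_path))   # True
print(is_relevant((20.0, 15.0), (0.2, 0.0), ego_path))  # False
```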
This decision-making and path planning process is the culmination of being able to predict the movements of hundreds of objects per second with great granularity.
4. The Control Group
Now that the vehicle has a plan, the control group elements control the vehicle in a smooth, flexible, and agile way.
In short, this group executes the plan. The elements in the control group simplify how we send control signals to a vehicle and keep the system adaptive, letting us easily identify which vehicle we’re plugging into. In turn, we can bring the technology to any use case, putting Cyngn in a position to address any customer need and expand across their fleets.
Consider a warehouse stock chaser that is moving from point A, around a corner, to point B. Here, the path-following control system issues the commands that keep the vehicle on the desired, planned path. The adaptive steering control system then instructs the vehicle to steer to the right, toward point B. As the vehicle approaches its final destination, the speed control system generates the command to decelerate. And if a person were to unexpectedly step into the path, this system can also slow the vehicle down so that it stays safely on its intended path.
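For a flavor of how these controllers fit together, here is a sketch of pure-pursuit-style steering toward a point on the planned path, plus a proportional speed command that caps speed near obstacles. All parameters are illustrative.

```python
import math

def steering_command(pose, lookahead_point, wheelbase=1.2):
    """Steer toward a lookahead point on the planned path (pure pursuit)."""
    x, y, heading = pose
    dx = lookahead_point[0] - x
    dy = lookahead_point[1] - y
    alpha = math.atan2(dy, dx) - heading     # bearing in the vehicle frame
    ld = math.hypot(dx, dy)                  # distance to the lookahead point
    return math.atan2(2.0 * wheelbase * math.sin(alpha), ld)

def speed_command(current_speed, target_speed, obstacle_dist, kp=0.8):
    """Proportional speed control, capped so the vehicle can stop in time.

    If a person steps into the path, obstacle_dist shrinks and the
    commanded speed drops toward zero while steering stays on the path.
    """
    safe_speed = min(target_speed, max(0.0, obstacle_dist - 1.0))
    return current_speed + kp * (safe_speed - current_speed)
```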
These elements adapt to the plan coming from upstream and account for platforms with different configurations. As Ma explains, “having the adaptive steering and adaptive speed control here allows for smooth and flexible execution.”
This module also contains critical elements like the drive-by-wire system, the abstraction layer that translates electrical instructions into physical movements. At Cyngn, we have integrated this layer into our system and brought it up on a variety of driving platforms.
The second is automatic system ID, which automatically identifies and extracts a vehicle’s system identification from the control perspective, covering general attributes like position and speed response. This automation improves efficiency and further reduces the time required to deploy a novel vehicle.
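The abstraction idea behind drive-by-wire can be sketched as an adapter interface: one generic command API, with a platform-specific adapter per vehicle. The stock-chaser adapter below is hypothetical.

```python
from abc import ABC, abstractmethod

class DriveByWireAdapter(ABC):
    """One command interface, many vehicle platforms."""

    @abstractmethod
    def apply(self, steering_rad: float, speed_mps: float) -> None:
        """Translate a generic command into platform-specific actuation."""

class HypotheticalStockChaserAdapter(DriveByWireAdapter):
    MAX_STEER = 0.6  # radians; each platform has its own actuator limits

    def apply(self, steering_rad, speed_mps):
        steer = max(-self.MAX_STEER, min(self.MAX_STEER, steering_rad))
        self._send_actuator_command(steer, speed_mps)

    def _send_actuator_command(self, steer, speed):
        # Placeholder for real bus I/O (e.g. CAN frames).
        print(f"actuate: steer={steer:.2f} rad, speed={speed:.2f} m/s")
```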
5. The Fleet Management Group
The fleet management group consists of vehicle management and analytical elements that enable users to get visibility into their autonomous vehicle investments and manage their fleet.
This group essentially contains the “customer-facing” elements and includes apps that have been built on top of the AV technology. These apps ensure that the vehicle is running exactly as expected, while also connecting to the customer via teleoperations and an HMI system.
The HMI visualization element is the human-machine interface that visualizes what the vehicle is thinking and trying to do. This is important because it is how humans can actually interact with the technology and visually comprehend the vehicle’s “mind,” while the teleoperation systems further enable remote control capabilities. Together, these elements build out functionality for both the customer and the autonomous vehicle operator.
This group further encompasses analytical elements, including a runtime understanding of the fleet. The runtime diagnostic central system aggregates, stores, and analyzes subsystem diagnostics and, when necessary, initiates emergency procedures by triggering the fail-safe response system. In addition, the analytics system surfaces data intelligence and insights about the fleet for vehicle operators. Each of these elements revolves around what the system can communicate, analyze, and monitor, and how it can respond with a fail-safe.
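Diagnostic aggregation with a fail-safe trigger can be sketched as a worst-status roll-up across subsystems. The subsystem names and responses below are illustrative.

```python
from enum import Enum

class Health(Enum):
    OK = 0
    DEGRADED = 1
    FAULT = 2

def aggregate(reports):
    """Roll per-subsystem health into one status: the worst report wins.

    reports maps names like "perception" or "control" to Health values.
    """
    return max(reports.values(), key=lambda h: h.value)

def tick(reports, trigger_failsafe):
    status = aggregate(reports)
    if status is Health.FAULT:
        trigger_failsafe()  # e.g. command a controlled stop
    return status

# A faulted perception stack forces the fail-safe response.
tick({"perception": Health.FAULT, "control": Health.OK},
     lambda: print("fail-safe: controlled stop"))
```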
“There are layers of complexity to this technology that are not necessarily valuable for customers to interact with,” says Landen. By contrast, these elements provide a simple-to-use system with interfaces that let customers see the important aspects of their business, such as workflows and business insights, and interact with the vehicles without having to worry about the complex stack of LiDARs underneath.
6. The Safety and Simulation Group
The last group in the periodic table is the safety and simulation group. This group ensures that we have a redundant system in place if any unexpected behavior occurs.
First, the safety redundancy elements ensure that the system is keeping everyone safe at all times, no matter what. We have dedicated hardware and software for the system, a virtual bumper, a security system, and a black box data recorder.
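A virtual bumper can be pictured as a speed-dependent safety envelope in front of the vehicle: anything inside it commands a stop. The geometry below is a toy example.

```python
def virtual_bumper(obstacles, speed_mps, reaction_time=0.5, margin=0.5):
    """Stop if any obstacle is inside the speed-dependent envelope.

    The envelope grows with speed so the vehicle can always stop in
    time. obstacles are (x, y) points in the vehicle frame, x forward.
    """
    envelope = margin + speed_mps * reaction_time
    for ox, oy in obstacles:
        if 0.0 <= ox <= envelope and abs(oy) <= 1.0:  # 2 m wide corridor
            return True   # command an immediate stop
    return False

# At 2 m/s the envelope reaches 1.5 m ahead, so this obstacle triggers.
print(virtual_bumper([(1.2, 0.3)], speed_mps=2.0))  # True
```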
Additionally, there are simulation elements in this group that make it easier for us to rapidly model and deploy autonomous solutions, and that give the vehicles the data and training needed to perform reliably. These include the data annotation pipeline and physics engine-based simulation, paired with a log-based simulation system that generates scenarios for the simulation to run.
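Scenario generation can be sketched as sampling randomized variations of actors and conditions for the simulator to run. The fields below are hypothetical; a real pipeline would also replay recorded logs and vary maps, lighting, and sensor noise.

```python
import random

def generate_scenarios(n, seed=42):
    """Sample n randomized warehouse scenarios for simulation."""
    rng = random.Random(seed)  # seeded for reproducible test suites
    return [{
        "id": i,
        "actor": rng.choice(["pedestrian", "forklift", "pallet"]),
        "position": (rng.uniform(0, 50), rng.uniform(-5, 5)),
        "speed_mps": rng.uniform(0.0, 3.0),
    } for i in range(n)]
```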
This group particularly encompasses the third pillar of Cyngn’s EAS system, Cyngn Evolve. This lets us, as Landen puts it, always “stay on the leading edge of pushing the technology forward, collecting new data, training on new models, simulating numerous scenarios, etc." It is crucial to simulate vehicles in a digital environment before transitioning them into the real world.
What’s Next for The Future of AV Technology?
The six groups that contain the 80 elements of Cyngn’s Periodic Table of Autonomous Vehicle Technology come together to solve complex, real-world problems. Across the entire chart, these systems and elements are tightly interconnected, uniting to create a valuable autonomous driving system.
This allows Cyngn to bring self-driving capabilities to various industrial domains and continue to drive the technology forward to bring new improvements, updates, and features.
Autonomous driving opens up countless opportunities for humans. AVs can keep people out of dangerous environments and as they become more sophisticated, they will be able to do more and more of the tasks that human drivers do every day.
Without autonomous vehicles, you can’t begin to build killer apps on top of them. Landen thinks of killer apps as solutions, in effect, to things we are currently doing inefficiently or unsafely. There are layers of greatly beneficial applications that can be built on top of each other once you have the capability to move autonomously.
For instance, consider workers who pack thousands of pounds of cargo in a warehouse. Today, those workers are pulled away from packing in order to also help move that cargo. By implementing autonomous vehicles, people can focus fully on packing while the vehicles move the cargo. Eventually, we can expect the vehicles themselves to scan barcodes along the way.
Moreover, if a warehouse closes its facility at night, these mobile vehicles can be retooled with a simple app that enables them to patrol the grounds as security guards. Applications layered like this are just beginning to scratch the surface of the value-generating capabilities autonomous vehicles can provide.