Tesla’s use of artificial intelligence in its electric vehicles is about to improve with a new software update, according to company CEO Elon Musk.
While Tesla is recognized worldwide for its investment in artificial intelligence for autonomous driving, the automaker has also applied its expertise in machine learning and related fields to other features of its vehicles.
Unlike other manufacturers, which use a dedicated rain and snow sensor to measure precipitation intensity and trigger the automatic windshield wipers, Tesla chose to put its computer vision system to the test, using its cameras to automate wiper operation.
The automaker has since created a new “Deep Rain” neural network to better handle the task.
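For intuition, a camera-based wiper controller can be thought of as two steps: classify each frame into a rain-intensity class, then map that class to a wiper speed. Tesla has not published Deep Rain’s internals, so the sketch below is purely illustrative; every name (`RAIN_CLASSES`, `predict_rain_intensity`, `select_wiper_speed`) is hypothetical, and a simple threshold stands in for the actual neural network.

```python
# Illustrative sketch only: Tesla has not disclosed Deep Rain's architecture.
# All names here are hypothetical; a threshold rule stands in for a real
# camera-frame classifier.

RAIN_CLASSES = ["none", "light", "moderate", "heavy"]

def predict_rain_intensity(frame_statistic: float) -> str:
    """Stand-in for neural-network inference: a real system would run a
    CNN over a camera frame; here a single scalar image statistic
    (assumed to be in [0, 1]) is thresholded into a rain class."""
    if frame_statistic < 0.1:
        return "none"
    if frame_statistic < 0.3:
        return "light"
    if frame_statistic < 0.6:
        return "moderate"
    return "heavy"

def select_wiper_speed(rain_class: str) -> int:
    """Map the predicted class to a wiper speed level (0 = off)."""
    return RAIN_CLASSES.index(rain_class)

print(select_wiper_speed(predict_rain_intensity(0.45)))  # moderate rain -> 2
```

The point of the vision-based approach is that no extra hardware is needed: the same cameras that feed the driving stack also feed the wiper logic.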
Now Musk says the new Full Self-Driving Beta update pushes Tesla’s computer vision capabilities further.
In recent months, the CEO has signaled that Tesla will move to a purely vision-based system, no longer relying even on the vehicle’s onboard radar.
“When radar and vision disagree, which one do you believe? Vision has much more precision, so better to double down on vision than do sensor fusion,” Musk explained.
“Sensors are a bitstream and cameras have several orders of magnitude more bits per second than radar (or lidar). Radar must meaningfully increase signal/noise of bitstream to be worth the complexity of integrating it. As vision processing gets better, it just leaves radar far behind,” he added.
This update is expected to roll out to drivers in Tesla’s Full Self-Driving Beta program.
Although the system automates many driving-related tasks, the driver remains responsible for the vehicle and must stay attentive, ready to take control at all times.
Written by | Gabriel Sayago