The first Autonomous Tech Conference took place on Oct 21 – Nov 1, 2018 at Tel Aviv Convention Center.
Alex Shulman spoke at the conference on “Future sensors for autonomous driving: robust detection of any obstacle under any weather and lighting conditions”.
Abstract:
Self-driving cars are reliant on their sensors to see the world around them.
One of the biggest challenges standing in the way of autonomous vehicles is the ability to drive under any weather and lighting conditions.
Poor weather conditions account for 22 percent of automotive accidents in the United States. Weather conditions such as fog, rain, and snow can lead to fatal accidents if the driver is unable to detect obstacles and respond appropriately. Autonomous vehicles are designed to provide the highest levels of driving safety and must cope with any weather condition to drive safely.
Until now, most autonomous vehicle testing has taken place in locations with good weather conditions and has focused on the fundamentals of self-driving. Recently, however, more and more manufacturers have started testing their self-driving cars in adverse weather, as they understand that autonomous vehicles need to drive safely in heavy rain at night, dense fog, or heavy snowfall. Such testing reveals the strengths and weaknesses of different sensor systems.
It is commonly understood that autonomous driving will require data fusion from multiple sensors, both for redundancy and for increased sensing robustness. For example, dense fog or heavy snowfall creates significant challenges for visible-light cameras and for LiDARs. Radar, on the other hand, copes better with these conditions, but it lacks the resolution required for safe autonomous driving.
Foresight has developed a unique multispectral vision sensor. It is based on seamless fusion of four cameras – two sets of stereoscopic long-wave infrared (LWIR) and visible-light cameras – enabling highly accurate and reliable obstacle detection. Simultaneous information from both visible-light and thermal (far-infrared) stereo cameras dramatically improves detection of pedestrians, vehicles, and other stationary and moving objects under severe weather and lighting conditions.
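The abstract does not disclose Foresight’s actual fusion algorithm, but the general idea of combining detections from a visible-light channel and a thermal channel can be illustrated with a simple late-fusion sketch. Everything below is an assumption for illustration only: the function names, the detection format (box plus confidence), and the IoU-matching rule are hypothetical, not Foresight’s method.

```python
def iou(a, b):
    # Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def fuse_detections(visible, thermal, iou_threshold=0.5):
    """Late fusion of two detection lists [(box, confidence), ...].

    Detections from the two modalities that overlap (IoU above the
    threshold) are treated as the same object and keep the higher
    confidence; unmatched thermal detections are added as-is, so an
    obstacle seen by only one camera pair is still reported.
    """
    fused = list(visible)
    for t_box, t_conf in thermal:
        for i, (v_box, v_conf) in enumerate(fused):
            if iou(t_box, v_box) >= iou_threshold:
                fused[i] = (v_box, max(v_conf, t_conf))
                break
        else:
            fused.append((t_box, t_conf))
    return fused
```

For example, a pedestrian detected weakly by the visible camera at night but strongly by the thermal camera would survive with the thermal confidence, which is the redundancy argument made above.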
The goal of this presentation is to assess the accuracy of different sensors in severe weather conditions. Specifically, detection accuracy and detection range are compared using sensor data recorded by day and night, in simulated fog and heavy rain, for different types of obstacles (vehicles, pedestrians, and other static and moving objects). Detection accuracy and range for the Foresight multispectral vision sensor are assessed using (a) stereo visible-light images; (b) stereo far-infrared images; and (c) a combination of visible-light and far-infrared stereo images.
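A comparison like the one described above boils down to tabulating, for each sensor configuration and each weather/lighting condition, the fraction of ground-truth obstacles that were detected. As a rough sketch (the record format and function name are assumptions, not the presentation’s methodology):

```python
from collections import defaultdict

def detection_rates(records):
    """Per-(sensor, condition) detection rate.

    records: iterable of (sensor, condition, detected) tuples, one per
    ground-truth obstacle, where detected is a bool.
    Returns {(sensor, condition): fraction of obstacles detected}.
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for sensor, condition, detected in records:
        totals[(sensor, condition)] += 1
        hits[(sensor, condition)] += int(detected)
    return {key: hits[key] / totals[key] for key in totals}
```

The same grouping applied to the range at which each obstacle was first detected would give the detection-range comparison.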
Bio:
Alex Shulman serves as Director of Products at Foresight, where he oversees all aspects of product management and product marketing.
Mr. Shulman has deep experience in multiple high-tech product categories – opto-electronics, mechanics, and software and algorithms – gained at several successful startups that grew into leading Israeli high-tech companies.
He has led several successful new product launches, resulting in significant market-share gains in highly competitive markets worldwide. Mr. Shulman holds a degree in engineering and an MBA.
For Alex Shulman’s presentation click here