I don’t think I’m being dramatic when I say that self-driving cars will be one of the most impactful technologies of the last 50 years. The capability will likely save millions of lives, free up millions of driving hours, and, hopefully, clear up some roadway congestion. Sure, there have been fits and starts in getting there, and we’re not there yet, but that’s to be expected with anything new and meaningful.
When you think of companies involved in self-driving cars, you might first think of Tesla, NVIDIA, and maybe even Intel’s Mobileye division. Another company worth a look in this space is ON Semiconductor Corp., which I just added to my list of players. I have been researching the company recently and wanted to share some of my findings.
LiDAR, radar, and cameras work together for a safer experience. ON SEMI
Key SD sensors are camera, LiDAR, and radar
There are many ways to segment SD (self-driving) electronics, but, simplistically, there are different kinds of sensors (LiDAR, radar, and cameras) that feed a much larger processor, or set of processors, which determines what is happening outside and even inside the car and then decides what the car should do. It does this in milliseconds. There are a few approaches to where the data processing is done: some of it happens very close to the sensors, and some in the more centralized processors.
LiDAR, radar, and cameras work as a team in an ADAS system. LiDAR detects medium-distance objects, typically up to 150 meters, with resolution in the low tens of centimeters; it provides a high-resolution map of the objects surrounding the vehicle to help make driving decisions. Radar is typically used for longer distances and precise velocity measurements and is generally immune to weather conditions, albeit with lower resolution than LiDAR; it powers applications like adaptive cruise control in cars today. Cameras work at medium and short distances, are very fine-grained, and can operate in day, dusk, and at night with lights, but they require a lot of processing.
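The division of labor above can be sketched as a toy range-based selection rule. This is purely illustrative and not any vendor’s actual fusion logic; the distance thresholds below are assumptions loosely based on the figures mentioned above (LiDAR to roughly 150 meters, radar for longer distances and bad weather, cameras for fine-grained short-range detail).

```python
# Toy sketch of complementary ADAS sensor coverage. All thresholds are
# illustrative assumptions, not real product specifications.

def preferred_sensor(distance_m: float, heavy_weather: bool) -> str:
    """Pick the sensor best suited to an object at a given distance."""
    if heavy_weather:
        return "radar"   # radar is largely immune to weather conditions
    if distance_m > 150:
        return "radar"   # beyond typical LiDAR reach; radar handles long range
    if distance_m > 50:
        return "lidar"   # medium range, resolution in the low tens of cm
    return "camera"      # short range, very fine-grained detail

print(preferred_sensor(200, False))  # radar
print(preferred_sensor(100, False))  # lidar
print(preferred_sensor(20, False))   # camera
```

In a real system, of course, all three sensors run simultaneously and their outputs are fused; the point here is only that their strengths cover different distance and weather regimes.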
ON Semiconductor plays in the SD sensor space with LiDAR, radar, and cameras, and is, in fact, the only manufacturer that makes all three. In cameras, ON Semiconductor competes with Sony and OmniVision; in radar, with Infineon, ST Micro, Texas Instruments, and NXP Semiconductors; and in LiDAR, it’s a wide-open competitive field.
In cameras, ON Semiconductor distinguishes itself as the only manufacturer with a range of CMOS sensors scaling from 1.2MP and 1.7MP all the way up to 8.3MP. Cars would use the 8.3MP sensor for full SD and ADAS, and the 1.7MP and 1.2MP sensors for ADAS, surround view, and backup functions. Having all three camera tiers from one vendor, I believe, could simplify car designs, lower development costs, and simplify maintenance, as customers are tuning for only one design. It’s a good value proposition and unique right now.
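As a toy illustration of that tiering, here is a hypothetical lookup mapping the megapixel tiers to the applications described above. The tier-to-application mapping mirrors the text; the structure and names are made up for the sketch.

```python
# Hypothetical lookup of camera sensor tiers (in megapixels) to the vehicle
# applications described in the text. Illustrative only.
CAMERA_TIERS = {
    8.3: ["full self-driving", "ADAS"],
    1.7: ["ADAS", "surround view", "backup"],
    1.2: ["ADAS", "surround view", "backup"],
}

def tiers_for(application: str) -> list:
    """Return the megapixel tiers suitable for a given application."""
    return sorted(mp for mp, apps in CAMERA_TIERS.items() if application in apps)

print(tiers_for("backup"))  # [1.2, 1.7]
```

The design point the article makes is that one vendor covering every tier lets a carmaker tune a single sensor family across all of these functions.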
Camera sensor. ON SEMI
ON Semiconductor, like many semiconductor companies, builds reference design platforms to help speed its customers’ time to market. In automotive image sensors, its platform is called “MARS,” short for Modular Automotive Reference System. MARS comes in many flavors: designers can select from six different sensor boards, six different co-processor options, two different serializers and deserializers, and Ethernet. Not only do reference designs speed up time to market, but they also create a barrier to entry for smaller entrants.
In May, ON Semiconductor acquired SensL Technologies, which designs sensing chips used in medical imaging devices and airport hazard detectors, as well as LiDAR chips for many kinds of applications, including self-driving. ON Semiconductor has been mum on exactly how it will integrate SensL, but SensL already has working products that don’t compete with cameras or radar, so integration should be straightforward.
In 2017, ON Semiconductor acquired an experienced RF team from IBM and founded the ON Semiconductor Radar Design Center in Haifa, Israel. The center immediately began developing a comprehensive radar sensor roadmap targeting both best-in-class long-range radar for high levels of autonomy and compact, power-efficient short- and mid-range intelligent radar for ‘safety belt’ coverage, enabling both ADAS and cost-efficient SD applications.
ON Semi’s self-driving ecosystems
As we live in a world of ecosystems, any technology’s degree of success is somewhat determined by the ecosystems it creates or operates in. Which self-driving ecosystems does ON Semiconductor play in? This one is straightforward: ON Semiconductor works with the biggest players in the space.
One of the bigger platforms ON Semiconductor isn’t touting publicly is Google’s Waymo. That doesn’t necessarily mean ON Semiconductor isn’t working with Waymo; Google has been very mum on any components inside except NVIDIA’s. Needless to say, ON Semiconductor’s SD silicon is inside the biggest ecosystems.
You may be as surprised as I was the first time I heard the ON Semiconductor story on SD vehicles.
After the SensL acquisition, the company can be a one-stop sensor shop for nearly every SD application; whenever you think of NVIDIA, Intel Mobileye, or Baidu Apollo, ON Semiconductor is likely in the mix. And while I have only been talking about SD in this article, these same sensors are also used in assistive-driving applications found in regular cars today, for braking, turning, lane-change warning, and parking systems.
In addition to self-driving technologies, ON Semiconductor is also a powerhouse in power applications for fully electric and hybrid vehicles, where every volt and watt matters. If there’s interest, I’ll cover the company’s power story; let me know on Twitter.