Here’s a brief look into the last 70 years of development within the realm of automotive systems and some thoughts on where it’s headed.
A look into the past
The first electronics to make their way into cars were the simplest kind: lamps, horns, and radios, to name a few. In the 1950s, companies began to develop semiconductor elements for cars, which later became application-specific integrated circuits (ASICs). But the real push came when Electronic Fuel Injection (EFI) hit the market.
A major trigger for the adoption of electronics was the introduction of emission regulations. The more mechanical and “slower” systems for managing air and fuel in a combustion engine were simply not going to cut it. This meant we needed not only electronic control systems, but also electronic sensors and actuators.
Microcontrollers were soon used as the brain (also called the Electronic Control Unit, or ECU) for these control tasks, and they became more common in the following years. Manufacturers began to use them for all kinds of operations, including comfort and HVAC systems. At first, these systems were standalone devices. With the introduction of communication buses like CAN, ECUs began to talk to each other. The low-voltage system of vehicles grew more complex and distributed, and the typical vehicle came to host a comfortable average of around 60 ECUs, including a central gateway unit.
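To make the bus idea concrete, here is a minimal sketch (purely illustrative, not any vendor's stack) of a classic CAN 2.0A data frame: an 11-bit arbitration ID plus up to 8 payload bytes, where the lowest ID wins bus arbitration. The frame IDs and payloads below are made up for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CanFrame:
    """Minimal model of a classic CAN 2.0A data frame (illustrative only)."""
    arbitration_id: int   # 11-bit identifier; lower value = higher priority
    data: bytes           # 0..8 payload bytes

    def __post_init__(self):
        if not 0 <= self.arbitration_id <= 0x7FF:
            raise ValueError("standard CAN IDs are 11 bits (0..0x7FF)")
        if len(self.data) > 8:
            raise ValueError("classic CAN payload is at most 8 bytes")

def arbitrate(frames):
    """Return the frame that wins bus arbitration (lowest ID transmits first)."""
    return min(frames, key=lambda f: f.arbitration_id)

# Example: a (hypothetical) brake-pressure frame outranks a window-lift frame,
# so the safety-relevant message gets the bus first.
brake = CanFrame(0x100, bytes([0x3E, 0x80]))
window = CanFrame(0x410, bytes([0x01]))
print(hex(arbitrate([window, brake]).arbitration_id))  # → 0x100
```

This priority-by-identifier scheme is one reason CAN worked so well for distributed control: critical ECUs simply claim lower IDs and never wait behind comfort traffic.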
This led to two major trends. First, the need for a more homogeneous platform to develop, run, and test software on all of these ECUs became apparent. In the early 2000s, OEMs (read: vehicle manufacturers) came together to form a consortium to develop a standardized SW platform and E/E (Electrical and Electronics) architecture that is now famously known as AUTOSAR. Second, OEMs began consolidating the large number of ECUs spread out across a vehicle. One key enabler around this time was that microcontrollers became more powerful: you could do a lot more with a single ECU. The limiting factors now were the pins available on a single ECU, physical space, and managing the wiring harness. The “pin problem” was solved by I/O devices, which essentially acted as extensions to ECUs, providing more input-output interfaces and allowing them to be spread out more physically.
What eventually followed was a domain-based approach to electrical architecture in the automobile industry. Each domain has its own domain controller with a custom hardware package and standardized interfaces connected to a central gateway. A domain controller then has multiple “simple” ECUs under it to handle the required functionality. Some common domains in the industry are body, chassis, and infotainment.
Computers today are even more powerful, so that trend has kept up pretty well with Moore’s law. But another trend in today’s world is how connected we are all becoming, thanks to the internet and the smartphone. Moreover, all kinds of electronic devices now come in a “smart” version hooked up to the internet, from home security systems to smart coffee mugs.
Electronics in vehicles up until the mid-2000s were optimized to run high-performance tasks: tasks like injecting fuel into the cylinder of an engine 60 times per second, with millisecond precision. Connectivity and web services are not exactly on the same playing field as low-level PWM tasks running on a microcontroller.
The need shifted from low-level tasks like controlling voltage at a high rate to higher-level tasks coupled with managing large amounts of data. The time frame changed from milliseconds to seconds, while bandwidth and computing power increased considerably.
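To put the “60 times per second” figure in perspective, a small back-of-the-envelope sketch (illustrative numbers, not from any real ECU) shows how engine speed translates into the time budget a conventional ECU has per injection event:

```python
def injection_events_per_second(rpm: int, cylinders: int) -> float:
    """Fuel injection events per second for a four-stroke engine.

    Each cylinder fires once every two crankshaft revolutions,
    so events/s = (rpm / 60) * cylinders / 2.
    """
    return (rpm / 60) * cylinders / 2

def deadline_ms(rpm: int, cylinders: int) -> float:
    """Time budget between consecutive injection events, in milliseconds."""
    return 1000 / injection_events_per_second(rpm, cylinders)

# A 4-cylinder engine at 1800 rpm already produces 60 injection events
# per second, leaving the ECU roughly 16.7 ms between events -- and at
# higher revs the budget shrinks further.
print(injection_events_per_second(1800, 4))  # → 60.0
print(round(deadline_ms(1800, 4), 1))        # → 16.7
```

The point of the arithmetic is the contrast: these hard, repeating millisecond deadlines are a very different workload from the bursty, bandwidth-heavy jobs that connectivity brings in.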
Today, we have a hybrid model with two kinds of ECUs in a typical vehicle. The first type is the conventional ECU, which handles the high-speed tasks critical to keeping the vehicle running. The second kind are literal computers: machines capable of handling high-bandwidth tasks like streaming a movie, gameplay, and pretty much anything a “normal” computer can do. The topology, however, is more centralized than before, because computers are more powerful now and, more importantly, virtualization software has improved greatly over the years, allowing different domains to run on the same hardware. This enables a level of SW integration that was simply not possible before.
Where are we headed?
The state of the art today is the hybrid model that encompasses both a conventional and a high-performance side. One question is whether this is it. There is no doubt that software will eventually outgrow it, but I think this model is here to stay for a while. Technological paradigms that cater to both the high-performance, low-level world and the high-computing world have stood the test of time before. Take the Raspberry Pi, one of the most ubiquitous pieces of DIY hardware: it is equipped with both old-school pins for various digital and analog interfaces and high-bandwidth interfaces like WiFi, Ethernet, and USB. The two go hand in hand, and each has its own use cases that are not going anywhere. Better hardware architectures are definitely something to look forward to, though. There is real potential in the different ways we could bring these two worlds together; I believe we still have a suboptimal way of doing this today. But that’s for another article.
But once we have solved that, we will have truly commoditized hardware in automobiles, and OEMs could actually become software companies. Instead of having individual physical domain controllers, they could be virtualized within the same hardware. If executed correctly, this approach has many benefits: reuse of HW, reduced testing and validation effort, and streamlined SW development processes. It becomes easier to organize around the value streams within an organization rather than traditional structures like projects or domains. Another interesting question, as hardware gets commoditized more and more, is whether there are any benefits for OEMs in building their own hardware and software.
Taking a cue from the smartphone world (read: Apple), there could be hidden benefits in making both the hardware and the software. But unlike the smartphone world, creating an ecosystem of devices might be harder for automobile companies. If you squint a bit, the overall features and experiences among different vehicles are quite similar at a given price point, and tailoring better user experiences for the sake of differentiating a brand seems a bit of a stretch. An efficient development process, though, seems like something OEMs would embrace more readily. The platform approach is already common in big automotive conglomerates, where designs and know-how are shared across brands.
The industry is very dependent on providers of HW and OSs and on their development plans, which are more or less tied to downstream vendors. In a way, the trends in the industry are pre-determined by the Tier 2/Tier 3 vendors, unless you are a large enough OEM to move the needle for your suppliers.
Another key parallel with other industries is that, as with any development, the more optimized the systems become, the more value is passed on to customers. We are now looking at a possibility where a vehicle is either a subscription you hold, like Netflix or Facebook, or a piece of tech that gains value over the course of its life through software updates and add-ons that can be purchased.