When good microwaves go bad: Why embedded systems security is the next boss battle for developers

Published Aug 30, 2021
by Matias Madou, Ph.D.
Case Study


There are plenty of pop culture references to rogue AI, robots, and appliances turning on their human masters. The trope is steeped in science fiction fun and fantasy, but with IoT and connected devices becoming more prevalent in our homes, the conversation around cybersecurity and safety should grow with them. Software is all around us, and it’s very easy to forget just how much we rely on lines of code to do all those clever things that provide so much innovation and convenience. Much like web-based software, APIs, and mobile devices, vulnerable code in embedded systems can be exploited if an attacker discovers it in the wild.

While it’s unlikely that an army of microwaves is coming to enslave the human race (although, the Tesla bot is a bit concerning) as the result of a cyberattack, malicious cyber events are still possible. Some of our cars, planes, and medical devices also rely on intricate embedded systems code to perform key tasks, and the prospect of these objects being compromised is not only alarming, but potentially life-threatening.

Much like every other type of software out there, developers are among the first to touch the code, right at the beginning of the creation phase. And much like every other type of software, this can be the breeding ground for insidious, common vulnerabilities that could go undetected before the product goes live. 

Developers are not security experts, nor should any company expect them to play that role, but they can be equipped with a far stronger arsenal to tackle the kinds of threats that are relevant to them. Embedded systems - typically written in C and C++ - will see more frequent use as our tech needs continue to evolve, and specialized security training for the developers working in this environment is essential.

Exploding air fryers, rogue vehicles… Are we sitting ducks?

While there are some standards and regulations around secure development to keep us safe, we need to make far more precise, meaningful strides towards security in all types of software. It might seem far-fetched to imagine a problem caused by someone hacking into an air fryer, but it has happened in the form of a remote code execution attack (allowing the threat actor to raise the temperature to dangerous levels), as have vulnerabilities leading to vehicle takeovers.

Vehicles, in particular, are especially complex, with multiple embedded systems onboard, each taking care of micro functions: everything from automatic wipers to engine and braking capabilities. Intertwined with an ever-increasing stack of communication technologies like Wi-Fi, Bluetooth, and GPS, the connected vehicle represents a complex digital infrastructure that is exposed to multiple attack vectors. And with 76.3 million connected vehicles expected to hit roads globally by 2023, that represents a monolith of defensive foundations to lay for true safety.

MISRA is a key organization in the fight against embedded systems threats, having developed guidelines to facilitate code safety, security, portability, and reliability in the context of embedded systems. These guidelines are a north star for the standards every company should strive for in its embedded systems projects.

However, to create and execute code that adheres to this gold standard takes embedded systems engineers who are confident - not to mention security-aware - on the tools. 

Why is embedded systems security upskilling so specific?

The C and C++ programming languages are geriatric by today’s standards, yet remain widely used. They form the functioning core of the embedded systems codebase, and Embedded C/C++ enjoys a shiny, modern life as part of the connected device world.

Despite these languages having rather ancient roots - and displaying similar vulnerability behaviors in terms of common problems like injection flaws and buffer overflow - for developers to truly succeed at mitigating security bugs in embedded systems, they must get hands-on with code that mimics the environments they work in. Generic training in general C security practices simply won’t be as potent or memorable as training that spends extra time and care in an Embedded C context.

With anywhere from a dozen to over one hundred embedded systems in a modern vehicle, it’s imperative that developers are given precision training on what to look for, and how to fix it, right in the IDE. 

What does a business logic flaw look like in embedded C/C++? Take a look and see if you can identify and fix it like a pro.

Protecting embedded systems from the ground floor is everyone’s responsibility

The status quo in many organizations is that speed of development trumps security, at least when it comes to developer responsibility. Developers are rarely assessed on their ability to produce secure code; rapid development of awesome features is the gold standard. The demand for software is only going to increase, but this is a culture that has set us up for a losing battle against vulnerabilities, and the subsequent cyberattacks they allow.

If developers are not trained, that’s not their fault, and it’s a hole that someone in the AppSec team needs to help fill by recommending the right, accessible (not to mention assessable) programs of upskilling for the entire development community. Right at the beginning of a software development project, security needs to be a top consideration, with everyone - especially developers - given what they need to play their part.

Getting hands-on with embedded systems security problems

Buffer overflow, injection flaws, and business logic bugs are all common pitfalls in embedded systems development. When buried deep in a labyrinth of microcontrollers in a single vehicle or device, these flaws can spell disaster from a security perspective.

Buffer overflow is especially prevalent, and if you want to take a deep dive into how it helped compromise that air fryer we talked about before (allowing remote code execution), check out this report on CVE-2020-28592.

Now, it’s time to get hands-on with a buffer overflow vulnerability, in real embedded C/C++ code. Play this challenge to see if you can locate, identify, and fix the poor coding patterns that lead to this insidious bug:

Make buffer overflow history.



How did you do? Visit www.securecodewarrior.com for precise, effective training on embedded systems security.


Author

Matias Madou, Ph.D.

Matias is a researcher and developer with more than 15 years of hands-on software security experience. He has developed solutions for companies such as Fortify Software and his own company, Sensei Security. Over his career, Matias has led multiple application security research projects which have led to commercial products, and he holds over 10 patents. When he is away from his desk, Matias has served as an instructor for advanced application security training courses and regularly speaks at global conferences including RSA Conference, Black Hat, DefCon, BSIMM, OWASP AppSec, and BruCon.

Matias holds a Ph.D. in Computer Engineering from Ghent University, where he studied application security through program obfuscation to hide the inner workings of an application.

Want more?

Dive into our latest secure coding insights on the blog.

Our extensive resource library aims to empower the human approach to secure coding upskilling.

View Blog

