The IoT is changing embedded software development

BOB ZEIDMAN | March 1, 2016

Embedded software engineers have always been a rarefied group. While application programmers were moving to high-level object-oriented languages like C++ and Java, and to graphical development environments like MATLAB, embedded programmers were only reluctantly moving from assembly language to C. There have always been far fewer embedded programmers than application programmers, especially since smartphones made it possible for clever hackers and hobbyists to develop an app and upload it to the cloud. Unlike app developers, embedded programmers need a deep understanding of the hardware platform on which their code runs.
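To make that distinction concrete, here is a minimal, hypothetical sketch of the kind of hardware-level C that embedded work often involves: toggling an LED by writing directly to a memory-mapped GPIO register. The base address, register offset, and pin number below are invented for illustration and differ from one microcontroller to the next.

/* Hypothetical memory-mapped GPIO access; address, offset, and pin are illustrative only. */
#include <stdint.h>

#define GPIO_BASE  0x40020000u                                 /* assumed peripheral base address */
#define GPIO_ODR   (*(volatile uint32_t *)(GPIO_BASE + 0x14))  /* assumed output data register */
#define LED_PIN    (1u << 5)                                   /* assumed: LED wired to pin 5 */

static void led_toggle(void)
{
    /* volatile forces a real bus write rather than a cached value */
    GPIO_ODR ^= LED_PIN;
}

An application programmer rarely sees anything like this; the operating system and libraries hide the registers entirely, and that hidden layer is exactly what embedded developers have to own.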

Spotlight

NVIDIA

NVIDIA’s invention of the GPU in 1999 sparked the growth of the PC gaming market, redefined modern computer graphics, and revolutionized parallel computing. More recently, GPU deep learning ignited modern AI — the next era of computing — with the GPU acting as the brain of computers, robots, and self-driving cars that can perceive and understand the world. Today, NVIDIA is increasingly known as “the AI computing company.”

OTHER ARTICLES

Lock Down Personal Smart Devices to Improve Enterprise IoT Security

Article | April 9, 2020

The presence of internet of things (IoT) devices in employees’ homes is a neglected item in many enterprise threat models. Caution is certainly warranted here, but it’s entirely possible to improve your risk awareness and secure smart devices in a calm and measured way. Overlooking privacy and security risks has consequences. It’s in everyone’s best interest to consider the potential impact of every point of data output in your technological ecosystem. Any of these devices could affect the security of your digital connections. To minimize both personal and enterprise risk, it’s important to adhere to the following IoT security best practices.


WiFi for Enterprise IoT: Why You Shouldn’t Use It

Article | April 9, 2020

So you’re building an IoT solution and you’re ready to select your connectivity approach. Should you use Bluetooth? WiFi? LoRa? Cellular? Satellite? As I’ve explored in a previous post, the connectivity approach you choose ultimately comes down to the specific needs of your use case. Some use cases favor mobility and bandwidth, and power consumption doesn’t matter as much. Other use cases favor extensive battery life and broad coverage, and bandwidth doesn’t matter as much. In this post, I argue that for Enterprise IoT solutions, you shouldn’t use WiFi regardless of the use case. To build and implement a successful IoT solution, your connectivity needs to be reliable and consistent. When there’s an issue that needs troubleshooting, knowing that certain components of your IoT solution are reliable and consistent enables you to narrow your focus and address issues more effectively. There are many challenges in IoT, many of which stem from operational realities and from having thousands of devices out in the real world where they’re subject to harsh, ever-changing environments.


Internet of Things (IoT): The Need for Vendors to Address Security

Article | April 9, 2020

By the end of this year there will be 5.8 billion Internet of Things (IoT) endpoints, according to Gartner. And depending on how IoT devices are counted, the number is even higher: Statista, for example, estimates the device count for 2020 to be more than 30 billion. Security remains a big challenge if IoT as a strategy is to be successful, and IoT devices are still not being designed with security as a top priority. Mary O’Neill, VP of security at Nokia, noted in a press conference at MWC Los Angeles 2019, as reported by SDXCentral, that “if an IoT device today is plugged into the network and it doesn’t have protection on it, it’s infected in three minutes or less.” Jake Williams, founder of the security firm Rendition Infosec, said that “IoT vendors emphasize, often rightly, that their products improve quality of life, but they often neglect to disclose the risk of these devices to consumers. The onus of understanding how an IoT device might impact security should not be purely on the consumer. The vendor shares this responsibility.”


The Future of Biometrics in IoT

Article | April 9, 2020

In 2017, when Apple unveiled its iconic iPhone X with a Face ID feature that unlocked the phone and eliminated the home button, it was met with a lot of eye-rolls. Fast forward to now, and people are in love with biometrics-enabled technologies. While the iPhone X had a unimodal authentication system, today’s gadgets have moved well beyond that. Let’s take a closer look at biometrics. Biometrics are a way to measure a person’s physical characteristics to verify their identity. These can be physiological traits, like fingerprints and irises, or behavioral traits that define the manner in which an individual responds to stimuli. These characteristics are unique to the person. Once collected, the data is compared against a pre-existing database to find a match, and the system produces an outcome accordingly. The data is collected in many different ways: facial and voice recognition, iris and fingerprint scanners, signature verification, hand geometry, keystroke dynamics, and gait detection are some examples.
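The verification step described above boils down to a simple idea: compare a fresh sample against an enrolled template and accept the match if the two are close enough. Below is a minimal sketch in C, assuming samples have already been reduced to fixed-length feature vectors; the vector length, function names, and threshold are illustrative rather than taken from any particular biometric system.

/* Hypothetical template matching: accept when the squared Euclidean distance
 * between the enrolled template and the live sample falls under a threshold. */
#include <stdbool.h>
#include <stddef.h>

#define FEATURES 128  /* assumed feature-vector length */

static double squared_distance(const double *a, const double *b, size_t n)
{
    double sum = 0.0;
    for (size_t i = 0; i < n; i++) {
        double d = a[i] - b[i];
        sum += d * d;
    }
    return sum;
}

/* Returns true when the live sample is close enough to the enrolled template. */
static bool verify_identity(const double enrolled[FEATURES],
                            const double sample[FEATURES],
                            double threshold)
{
    return squared_distance(enrolled, sample, FEATURES) < threshold;
}

Real systems add liveness detection, multiple enrolled templates per user, and carefully tuned thresholds that balance false accepts against false rejects.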


