My Time with the Technology that Founded our IoT World

When I started in technology 30-plus years ago, I never would have imagined we’d have the smart devices we enjoy today. There are so many types of gadgets that improve our quality of life, from smart lighting to smart refrigerators (to remind you it’s time to buy more milk) to smart hairbrushes for healthier hair. I have to admit, there are times when I just sit back, a little awed, and wonder how in the world this all got started.

A first-person history of the Internet of Things

I began my career working for a very small company of about 13 people. We spent our days developing products for monitoring diecasting and injection molding machines. Think of each device as a network-attached digital oscilloscope. Our goal was to analyze, in real time, the complex process of making a specific part. This included speeds, temperatures, pressures, and so on. But in retrospect, the most interesting part was that we were also setting parameters around those variables to determine if the part created was good or bad.

I remember a conversation with a gentleman who had been operating diecast machines for over 30 years and was the “guru” for dialing in the process. I asked him how he knew if he was making good parts, and he said it was the “feel” of the machine. So our efforts to embed technology into the mix drastically changed the manufacturing process, turning it from art into science. Even back then, we realized we were on the cusp of a new revolution, and I was excited to be a part of it.

Early IoT: Frying eggs with sticks and stones

As you can imagine, in the late 1980s we were developing with sticks and stones in comparison with what’s available today. Early Internet of Things (IoT) products we created were based on the first 16-bit home computer processor: Texas Instruments’ TMS9900 central processing unit (CPU). It was big enough to fry an egg on and hot enough to burn off a fingerprint (I’m speaking purely from experience, and not about the egg).

Fortunately, cooling and space requirements weren’t a big concern since our cabinets were about the size of the robot in the 1960s TV series “Lost in Space.” For reference, that’s about the size of a modern smart fridge. The bigger issues we faced were:

  • processing the inputs locally from the diecasting and injection molding machines
  • sending that data to the central controller for storage and additional analysis

Basically, this was fog computing in its infancy. And we were knee-deep in figuring it all out.

The birth of digital transformation

To better understand what we were dealing with back in the ’80s, try to visualize a sea of machines pumping out parts. The time it takes to make each part is called the cycle time. Say you have a 120-second cycle time per part. Now try reducing it to 15 seconds on a multi-point RS-485 serial network that only allows one device to transmit at a time. Definitely a challenge, especially back in the ’80s.

We had to develop techniques to compress the time it took to send information, one of which was to send the starting value and then only the subsequent differences. We were working with bits, not bytes (and certainly not kilobytes). Given the speed at which many of these machines operated, we even had to develop custom hardware that would send data directly to random access memory (RAM), because the CPU wasn’t fast enough to collect the information. I realize how hard that is to believe given today’s processing capabilities, but it was a simple fact of everyday life for us back then.
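
For illustration, here is a minimal sketch in C of that “send the starting value, then only the differences” idea. This is not our original firmware; the sample values and the printf standing in for the serial link are purely illustrative.

    #include <stdint.h>
    #include <stdio.h>

    /* Delta-encode a series of samples: transmit the full starting
       value once, then only the (small) signed difference for each
       new sample. A 16-bit absolute reading shrinks dramatically
       whenever consecutive readings barely change. */
    static void delta_encode(const uint16_t *samples, int n)
    {
        uint16_t prev = samples[0];
        printf("start: %u\n", prev);          /* full value, sent once */

        for (int i = 1; i < n; i++) {
            int16_t diff = (int16_t)(samples[i] - prev);
            printf("delta: %+d\n", diff);     /* only the difference   */
            prev = samples[i];
        }
    }

    int main(void)
    {
        /* e.g. a slowly rising cavity-pressure trace */
        uint16_t trace[] = { 1000, 1003, 1007, 1006, 1010 };
        delta_encode(trace, 5);
        return 0;
    }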

You may be asking, why did we use multi-point RS-485? Pretty straightforward: all the manufacturing equipment that relied on induction motors produced serious electromagnetic interference (EMI), and RS-485, with its differential signaling, was the only viable technology at the time that could withstand that harsh environment.

The heart of our system was a central controller that stored the information collected from the remote devices for quality assurance, allowed operators to update process parameters on each remote device, and raised alarms for machines that were operating out of parameters. Many manufacturers became so reliant on our technology that they couldn’t run their machines without ours collecting and processing data.
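
To make that division of labor concrete, here is a hedged sketch in C of what a master-side poll loop on such a bus could look like. The structure, the limit checks, and the rs485_* helpers (stubbed here so the sketch runs) are illustrative assumptions, not our original code.

    #include <stdint.h>
    #include <stdio.h>

    #define NUM_MACHINES 8

    /* Illustrative per-machine record: acceptable range plus the
       latest reading collected from the remote device. */
    typedef struct {
        uint16_t low, high;
        uint16_t last_value;
    } machine_t;

    /* Stubs standing in for the serial layer. A real RS-485 master
       would assert the driver-enable line, address one device, then
       release the bus and wait for that device's reply. */
    static uint16_t fake_bus[NUM_MACHINES] = { 50, 55, 52, 120, 49, 51, 53, 48 };
    static uint8_t  pending;
    static void     rs485_request(uint8_t addr) { pending = addr; }
    static uint16_t rs485_read_reply(void)      { return fake_bus[pending]; }

    /* One poll cycle: only one device may drive the bus at a time,
       so the master visits each machine in turn. */
    static void poll_cycle(machine_t m[NUM_MACHINES])
    {
        for (uint8_t addr = 0; addr < NUM_MACHINES; addr++) {
            rs485_request(addr);
            uint16_t value = rs485_read_reply();
            m[addr].last_value = value;

            /* Alarm for machines operating outside their parameters. */
            if (value < m[addr].low || value > m[addr].high)
                printf("ALARM: machine %u out of range (%u)\n", addr, value);
        }
    }

    int main(void)
    {
        machine_t machines[NUM_MACHINES];
        for (int i = 0; i < NUM_MACHINES; i++)
            machines[i] = (machine_t){ .low = 40, .high = 60 };
        poll_cycle(machines);   /* machine 3 (value 120) trips the alarm */
        return 0;
    }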

Next up: How the IoT transitioned from past to future

[Image: a circuit board designed in 1988 alongside a Raspberry Pi]


Ironically, we are still dealing with many of the same challenges from 30 years ago, but at a much greater scale. The picture above shows a circuit board I designed in 1988 next to a Raspberry Pi. These devices are very similar in function, with the exception of the analog-to-digital (A/D) conversion capability on the large board, and even that functionality could easily be added to a Pi with a single A/D chip. There is obviously no comparison in processing, storage, and display speed and performance: the Pi wins hands-down!
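
To ground that “single A/D chip” point, here is a minimal sketch, assuming a common MCP3008 10-bit converter wired to the Pi’s SPI0 bus, chip-enable 0, read through Linux’s spidev interface. The wiring and device path are assumptions for illustration, not a definitive build guide.

    /* Read one channel of an MCP3008 A/D chip from a Raspberry Pi
       over /dev/spidev0.0 (the Linux spidev interface). */
    #include <fcntl.h>
    #include <linux/spi/spidev.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    static int mcp3008_read(int fd, int channel)
    {
        /* Start bit, then single-ended mode plus channel number. */
        uint8_t tx[3] = { 1, (uint8_t)((8 + channel) << 4), 0 };
        uint8_t rx[3] = { 0 };
        struct spi_ioc_transfer tr;
        memset(&tr, 0, sizeof tr);
        tr.tx_buf = (unsigned long)tx;
        tr.rx_buf = (unsigned long)rx;
        tr.len = 3;
        tr.speed_hz = 1000000;      /* 1 MHz is safe for the MCP3008 */
        tr.bits_per_word = 8;

        if (ioctl(fd, SPI_IOC_MESSAGE(1), &tr) < 1)
            return -1;
        return ((rx[1] & 0x03) << 8) | rx[2];   /* 10-bit result */
    }

    int main(void)
    {
        int fd = open("/dev/spidev0.0", O_RDWR);
        if (fd < 0) { perror("spidev"); return 1; }
        printf("channel 0: %d\n", mcp3008_read(fd, 0));
        close(fd);
        return 0;
    }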

Every day you and I leverage the IoT for a better quality of life. And IoT devices are helping private- and public-sector organizations be better stewards of our planet’s limited resources. Plus, the power of the IoT is helping to monitor and improve manufacturing processes, better control traffic flow, ease parking stresses in communities large and small, and more. In part two, I’ll address the transition from past to present.

How IoT transitioned from the past to the future

In the first installment of this series, I talked about my experience with the Internet of Things (IoT) and the impact that I saw it having on the tech industry. In this post we’ll delve into the transition, or what I would refer to as the needle-movers, and what to expect in the future.

There are four fundamental factors that allowed the IoT to move from very limited use cases to the broad consumption we have today:

  • Low-power and small-scale microcontrollers
  • Electronically erasable programmable read-only memory
  • Ease of development
  • Lower manufacturing costs

Low-power and small-scale microcontrollers

[Image: a microcontroller]


Think of a microcontroller as a self-contained integrated circuit (IC) with a processor, memory, and interfaces to the outside world. The picture from part one of this blog showed a giant circuit board with discrete components for everything: display, memory, input/output (I/O) connections, serial communications, clock signal, and the list goes on and on.

The issue was that you needed a bunch of physical connections, or traces, on the circuit board to tie all these components together. When you can put it all on a single chip, you not only save real estate but also decrease the amount of power required to run the system. You guessed it: needle-mover!

Electronically erasable programmable read-only memory

[Image: EPROM chips]


There was one significant development that really made this possible: electronically erasable programmable read-only memory (EEPROM). This is where the program code is stored so that when the system powers up, it has instructions on what to do. Back in the olden days, we started with programmable read-only memory (PROM) ICs. Write your code, burn it to PROM with a PROM writer, carefully place the PROM ICs into sockets on the circuit board, and test the system. Oops, you made a mistake? Throw the PROMs away, grab new ones, and start over.

As you can see, not only was this time consuming, it could get expensive. Along came erasable programmable read-only memory (EPROM), which saved costs since you didn’t have to throw the chips away, but you had to place them under ultraviolet light for 20 or more minutes to erase them. EEPROM can be electronically erased in an instant without being removed from the board, and new code can be downloaded and tested for rapid prototyping: another needle-mover!

Ease of development

Developing around old silicon was a real challenge! Writing compilers, developing interfaces, burning code to PROMs, writing code in assembly, troubleshooting with digital analyzers, reading code in binary and hexadecimal. Imagine the number of things that could go wrong with this picture (and they certainly did). Oh, and don’t forget the carbon-based (human) factor.

Many of today’s software development kits (SDKs) let you develop, troubleshoot, compile, and write your code directly to the microcontroller. In addition, there is a plethora of libraries you can download for free from places like GitHub, or you can share your own code to help out the community.

One of my favorite development platforms is built around the Arduino Integrated Development Environment (IDE) (www.arduino.cc). There is a bunch of supported hardware, including the ESP8266 WiFi module, and this thing is the size of a postage stamp. What I really like about the Arduino IDE is that I can program in my favorite language, C. If you are looking to try your hand at developing an IoT project, this is a great place to start. Ease of development, access to libraries, simulators, simple debugging: another needle-mover!
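
As a taste of how low the barrier is now, here is a minimal Arduino-style sketch for an ESP8266 that joins a WiFi network and prints an analog reading over serial. The network name, password, and one-second interval are placeholders for illustration.

    // Minimal ESP8266 sketch for the Arduino IDE: join a WiFi
    // network, then report the analog pin once per second.
    #include <ESP8266WiFi.h>

    const char *ssid     = "your-network";   // placeholder
    const char *password = "your-password";  // placeholder

    void setup() {
      Serial.begin(115200);
      WiFi.begin(ssid, password);
      while (WiFi.status() != WL_CONNECTED) {  // wait for the join
        delay(500);
        Serial.print(".");
      }
      Serial.println();
      Serial.print("Connected, IP: ");
      Serial.println(WiFi.localIP());
    }

    void loop() {
      Serial.println(analogRead(A0));  // raw 10-bit reading, 0-1023
      delay(1000);
    }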

Lower manufacturing costs

Manufacturing costs for small-run 8.5x11-inch circuit boards in the late ’80s were thousands of dollars, and these were nothing sophisticated: 10-mil traces, 10-mil spacing, and four layers. I recently worked on a personal IoT project and had 25 four-layer, 3.5-inch-square boards made for a couple hundred bucks. This is a far cry from spending $150 and a week to etch a small two-sided circuit board on your own. Plus, the manufacturing and assembly process is now almost completely automated, which has dramatically reduced the cost of populated circuit boards: needle-mover!

What’s next in IoT

There’s no doubt that we will continue to see more and more devices connected to the Internet. But just because we “may” have more devices than IPv6 addresses (kidding) doesn’t mean our lives will change for the better. We will experience the art of the possible when these devices, or systems of devices, or systems of systems, begin to interact and use intelligence, or artificial intelligence, to make decisions.

On a regular day, you are awakened in time for your daily workout, based on the travel time to your first appointment, and in time to take the kids to school. But what if you didn’t rest well, have a cough or a fever, or had too much to drink? The determination of when to get up can be modified based on a myriad of factors. Your beverage of choice and breakfast await, transportation shows up at your door (or you are beamed directly, though maybe that’s a little far-reaching) and you’re off. Meeting preparation is done in transit, and you arrive on time and prepared. This technology will allow us to maximize our time and improve our quality of life.

The technology to do all this is available today, but we need to address the security and privacy concerns that arise as devices and systems communicate. Protecting end devices is difficult enough; with the capabilities I described, we will all need to work together to protect these critical systems and the sensitive information they contain.
