Vol 2 Issue 4
January 2020
In This Issue


Have you heard about these new edge computers? Everyone is talking about them. Maybe we should have some in our facility?
You already do.
Edge is one of the most common buzzwords in Industry 4.0. It is a term that we did not see a decade ago, but today it proliferates throughout our industry.
A clear definition of the term is elusive. According to Wikipedia, edge computing is "any type of computer program that delivers low latency nearer to the requests". Another definition, from a paper entitled "Edge Computing" in Proceedings of the IEEE, vol. 107, no. 8, August 2019, reads: "Edge computing is a new paradigm in which the resources of an edge server are placed at the edge of the Internet, in close proximity to mobile devices, sensors, end users, and the emerging IoT."
There is a lot to unpack there, and these are only two of many definitions to choose from. Let's start by unpacking the first one.
That definition refers to a computer program delivering low latency, meaning there should be minimal delay between getting data into an application and getting a result out. This is achieved by placing the program, and the device that hosts it, nearer to the requests. By that definition, the first calculator you had in school was an edge device: the inputs were at your fingertips, you held the device performing the calculations in your hand, and the result appeared inches away.
This definition misses the existence of a network. As computing capability evolved, the ability to share and access data and to distribute processing across a network became fundamental. The first desktop computer that sat isolated in your office became an edge computer the moment it was connected to a network. You still had local processing near the inputs and outputs, with low latency, when you used a program on that computer to create a document or spreadsheet, but the file servers, printers, and mainframe computers became the "cloud" of network resources that gave you more capability.
This edge computing approach differed from the dumb terminal approach. In my sophomore year in college I had a class in computer programming which required me to go to the computer lab and write code. This lab had Remote Job Entry (RJE) computer terminals. These terminals were dumb in the sense that all they did was provide a keyboard and a view screen. All the processing was done in a mainframe computer. When we were writing computer code, we were editing a file on the mainframe. When we compiled the code, the mainframe created the executable code. When the program was executed, the mainframe did the processing and generated the results. The RJE terminals were not edge computers.
In today's modern control systems, the dumb terminal is still a popular approach. We call them thin clients because they don't have much inside them. The idea is to have inexpensive user interface devices that can be centrally managed and easily replaced to avoid having to maintain patches and configuration of every single user interface device. The thin clients are not edge computers.
However, every facility with a Programmable Logic Controller (PLC), Remote Terminal Unit (RTU), or Process Controller (the controller in a Distributed Control System, DCS) has edge devices. The IEEE definition given earlier states that edge devices are near sensors. A PLC, RTU, or controller is directly connected to the sensors in your facility, and the processing in these devices directly sends outputs to actuators. Therefore, these are edge devices and always have been.
It is possible to move functionality out of the edge device to a server, and there are merits in doing so. For example, if you have a batch process, you could have logic in your edge device that stores the recipes and controls the sequencing of phases and steps. However, it might make more sense to have that done in a server that is centrally managed and can access I/O from multiple devices on the control network. This is where the design of an edge system becomes pertinent.
Would you do the same with a PID loop? It could be argued that if the edge device handled only the I/O and handed all algorithm processing off to a server in the cloud, it would be easier and more cost effective to have all PID processing in one place. But what about latency? You cannot tolerate delay and uncertainty when a PID loop has been tuned to respond to setpoint changes and process upsets. You want the PID to execute as close to the I/O as possible. Therefore, PID control requires edge processing, and the same is true for interlocks and safety systems.
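To see why the scan-to-scan timing matters, here is a minimal sketch of a discrete PID loop. This is the generic textbook positional form, not any particular vendor's implementation, and the gains, sample time, and simulated process are purely illustrative. Note that the derivative and integral terms both depend on a consistent sample interval dt; if the loop ran on a remote server, network round-trip jitter would effectively vary dt every scan and degrade the tuning.

```python
# Minimal discrete PID sketch (textbook positional form; gains are
# illustrative assumptions, not from any vendor's controller).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, pv):
        # Each scan: read the process variable, compute, write the output.
        error = setpoint - pv
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Usage: drive a crude first-order process toward a setpoint of 50.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
pv = 0.0
for _ in range(200):
    out = pid.update(setpoint=50.0, pv=pv)
    pv += (out - pv) * 0.1  # simulated first-order process response
```

The tuning above assumes the controller sees the process variable every 100 ms, on time. That assumption is easy to satisfy when the loop executes in the edge device wired to the I/O, and hard to guarantee across a network.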
So, what part of edge computing is new? The IEEE paper referred to the Internet in its definition. This is what is new. Prior to the Internet, we either had standalone devices like calculators, desktop computers, and locally monitored PLCs or we had networked systems with RJE terminals, mainframes, and intelligent devices like Process Controllers in a networked control system. Once the Internet happened, the devices largely stayed the same, but the terminology changed. Dumb terminals were now thin clients, and the smart phone with your calculator application is an edge device.
In a Feb 4, 2019 Control Amplified podcast, Peter Zornio (Chief Technology Officer at Emerson) states that he is bemused when people use terms like edge as if they are new concepts when in fact the automation industry has been utilizing edge processing long before the term was invented. It is easy to be confused and led to believe that you are being left behind by those embracing new concepts. It may in fact be the case that only the terminology is new, and you have been on the cutting edge all along.

Pat Dixon is Southwest Region Engineering Manager for Global Process Automation (GPA), a controls system integration firm.   

His LinkedIn profile is https://www.linkedin.com/in/dixonpatrick/.


Training and Industry 4.0

Decades ago, when I worked for a little old soap company in Cincinnati, they had two principles:  (1) the highest return on investment was advertising and (2) the second highest return on investment was internal training. In fact, at the time, they said they had never found diminishing returns in either of these areas. In those days, they managed their dividend expectations for their stockholders and put everything else that was left over in these two categories.  The company was very profitable, so there was a lot left over for these.

I watch people today buy sophisticated, expensive software packages and then pull their punches when it comes to training their own people.

Folks, if you are buying from a large, established, reputable company, the software will do what they say it will do.  In fact, it will likely do more than they say it will do.

If it is not working for you, it is likely because you have failed to adequately train all of your personnel.  Don't settle for a little bit of training, either; overdo it (at least you will think you are overdoing it).

I walk into facilities where I know overarching software packages have been purchased.  Yet I see people doing things the way they have been doing them for years. I know they have the software to improve the way they do things.  I ask them about it, and most often the answer is they have not been trained.

If you have bought a major software package, some of the tasks you ask people to do should become trivial as they learn the software.  How long will training take?  It will become a permanent part of your process, just as it was at the soap company.  But overall, you will be much more efficient, even when the training time is included.


Are you Wasting Perfectly Good Data?

Built on the back of the steam engine, the First Industrial Revolution fundamentally transformed the way people and organizations worked. Today we are witnessing how data is transforming organizations in a similar way with Industry 4.0. The factories of tomorrow will be totally connected, allowing us to leverage the full impact of data analytics.

Read the entire article here.

Venkat Viswanathan


The Internet of Things 101:  Getting to grips with an education in IoT
For those interested in pursuing a career in IoT, I think it is very important to obtain full-stack knowledge of the Internet of Things: from interfacing sensors to microcontrollers and microprocessors, to wired and wireless network communication, security frameworks, and software development for web applications.

Read the entire article  here.
Dr. Derek Molloy, Associate Professor in the School of Electronic Engineering at Dublin City University


Unleashing the Power of the Industrial Internet of Things
When collected and acted upon efficiently, data enables companies to stay agile, improve the customer experience and leapfrog competitors. Achieving productivity and ROI gains from IoT will come down to a company's ambition to take the strategy beyond a mechanism for cost prevention.

Read the entire article  here.

Rick Veague

IoT is Generating Easy-to-Crack Keys

A preponderance of weak keys is leaving IoT devices at risk of being hacked, and the problem won't be an easy one to solve.
NCCoE Taps 8 Companies for Energy Sector IoT Security Project

FPInnovations and the Forestry Research Institute of Sweden (Skogforsk) signed a memorandum of understanding (MOU) at the Sweden-Canada Innovation Days symposium held in Montreal in late November. The signing creates an opportunity for the international exchange of research on automated harvesting and advances their respective interests in forestry innovation.

Read the entire article here.

MeriTalk Blog


Coming up next month...
  • What do lettuce and carrots have to do with IIoT?
  • 3 Ways Tech Will Change Manufacturing in 2020
  • Sharpening Your Edge in a World Filled with IoT Devices
  • and much more

Industree 4.0 is exclusively sponsored by SAP