Technology Trends That Empower Innovation


Open standards, more powerful desktop computers and lower-cost software make design, modeling and automatic code generation for PLCs and PACs practical for improving automation. Other technologies go beyond problem-solving to achieve productivity and performance enhancements. Here’s a look at advances in 18 technology areas that are worth paying attention to:


  • AI, ML and expert systems
  • Cloud computing
  • Hyperautomation
  • Low-code/no-code platforms
  • Edge computing platforms
  • Modular design and programming
  • BioPhorum activity
  • Semantic/contextual data
  • Communications
  • Multiplatform closed ecosystems
  • Open source IEC 61499 Eclipse Foundation 4diac/Forte
  • OPC Foundation field level communications
  • Digital twins
  • Intelligent sensors
  • Spatial computing/intelligent vision
  • Connected worker technology
  • Remote expert services
  • Robotics



AI, ML and expert systems


The commercial use of artificial intelligence is accelerating at all levels, with wide application of natural language processing, machine learning and other expert systems. Increased processing power at lower cost is speeding adoption. It is tempting to apply new technology immediately, but as with any technology, these are new tools that need to be understood and applied properly; they are not instant “silver bullets” that solve all problems and increase operating efficiency. The quality and value of AI applications depend directly on their internal algorithms and data sources.

In the context of industrial automation and controls, poorly applied AI can have negative outcomes impacting performance, personnel and plant safety. The European Commission's AI Act legal framework notes: “What does ‘reliable’ mean in the AI context? We speak of a ‘reliable’ AI application if it is built in compliance with data protection, makes unbiased and comprehensible decisions, and can be controlled by humans.”

The AI Act regulatory framework defines four levels of risk for AI systems: unacceptable risk, high risk, limited risk and minimal risk. Mission-critical industrial control and automation applications fall within the AI Act's high-risk category. AI systems identified as high-risk include AI technology used in:


  • critical infrastructures (e.g., transport), that could put the life and health of citizens at risk.
  • educational or vocational training, that may determine the access to education and professional course of someone’s life (e.g., scoring of exams).
  • safety components of products (e.g., AI application in robot-assisted surgery).
  • employment, management of workers and access to self-employment (e.g., CV-sorting software for recruitment procedures).
  • essential private and public services (e.g., credit scoring denying citizens opportunity to obtain a loan).
  • law enforcement that may interfere with people's fundamental rights (e.g., evaluation of the reliability of evidence).
  • migration, asylum, and border control management (e.g., automated examination of visa applications).
  • administration of justice and democratic processes (e.g., AI solutions to search for court rulings).


Properly applied, AI, ML and expert systems offer industrial companies enormous potential to significantly cut operating expenses; improve staff efficiency, quality, productivity and operations; and reduce maintenance and repair costs. AI technologies advance the goal of all industrial automation: increased productivity and efficiency. Industrial AI applications properly designed with the right data can handle unforeseen scenarios in complex, rapidly changing environments based on patterns and trends in the data, without being explicitly programmed for every possible scenario and with little to no human interaction.

The goals of AI applications should align with the company's overall strategy; from there, potential AI use cases can be defined, evaluated and prioritized as projects.

There are an increasing number of no-code, self-serve software tools simplifying the application of these technologies by industrial subject matter experts rather than data scientists. Industrial automation and control systems have a wealth of data that can be used more effectively with these technologies.

In addition, AI processor chips enable high-performance applications to run within controllers and edge computers for demanding applications. Server and cloud AI/ML/expert system solutions are suitable for a wide range of applications, but network communication speed and latency factors pose limitations for many real-time industrial and process applications that are overcome with AI chips embedded in industrial edge devices and sensors.

There are offerings in the market from vendors including Nvidia, Intel (Myriad X), Google (Edge TPU) and Hailo. These technologies are proven in other areas, including video analytics with image recognition and related applications. The chips can be applied using aggressively priced plug-in add-on board modules conforming to the popular M.2 and mPCIe connector standards found in many computers, including embedded industrial PCs, adding high-performance AI processing without degrading other applications on the computer.

This is analogous to early PC coprocessor add-ons for high-performance floating-point calculation and to video display coprocessors for high-resolution graphics. For example, the original IBM PC included a socket for the Intel 8087 floating-point coprocessor (FPU), a popular option for people using the PC for computer-aided design or mathematics-intensive calculations. AI chips for embedded edge applications are particularly valuable for real-time industrial automation and control, complementing system architectures that encompass cloud, enterprise and embedded applications.
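As a minimal illustration of the analytics such edge hardware enables, the sketch below flags sensor readings that drift outside a rolling mean plus-or-minus k-sigma band, using only the Python standard library. It is a stand-in for the heavier ML models an AI accelerator would actually run; the window size, threshold and sample values are all hypothetical.

```python
from collections import deque
from statistics import mean, stdev

class RollingAnomalyDetector:
    """Flag readings outside mean +/- k*sigma over a sliding window.

    A lightweight stand-in for the trained models an edge AI chip
    would execute; window and k are illustrative, not tuned values.
    """
    def __init__(self, window=20, k=3.0):
        self.readings = deque(maxlen=window)
        self.k = k

    def update(self, value):
        anomalous = False
        if len(self.readings) >= 2:
            mu = mean(self.readings)
            sigma = stdev(self.readings)
            if sigma > 0 and abs(value - mu) > self.k * sigma:
                anomalous = True
        self.readings.append(value)
        return anomalous

detector = RollingAnomalyDetector(window=10, k=3.0)
# Steady readings around 20.0 stay inside the band.
normal = [detector.update(v)
          for v in [20.0, 20.1, 19.9, 20.2, 20.0, 19.8, 20.1, 20.0]]
spike = detector.update(35.0)  # far outside the band, flagged anomalous
```

Running such a check directly on the edge device avoids the network latency of sending every raw sample to a server before acting on it.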

See Automation.com articles about AI and ML.



Cloud computing


Cloud computing is delivering efficient and powerful applications at lower cost. These applications are being applied to improve manufacturing with technology solutions from suppliers including Amazon Web Services (AWS) and Microsoft, whose architectures are important industrial digitalization building blocks from sensor to enterprise and cloud. Cloud software architectures and tools built on open standards are highly refined and easy to use to develop a wide range of applications, including historians, artificial intelligence (AI), expert systems, machine learning (ML) and digital twins. Evidence of the commitment to integration of the entire manufacturing business is membership and participation in the OPC Foundation by technology companies including AWS, Microsoft, IBM and Capgemini.

Cloud applications are providing many functions previously only available with onsite systems. This is particularly important for small and medium-sized manufacturers that did not have the financial strength to make the large investment required for onsite systems. Cloud applications give small and medium manufacturers functions previously available only to large companies, increasing efficiency and profits. In the U.S., companies with fewer than 100 employees make up more than 94% of all manufacturers. In Europe, there were approximately 22.6 million small and medium-sized enterprises (SMEs) in the European Union in 2021.

For example, comprehensive software as a service (SaaS) manufacturing business solutions are an efficient way to achieve integrated digitalization of all functions, including enterprise resource planning (ERP), manufacturing execution systems (MES/MOM), quality management systems (QMS), analytics, the Industrial Internet of Things (IIoT) and supply chain management (SCM). Plex Systems, now a Rockwell Automation company, is an example, with a full suite of cloud-based SaaS manufacturing business solutions.

See Automation.com articles about cloud computing.

 


Hyperautomation


Hyperautomation is an advanced automation strategy to drive profound digital transformation to gain a competitive advantage. Hyperautomation involves the orchestrated use of multiple technologies, tools and platforms including AI, ML, event-driven software architecture, robotic process automation (RPA), robotics, business process management (BPM) and low-code/no-code tool technologies. In the context of industrial manufacturing, hyperautomation is the digitalization and integration of the entire business, from plant process to business enterprise, including ERP, supply chain, logistics and customer fulfillment.

See Automation.com articles about hyperautomation.

 


Low-code/no-code development


A software revolution has been ignited by no-code/low-code development platforms. Low-code/no-code software is a high-impact manufacturing automation trend that empowers industrial professionals who understand manufacturing and production to directly apply technologies without being data scientists or trained programmers. These “citizen developers” create applications with artificial intelligence, expert systems, predictive maintenance, optimized machine operations and flexible manufacturing techniques using drag-and-drop interfaces, natural language tools and quality-tested models to build applications.

Without needing to manually code systems, developers can deploy automated solutions faster than before and adjust them more efficiently as needed. The higher adaptability provided by low- and no-code solutions is critical in the manufacturing sector, where people must respond to constantly changing conditions. This is analogous to how spreadsheets democratized the use of computers for a wide range of applications, enabling subject matter experts to directly apply their knowledge. Today's no-code/low-code platforms feel comfortable to users because they resemble the smartphone, tablet and PC applications people use in their daily lives.

Since 1969, industrial automation and control people have been empowered with no-code ladder logic programming, which evolved into the International Electrotechnical Commission (IEC) 61131-3 standard, first published in 1993. The standard continues to be enhanced and extended in IEC committees and by the not-for-profit PLCopen trade organization. Noteworthy enhancements include industrial safety, motion control, robotics, OPC UA and other functions.
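As an illustration of the pattern ladder programmers express graphically, the classic start/stop seal-in rung can be mirrored in a few lines of conventional code. This Python sketch is purely illustrative; real IEC 61131-3 ladder logic executes cyclically in a PLC scan.

```python
def motor_seal_in(start, stop, running):
    """One scan of a start/stop seal-in rung:
    the motor runs when (start pressed OR already running) AND stop not pressed.

    Equivalent ladder rung (schematic):
        --[start]---+---[/stop]---(motor)
        --[motor]---+
    """
    return (start or running) and not stop

running = False
running = motor_seal_in(start=True, stop=False, running=running)   # start pressed: motor on
running = motor_seal_in(start=False, stop=False, running=running)  # start released: stays latched
running = motor_seal_in(start=False, stop=True, running=running)   # stop pressed: motor off
```

The seal-in (latch) is exactly the behavior a citizen developer gets by dragging the graphical rung, with no text code written at all.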

See Automation.com articles about low-code/no-code development.

 


Edge computing platforms


Industrial edge computing devices provide distributed intelligence at or near physical processes to sense, control, run local programs and communicate with industrial controllers, plant operations, enterprise systems and cloud applications. Edge computing devices include industrial PCs, Raspberry Pi, Android and embedded system-on-a-chip (SoC). This is required for real-time closed-loop manufacturing business operations to be responsive, profitable and competitive.

Edge devices are part of a distributed computing architecture. They perform tasks that in many cases productively interact with enterprise and cloud computing applications. There is now a wide range of edge computers at various power and price points, from multicore processors to Raspberry Pi devices. Sensors with embedded processors and built-in communications are edge devices that provide a new level of functionality.


Figure 1: Edge devices, part of a distributed computing architecture, productively interact with enterprise and cloud computing applications.


 

The major value of edge computing is executing applications close to physical production, achieving fast response times with very low latency and capturing real-time data. The incorporation of higher-level functions directly into this new breed of powerful field devices and industrial controllers, coupled with real-time transaction-processing business systems, is diminishing the need for industrial middleware software.

Business systems have evolved more rapidly than industrial systems to meet the requirements of business functions including supply chain, customer service, logistics and Internet commerce. Middle-level software and computers have served their purpose of buffering, synchronizing, translating and refining sensor and controller information, but also created brittle systems with a great number of middle-level computers, duplicate databases, complex configuration control and software that is expensive and difficult to maintain.

Edge computing is computing that takes place at or near the physical location of either the user or the source of the data. Distributed functions at the edge include optimization, expert systems and AI with new classes of devices.

Rugged edge computing platforms provide gateway functions plus many others, including distributed control, optimization, webservers, OPC UA servers and clients, AI, REST APIs, image recognition and cloud communications (AWS, Azure, etc.). Many incorporate container environments such as Docker and Kubernetes, enabling the addition of user applications written in standard programming languages including Python and JavaScript.


  • Intelligent/smart field edge devices. Intelligent or smart field edge devices are a new class of smart field devices. These include sensors and actuators that are intelligent and communicate directly to controllers, enterprise and cloud applications. These devices incorporate distributed control functions including optimization, web servers, OPC UA servers and clients, REST APIs and cloud communications (with AWS, Azure, etc.). User-based initiatives are defining the new architecture based on these concepts including the NAMUR Open Architecture (NOA) and Open Process Automation Forum (OPAF) standards.
  • Edge gateways. Industrial edge gateways are typically rugged industrial computers running middleware software that connect to programmable logic controllers (PLCs), drives and other edge devices. These edge gateways contextualize information and map it to enterprise software and databases. Edge gateways are ideal for providing edge computing functions that leverage installed legacy controls and automation, extending capital equipment investments.


See Automation.com articles about Edge Computing.



Semantic/contextual data


Industrial control and automation communications are evolving to use semantic/contextual data models from the manufacturing edge to the enterprise/cloud, increasing system efficiency, responsiveness and effectiveness. Semantic/contextual information inherently describes the meaning of data from machines, processes and sensors, so it can be used directly by applications without interpretation or external references. This is significantly different from traditional controllers and machines that provide nondescriptive data, for example, register values that represent temperature or tool position information.

Advances in technology are making this economically feasible, providing significant benefits. Semantic/contextual information also simplifies performing cybersecurity checks on information by testing for proper data ranges based on application. Semantic technology combines elements of semantic analysis, natural language processing, data mining, knowledge graphs and related fields. Semantic technology encodes meanings separately from data and content files, and separately from application code enabling machines as well as people to understand, share and reason with them at execution time. With semantic technologies, adding, changing and implementing new relationships or interconnecting programs differently can be just as simple as changing the external model that these programs share.

With traditional information technology (IT), meanings and relationships must be predefined and “hard-wired” into data formats and application program code at design time. When something changes, when previously unexchanged information must be exchanged, or when two programs need to interoperate in a new way, humans must get involved. Offline, the parties define and communicate the knowledge needed to make the change, recode the data structures and program logic to accommodate it, and then apply those changes to the database and the application. Only then can the change be implemented. This is a common issue that requires the PLC representation of data to be mapped to application data representations.

These technologies formally represent the meaning involved in information. For example, ontology can describe concepts, relationships between things and categories of things. These embedded semantics with the data offer significant advantages such as reasoning over data and dealing with heterogeneous data sources.
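The range-checking idea mentioned above can be sketched simply: because a semantic payload carries its own metadata, including valid ranges, a receiver can plausibility-check values with no out-of-band knowledge. The field names below are illustrative and do not follow any particular companion specification.

```python
import json

# A self-describing (semantic) payload: the meaning travels with the data.
payload = json.dumps({
    "tag": "FlowTransmitter_101",
    "value": 42.7,
    "units": "m3/h",
    "valid_range": [0.0, 120.0],
    "timestamp": "2024-01-15T08:30:00Z",
})

def validate(message):
    """Plausibility check using only metadata embedded in the message itself."""
    data = json.loads(message)
    low, high = data["valid_range"]
    return low <= data["value"] <= high

ok = validate(payload)  # 42.7 is inside [0, 120], so the message passes
```

Contrast this with a bare register value, where the receiver would need a separately maintained configuration file just to know what the number means, let alone whether it is plausible.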

Semantic technologies provide an abstraction layer above existing IT technologies that enables bridging and interconnection of data, content and processes. From the portal perspective, semantic technologies can be thought of as a new level of depth that provides far more intelligent, capable, relevant and responsive interaction than information technologies alone. Semantic technologies often leverage natural language processing and machine learning to extract topics, concepts and associations between concepts in text.

The OPC ecosystem. The OPC Foundation has become the unifying focal point for IT, operational technology (OT), industrial/process controls, manufacturing automation, IoT and cloud organizations, which participate in more than 65 joint working groups focused on defining and implementing standard contextual and semantic data models from industrial field devices, including sensors and actuators, to enterprise and cloud systems. The goals are secure, reliable communications and vendor-, platform- and domain-agnostic interoperability from sensors to enterprise and cloud applications. OPC Foundation standards, semantic data models and the surrounding ecosystem simplify application engineering and enterprise software development while improving system quality.

There are more than 850 OPC Foundation members and thousands of OPC-compliant products. In addition to a wide range of industrial members, the active participation of IT technical leaders is notable, including Microsoft, AWS, Google, IBM and SAP.

OPC Foundation standards are becoming widely adopted by IT, OT and cloud suppliers creating a valuable and efficient distributed industrial manufacturing architecture. OPC UA Companion Specifications, complete use case models and templates achieve a unified vendor-independent data interchange that simplifies data exchanges, lowers application engineering labor and improves quality.

The OPC Foundation’s globally available UA Cloud Library was co-developed with the Clean Energy and Smart Manufacturing Innovation Institute (CESMII). CESMII is the United States’ non-profit institute dedicated to smart manufacturing, working to reduce cost, complexity and time-to-value so all manufacturers can engage in smart manufacturing.


Figure 2: OPC UA FX is the first IP field device approach incorporating semantic contextual data connectivity.




The UA Cloud Library makes OPC UA information models available in the cloud on a global scale, providing users with an efficient way to find and use OPC models. This simplifies application engineering by giving users access to all known OPC UA information models via an open, global, single source of truth. It also facilitates global OPC UA information model coordination and harmonization efforts by making it easy to search and cross-reference the latest OPC UA companion specifications in real time. This makes applying an OPC UA Companion Specification as simple as adding a printer to a computer.

See Automation.com articles about semantic/contextual data.



Modular design and programming


Modular design and programming enable subject matter experts to directly create applications without writing code, resulting in superior-quality applications. Industrial automation has many common functions and processes that are encapsulated in software modules and configured to meet application requirements. Industrial automation continues to move toward open-standard, modular, model-based design with standardization including OPC UA Companion Specifications and module type package (MTP).

Modular building blocks can be deployed in any number and combination to put together a production line or process. This is a higher level of structured software design using self-contained, tested and validated modules. Modular building blocks can then be deployed in any number and combination by application engineers to build control and automation solutions to satisfy unique production requirements.

Industrial automation, distributed control system (DCS) and PLC vendors have products that use program modules within their proprietary architectures; these are powerful and create engineering efficiency, but the building blocks cannot be used with other vendors' products because they are not published to open architecture standards.

The trend is toward open multivendor interoperable open standard models including OPC UA Companion Specifications and MTP.

This is part of an accelerating no-code evolution enabled by newer low-cost technologies. Current examples on this trendline include ISA88 and PackML.

ISA88. The ISA88 Batch Control architecture and standard are used throughout the world to systemize and modularize recipe-driven processes. Industrial automation and process system vendors created their proprietary software applications to support these and other modular design models. The value of this approach has been proven and now is being defined in open standards.

PackML. Packaging machine language (PackML) is an industry technical standard for the control of packaging machines developed by OMAC, adopted by ISA as TR88.00.02. The primary goals of PackML are to encourage a common “look and feel” across a plant floor and to enable and encourage industry innovation. PackML includes standards-defined machine states and operational flow, overall equipment effectiveness (OEE) data, root cause analysis (RCA) data, flexible recipe schemes and common supervisory control and data acquisition (SCADA) or MES inputs.
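A small subset of the PackML state model can be sketched as a transition table; the full TR88.00.02 model defines many more states (including acting states such as Starting, Stopping and Aborting, omitted here for brevity), so this Python fragment only shows the pattern.

```python
# Simplified subset of the PackML (ISA TR88.00.02) machine state model.
# Acting states (Starting, Stopping, Aborting, ...) are omitted; this
# sketch keeps a few states to show the command/transition pattern.
TRANSITIONS = {
    ("Stopped", "Reset"):  "Idle",
    ("Idle", "Start"):     "Execute",
    ("Execute", "Hold"):   "Held",
    ("Held", "Unhold"):    "Execute",
    ("Execute", "SC"):     "Complete",   # SC = state complete
    ("Complete", "Reset"): "Idle",
}

def next_state(state, command):
    """Return the new state, or stay put if the command is invalid here."""
    return TRANSITIONS.get((state, command), state)

state = "Stopped"
state = next_state(state, "Reset")  # -> Idle
state = next_state(state, "Start")  # -> Execute
```

Because every PackML-conformant machine exposes the same states and commands, SCADA and MES layers can supervise mixed-vendor lines with one common model.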



Resources for modular design and programming


  • PLCopen standards define common modules for IEC 61131-3 Motion Control, Safety, and OPC UA functions.
  • The NAMUR standard 2658 on modular production is being fed into the IEC 63280 standard for automation engineering of modular systems in the process industries for internationalization.
  • OPC UA companion specifications allow models of modular defined programming for industrial equipment and processes.
  • The joint VDI-VDE-NAMUR 2658 standard defines the module type package (MTP).


BioPhorum activity. The BioPhorum mission is to create an environment where the global biopharmaceutical industry can collaborate to accelerate its rate of progress for the benefit of all. BioPhorum's vision is to develop guidelines for module type package (MTP) files to be used with the modular equipment commonly found in biopharmaceutical processing plants. MTP files are used to achieve plug-and-play operations, dramatically reducing engineering labor, lowering project execution time and increasing quality.

At the heart of plug-and-play is the VDI/VDE/NAMUR 2658 standard that defines MTP. The objective of BioPhorum’s plug-and-play concept is to effortlessly integrate intelligent unit operations in the S88 procedural batch engine of the overlying supervisory automation system of a good manufacturing practice (GMP) compliant facility.

MTP focuses on creating standardized, nonproprietary descriptions of modules for process automation. It advances the concepts of ISA88 and ISA95 into open, vendor-independent plug-and-produce models that include attributes like alarm management, safety and security, process control, human-machine interface (HMI) and maintenance diagnostics. OPC UA is used to communicate MTP data between systems.

MTP also focuses on addressing common complaints users have when vendors deliver various pieces of equipment that do not directly and intelligently communicate with control, automation, asset management and business systems, requiring significant investment to integrate into plant operations. Today, the addition of hardware, software and application engineering for interfaces to integrate these decreases system reliability and increases lifecycle maintenance costs.

The process automation industry has been heavily affected by the influx of new technologies and, as we enter the next decade, several organizations are providing roadmaps to help process automation companies enable the best practices to leverage these technologies and drive competitiveness and productivity forward.

The Industry 4.0 for Process effort, the application of Industry 4.0 concepts to improve process automation, is being driven by NAMUR and VDI/VDE in collaboration with several prominent industry leaders, including ABB, BASF, Bayer Technology Services, Bilfinger Maintenance, Endress+Hauser, Evonik, Festo, Krohne, Lanxess, Siemens and Fraunhofer ICT. The concepts are expressed in NAMUR's Process Sensor 4.0 Roadmap, which describes smart networked sensors, communicating simultaneously with controls and automation systems and directly with business systems, as a foundational part of the Industry 4.0 process architecture.

See Automation.com articles about PLCopen and BioPhorum.

 


Industrial edge semantic/contextual data


OPC Foundation standards provide semantic/contextual data models from field edge devices and application-specific companion specifications that enable plug-and-play system configuration. MQTT Sparkplug and OPC UA FX provide industrial edge-to-enterprise and cloud data communications.

OPC UA FX. OPC UA FX is a plug-and-play, multivendor field device standard from the OPC Foundation's Field Level Communications (FLC) initiative, which was launched in 2018. OPC UA FX standardizes semantic/contextual field device communications between multivendor controllers and throughout the manufacturing business enterprise.

Regarding the physical layer, OPC UA FX supports Ethernet-APL (advanced physical layer), two-wire Ethernet for process automation and hazardous locations based on IEEE and IEC standards.

OPC UA over MQTT. Low-cost sensors and IIoT technology are making it possible to monitor and control more systems over many transport methods, including bandwidth-constrained wired, wireless and cloud connections over common carriers. OPC UA messaging over the message queuing telemetry transport (MQTT) protocol provides a lightweight solution for applications where network bandwidth is limited. This combination delivers both the strengths of OPC UA data models, dramatically reduced application engineering labor and improved reliability, and the efficiency of MQTT communications over constrained networks. More importantly, OPC UA and MQTT are both open standards representative of the industrial automation architectural shift toward greater flexibility and interoperability of systems.

 

MQTT Sparkplug. Another option, which allows users to create data model definitions unique to their company, is the open-source Sparkplug specification hosted at the Eclipse Foundation. It provides MQTT clients the framework to integrate data from their applications, sensors, devices and gateways within the MQTT infrastructure. The Sparkplug specification defines an MQTT topic namespace, payload and session state management that can be applied generically to the requirements of real-time SCADA/control HMI solutions.

MQTT Sparkplug is a messaging protocol built on top of MQTT that enables users to freely define semantic/contextual information for their applications. Cirrus Link Solutions owns a patent related to Sparkplug and, as a member of the Eclipse Foundation Sparkplug working group, operates under the Eclipse Intellectual Property Policy. Under that policy, Cirrus Link granted an irrevocable (subject to a defensive termination provision), nonexclusive, worldwide, royalty-free, transferable patent license for the final specification.

This applies to anyone who makes, uses, sells, offers to sell and imports Sparkplug implementations as long as such implementations successfully pass the corresponding Sparkplug Technology Compatibility Kit (TCK) and remain in compliance with the Eclipse Foundation TCK License. This also allows customers to purchase their Sparkplug solution from any vendor whose solution has successfully passed the corresponding Sparkplug TCK and who remains in compliance with the Eclipse Foundation TCK License.
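The Sparkplug topic namespace mentioned above has a fixed structure: namespace, group ID, message type, edge node ID and an optional device ID. The sketch below assembles such a topic string; the group, node and device names are hypothetical.

```python
SPARKPLUG_NAMESPACE = "spBv1.0"  # Sparkplug B namespace element
MESSAGE_TYPES = {"NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
                 "NDATA", "DDATA", "NCMD", "DCMD"}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Build a Sparkplug B topic:
    spBv1.0/<group_id>/<message_type>/<edge_node_id>[/<device_id>]
    """
    if message_type not in MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = [SPARKPLUG_NAMESPACE, group_id, message_type, edge_node_id]
    if device_id:
        parts.append(device_id)
    return "/".join(parts)

# Hypothetical plant names; DDATA carries device data.
topic = sparkplug_topic("PlantA", "DDATA", "Line3Gateway", "Filler01")
# -> "spBv1.0/PlantA/DDATA/Line3Gateway/Filler01"
```

The fixed namespace is what lets any compliant MQTT application discover and interpret traffic from any Sparkplug edge node without prior coordination.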

See Automation.com articles about OPC UA and MQTT.

 


Internet protocol communications


The industrial edge plays an important role in defining how Internet protocol (IP) communication transports transform industrial systems with open IP-based transport protocols. These include single-pair Ethernet (SPE), Ethernet-APL and 5G wireless private networks, a particular advantage for intelligent sensors, actuators and other industrial field devices.

Single-pair Ethernet. The industrial edge is entering mainstream computing and IoT with the integration of the single-pair Ethernet standard 10BASE-T1, making embedded IP communications cost-effective in end field devices, including sensors and actuators. Ethernet-based networks supporting industrial controls and automation leverage the advantages of Ethernet infrastructure products produced in high volume, including lower costs of hardware, software and support. SPE is finally the way to unlock more information directly from sensors, actuators, drives, motor starters and other devices.

SPE network technology (IEEE 802.3cg) provides communications over two wires using the Internet Protocol. SPE delivers standard unmodified Ethernet built on IP to enable intelligent field devices including sensors, motor controls and actuators to achieve industrial digitalization and accomplish the vision of Industry 4.0. SPE leverages standard IP message routing to deliver data anywhere in an Ethernet architecture.

SPE has significant engineering, maintenance and installed-cost advantages over standard Ethernet, with more than 75% smaller cable diameter, reduced weight and cost, and roughly 30% better bend radius than CAT 5. It also offers the potential to reuse existing installed twisted-pair field wiring to carry SPE communications, simplifying plant and machine retrofits. The standard also provides a power over data line (PoDL) option with up to 50 watts of power for edge devices.

There is an SPE multidrop option in 802.3cg with auto-negotiation at 10 Mbit/s, PoDL, 16 device drops and 50-meter length. Multidrop for sensor networking has tremendous installed-cost advantages over point-to-point networking.

Ethernet-APL. Ethernet-APL is a ruggedized, two-wire, loop-powered Ethernet physical layer that uses 10BASE-T1L plus extensions for installation within the demanding operating conditions and hazardous areas of process plants to directly connect field devices. Ethernet-APL enables process industries to benefit from the integration of the entire process manufacturing business including automation, OT and IT systems. Ethernet-APL configurations include intrinsically safe circuits suitable for Zone 0, Zone 20, or DIV 1 installations. Since Ethernet-APL is logically Ethernet, any industrial network protocol devices that electrically conform to 10BASE-T1L Ethernet physical layer standard (IEEE 802.3cg-2019) can take advantage of this physical layer.

EtherNet/IP, Profinet and other protocols can run simultaneously on an Ethernet-APL network as they do today on standard Ethernet with the same bandwidth and latency issues. This provides for a transition from these legacy protocols to new open intelligent protocols.

5G wireless private networks. Wireless 5G private networks are emerging in manufacturing as a method to support mobile workers and to monitor and control equipment. Industrial digitalization requires putting reliable, timely and actionable information for real-time control in the hands of stakeholders, including process operators, maintenance technicians, and environmental health and safety and supply chain personnel. This information originates in many areas, including production plants and outdoor areas beyond the reach of Wi-Fi and public cellular signals.

Private cellular is enabling digitalization goals, taking advantage of analytics, machine learning and digitally guided procedures with pervasive communications throughout operations. The broad use of wireless 5G technology benefits from the economies of scale that have refined the technology and lowered its cost. Wireless 5G is superior to industrial Wi-Fi because of higher speeds, easy deployment and lower initial and lifecycle costs. Older controllers without Ethernet connections can be interfaced to the plant system network using Ethernet gateways, which are available from several suppliers.

MQTT. MQTT is an OASIS standard for IoT connectivity. It is a publish/subscribe, extremely simple and lightweight messaging protocol designed for constrained devices and low bandwidth, high latency, or unreliable networks. The design principles are to minimize network bandwidth and device resource requirements while attempting to ensure reliability and some degree of assurance of delivery. These principles also turn out to make the protocol ideal for the IoT world of connected devices, and for mobile applications where bandwidth and battery power are at a premium.
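MQTT's lightweight design shows up in its topic scheme: subscribers filter the publish/subscribe stream with "+" (single-level) and "#" (multi-level) wildcards. The matching rules can be sketched in a few lines of Python (an illustrative matcher only, not a full broker implementation, and it ignores edge cases such as "$" system topics):

```python
def topic_matches(filter_: str, topic: str) -> bool:
    """Return True if an MQTT topic filter matches a concrete topic.

    Implements the '+' (single-level) and '#' (multi-level) wildcard
    rules described in the MQTT specification.
    """
    f_parts = filter_.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            # '#' must be the last level and matches all remaining
            # levels, including zero levels
            return i == len(f_parts) - 1
        if i >= len(t_parts):
            return False
        if f != "+" and f != t_parts[i]:
            return False
    return len(f_parts) == len(t_parts)
```

For example, the filter `plant/+/temperature` matches `plant/line1/temperature` but not `plant/line1/pressure`, while `plant/#` matches every topic under `plant`.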



Resources related to communications




See Automation.com articles about connectivity and communications.

 


Open source controller software


Many options exist for programming industrial computers and embedded SoC CPUs using open-source controller software based on the IEC 61131 and IEC 61499 standards. Unlike the computer industry, the industrial and process controls industry has relied on closed-architecture devices using unique application programming software and runtime software engines. Many vendors provide functionally equivalent programming based on the IEC 61131 standard that is nonetheless unique to their proprietary controllers. In addition, PLCopen has defined functions that have been adopted by many vendors.

These IEC standards have enabled the use of standard industrial computers and embedded CPUs to implement control and automation systems using standard IEC languages. The basic architecture consists of a control and automation integrated development environment (IDE), a runtime control engine, input/output (I/O) drivers and communications drivers. Users create an application in the IDE and, on most platforms, perform basic simulation functions, automatic program documentation and online debugging.

The created program is downloaded to a target control execution engine in an edge platform that runs the user-created program. Traditionally, the edge platform has been proprietary PLC or DCS hardware. Today, a wide range of hardware options are available, including industrial computers, SoC CPUs and embedded processors.

 


Figure 3: IEC standards have enabled the use of standard industrial computers and embedded CPUs to implement control and automation with standard IEC languages.




IEC 61131. IEC 61131 was first published in 1993, and the current edition was published in 2013. IEC 61131 has been widely adopted throughout the world, as illustrated by the number of certified vendors listed on the PLCopen website. The IEC 61131-3 standard defines five programming languages for control and automation programming:


  • Function block diagram (FBD): Visual drag-and-drop programming.
  • Ladder diagram (LD): A graphical language that represents electrical relay logic, with functions including timers, counters, proportional-integral-derivative (PID) controllers, communications and analytics.
  • Structured text (ST): A high-level language structured and syntactically similar to other computer programming languages. Functions include IF-THEN-ELSE and CASE statements, mathematical functions (e.g., square root, transcendental), arrays, structures and data transformation functions.
  • Sequential function chart (SFC): Program flow control (e.g., machine control, packaging machines, ISA88 batch process control), a common language for PLC programmers.
  • Instruction list (IL): A lightweight programming language for applications with limited CPU and memory.


IEC 61131-3 functions have inputs and outputs with strong data types. IEC 61131 data type syntax includes Boolean, integer, real, string, array, structure and user-defined types. Users can create functions and function blocks using the languages mentioned above as well as standard programming languages (e.g., C/C++, Python). Most IDE implementations include engineering tools that allow users to connect to field control engines for online debugging, including breakpoints, watch windows, strip chart recorders, trend graphs and integrated HMI.
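Function blocks like these carry strongly typed inputs and outputs plus internal state that persists between invocations. As a rough illustration, a standard IEC 61131-3 TON (on-delay timer) function block can be sketched in Python and executed in a simplified cyclic scan (the 10 ms scan time and 30 ms preset are illustrative assumptions, not a real PLC runtime):

```python
class TON:
    """Sketch of an IEC 61131-3 TON (on-delay timer) function block.

    Output Q goes TRUE after input IN has been TRUE for at least PT
    milliseconds. Elapsed time (ET) persists between cyclic scans,
    as it would in a PLC runtime.
    """
    def __init__(self, pt_ms: int):
        self.pt = pt_ms   # preset time
        self.et = 0       # elapsed time
        self.q = False    # output

    def __call__(self, in_: bool, dt_ms: int) -> bool:
        if in_:
            self.et = min(self.et + dt_ms, self.pt)
        else:
            self.et = 0
        self.q = self.et >= self.pt
        return self.q

# Cyclic scan: read inputs, solve logic, write outputs
timer = TON(pt_ms=30)
outputs = []
for scan in range(5):       # five scans at an assumed 10 ms scan time
    sensor = True           # input image (here: always TRUE)
    outputs.append(timer(sensor, dt_ms=10))
# outputs -> [False, False, True, True, True]
```

With a 30 ms preset and a 10 ms scan, the output turns TRUE on the third scan, once the accumulated time reaches the preset.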

IEC 61499. IEC 61499 was initially published by the International Electrotechnical Commission (IEC) in 2005. The specification of IEC 61499 defines a generic model for DCS and is based on the IEC 61131 standard. In IEC 61499, the cyclic execution model of IEC 61131 is replaced by an event-driven execution model. IEC 61499 enables an application-centric design in which one or more applications, defined by networks of interconnected function blocks, are created for the whole system and subsequently distributed to the available devices. All devices within a system are described within a device model. The topology of the system is reflected by the system model. The distribution of an application is described within the mapping model. Therefore, applications of a system are distributed but maintained together.

Like IEC 61131-3 function blocks, IEC 61499 function block types specify both an interface and an implementation. In contrast to IEC 61131-3, an IEC 61499 interface contains event inputs and outputs in addition to data inputs and outputs. IEC 61499 defines several function block types, all of which can contain a behavior description in terms of service sequences.

The IEC 61131 program execution model is deterministic: each cycle reads inputs, solves the program logic defined by the user and writes outputs. In contrast, IEC 61499 has an event-driven execution model in which the user explicitly defines the execution sequence when building the program. This event-driven model introduces a level of complexity that must be managed as applications grow larger. The current OPAF testbeds have used both IEC 61131 and IEC 61499 successfully.
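The contrast between the two execution models can be sketched in Python: in the event-driven style, a function block runs only when an event arrives, and its output event triggers downstream blocks in the network (a toy illustration of the idea, not full IEC 61499 semantics; the scaling and alarm blocks are hypothetical):

```python
class EventFB:
    """Minimal sketch of an IEC 61499-style event-driven function block.

    The block executes only when it receives an event; its output
    event triggers the next block in the network. There is no cyclic
    scan: nothing runs until an event arrives with fresh data.
    """
    def __init__(self, algorithm, downstream=None):
        self.algorithm = algorithm
        self.downstream = downstream

    def fire(self, data):
        result = self.algorithm(data)
        if self.downstream is not None:
            # Output event propagates to the connected block
            return self.downstream.fire(result)
        return result

# A two-block network: scale a raw sensor value, then threshold it
alarm = EventFB(lambda x: x > 80.0)
scale = EventFB(lambda raw: raw * 0.1, downstream=alarm)

high = scale.fire(850)   # 850 * 0.1 = 85.0 -> above threshold
low = scale.fire(500)    # 50.0 -> below threshold
```

In a distributed IEC 61499 system, the two blocks could be mapped to different devices, with the event and data connection carried over the network.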

IEC 61499 models a distributed control system allowing automation applications to run across networks independent of the underlying hardware. This can be an advantage but is more complicated because the network bandwidth, quality of service (QoS) and reliability become important considerations for control availability, reliability and automation performance.

Processors and networking technologies have advanced significantly allowing the selection of higher-power computing platforms and network technologies.

In contrast to proprietary controller companies, software companies—examples are CODESYS and Straton—are offering IEC 61131 IDE and runtime software engines. The Eclipse Foundation’s 4diac project is open-source IEC 61499 software, while UniversalAutomation.org has created a controlled-source IEC 61499 runtime environment. (See the article “Open Automation Systems: An Update on the State of the Art” elsewhere in this issue.)

Eclipse Foundation 4diac/Forte. The Eclipse Foundation 4diac project created open-source IEC 61499 software provided under the Eclipse Public License, Version 2.0. The project includes the 4diac IDE, which is based on the Eclipse open-source framework, allowing easy integration of other plug-ins that provide new or extended functionality.

The Eclipse IDE is an established platform for general computer programming. IEC 61499-based systems follow an application-centric design, which means that the application of the overall system is created first. Each application is created by interconnecting the desired function blocks in terms of a function block network (FBN). As soon as the hardware structure is known, it can be added to a project’s system configuration and the already existing application can be distributed onto the available devices.

The 4diac FORTE is a small, portable implementation of an IEC 61499 runtime environment targeting small embedded control devices (16/32-bit), implemented in C++. Supported operating systems include eCos, NET+OS 7, POSIX (Cygwin; Linux on i386, PPC and ARM), rcX, VxWorks, PikeOS, Windows and FreeRTOS. It supports online reconfiguration of its applications and real-time-capable execution of all function block types provided by the IEC 61499 standard.

4diac FORTE supports all IEC 61131-3, edition 2 elementary data types, structures and arrays. It provides a scalable architecture that allows 4diac FORTE to adapt to the needs of your application. Applications can consist of any IEC 61499 element as basic function blocks (BFBs), composite function blocks (CFBs), service interface function blocks (SIFBs), adapters and sub-applications.

The 4diac function block library (4diac LIB) contains function blocks, which are available on the 4diac FORTE and can therefore be used to create IEC 61499-compliant control applications.

See Automation.com articles about open systems.

 


OPC Foundation field level communications


OPC Foundation field level communications (FLC) is modernizing the most basic industrial communications, bringing mainstream-computing semantic/contextual data to the industrial edge, including sensors, actuators and all forms of field devices. OPC UA FX is the first IP field-device approach incorporating globally standard semantic contextual data connectivity, and it is a serious contender to become the unifying industrial protocol supporting open-architecture, multivendor industrial digitalization.

OPC FLC is the first multivendor, open-standard semantic contextual data connectivity solution between sensors, actuators, controllers, the enterprise and the cloud that meets the requirements of industrial, factory and process automation. OPC UA FX continues to make rapid progress in bringing mainstream computing data concepts to the industrial edge.

 


Figure 4: The UA Cloud Library makes OPC UA information models available in the cloud on a global scale providing users with an efficient way to find and use OPC models.




The OPC UA FX specifications also focus on controller-to-controller (C2C) communications and OPC UA Safety Stack and extensions for safety.

OPC UA FX can be used to transport data over any IP network; it inherently supports a wide range of transports. Ethernet APL two-wire Ethernet for process automation and hazardous locations is based on IEEE and IEC standards with preparations for APL testing in the OPCF Certification Lab. The OPC Foundation is working closely to align with the time-sensitive networking (TSN) profile for industrial automation (TSN-IA-Profile), which will be standardized by the IEC/IEEE 60802 Standardization Group. This will help ensure that a single, converged TSN network approach is maintained so that OPC UA can share one common multivendor TSN network infrastructure with other applications.

See Automation.com articles about OPC FLC.

 


Digital twins


The digital twin has become one of the most powerful concepts of Industry 4.0. The concept should be familiar to automation and controls people, since it is a higher level of closed-loop control that ideally incorporates all the factors of a manufacturing business that affect production efficiency and profitability, including incoming material quality, order flow, economic factors, customer orders, production plans, work-in-process (WIP) flows and machine efficiencies. A digital twin is a virtual representation of a real-world process, constantly updated from its real-time twin to achieve complete manufacturing closed-loop control that is optimized and responsive to changes.

By implementing model-based, real-time, closed-loop monitoring, control and optimization of the entire manufacturing and production process, the digital twin concept is helping organizations achieve real-time integrated manufacturing. The digital twin's virtual model of ideal manufacturing operations and processes constantly benchmarks actual production metrics in real time, providing a wealth of information that organizations use to identify and predict problems before they disrupt efficient production. The digital twin is a prominent example of practical macro-level closed-loop control that is now feasible with the advanced hardware, software, sensor and systems technology available.
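The benchmarking idea can be sketched in a few lines of Python: the twin's ideal state is compared against live production metrics, and deviations beyond a tolerance band are flagged for attention. All metric names and values here are illustrative assumptions, and a real digital twin would benchmark against a dynamic model rather than static setpoints:

```python
def benchmark(expected: dict, actual: dict, tolerance: float = 0.05) -> list:
    """Compare live production metrics against the twin's ideal state.

    Returns the metrics that deviate from the ideal by more than the
    relative tolerance (or that are missing from the live data).
    """
    deviations = []
    for metric, ideal in expected.items():
        measured = actual.get(metric)
        if measured is None:
            deviations.append((metric, "no data"))
        elif abs(measured - ideal) > tolerance * abs(ideal):
            deviations.append((metric, measured))
    return deviations

# Hypothetical ideal state vs. live readings
ideal_state = {"throughput_upm": 120.0, "scrap_rate": 0.02, "oee": 0.85}
live_state  = {"throughput_upm": 118.0, "scrap_rate": 0.05, "oee": 0.84}
alerts = benchmark(ideal_state, live_state)
# only scrap_rate falls outside the 5% band
```

Throughput and OEE stay within 5% of the ideal, so only the elevated scrap rate is flagged, prompting investigation before it disrupts production.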

A critical part of digital twin creation is assembling a complete information set, including real-time information captured by a wide range of sensors. Common strategies for collecting this information include leveraging existing connected sensors, adding new sensors to existing PLCs and controllers, installing edge devices and installing smart sensors.

Leveraging existing connected sensors. This is typically the popular first step since it does not require physical installation of new sensors. What it does require is application engineering and a software project to link information to the IT network. It may also require new software to be added to SCADA, PLC, HMI and/or DCS systems to accomplish communication with enterprise and other systems.

Adding new sensors. If there are unused sensor interfaces on the controller or available slots to add new interface cards, which can accommodate more sensors, then adding new sensors to existing controllers can be an option. This also requires application engineering to add these sensors to the program in the controller. It may also require the addition of new software to HMI and DCS systems to facilitate communication with enterprise and other systems. In this strategy, there is a risk that making changes in these controllers and systems will create performance and operating issues, so it may require a significant amount of systems and application engineering to ensure reliable operation.

Installing edge devices. In addition to practical concepts, like digital twins, the IIoT has led to companies bringing a wide range of edge devices to market. These edge devices are designed to capture information and communicate directly to enterprise systems and cloud applications, particularly AWS and Microsoft Azure. Many new sensors are not required to be part of the control and automation strategies in the plant but are required to monitor operating parameters for a complete digital twin and close the information loop. Edge devices typically connect directly to the IT network. The advantage to this is that they are non-intrusive, having no or very minimal impact on existing control software architecture. This can be an efficient way to communicate directly with production, maintenance and business systems.

Installing smart sensors. New classes of smart sensors are emerging that can communicate directly with production, maintenance and business systems. Wireless sensors can be an efficient way to acquire data with standard technology, including WirelessHART and ISA100, primarily used in process applications. For discrete points, the IO-Link wireless version is an option. Some sensors also can communicate over standard wireless Ethernet Wi-Fi using various software interfaces.

OPC UA is emerging as a fundamental technology for implementing the digital twin. Digital Factory OPC UA technology provides an efficient and secure infrastructure for the communications of contextual information, from sensors to business enterprise computing, for all automation systems in manufacturing and process control. OPC UA is leveraging the accepted international computing standards and putting automation systems on a level playing field with the general computing industry.

OPC UA uses common computing industry-standard Web services, which are the preferred method for system communications and interaction for all networked devices. The World Wide Web Consortium (W3C) defines a Web service as “a software system designed to support interoperable machine-to-machine (M2M) interaction over a network.” This is precisely the task of automation systems. OPC UA is being built into many sensors and other devices, to simplify the communication process.

See Automation.com articles about digital twins.

 


Intelligent sensors


Embedding intelligence in sensors, a foundational part of Industry 4.0 concepts, is a growing trend. Sensors communicate with controls and automation systems and, simultaneously, directly with business systems. Intelligent sensors are also part of the NAMUR Open Architecture (NOA), a collaboration with VDI/VDE and Fraunhofer ICT.

IoT is becoming a reality with sensors and actuators embedded in physical objects—from roadways to pacemakers—linked through wired and wireless networks and leveraging internet protocol. Industrial controllers are starting to follow this trend by providing data refinement, local historians, analytics and advanced control at the source in end devices.

Modern controllers communicate with all levels of systems using the “IP plumbing” that is pervasive in manufacturing plants, including capabilities to send email, transfer files via FTP and serve up Web pages. These devices incorporate powerful new SoC CPUs to simplify automation architectures. The new breeds of industrial controllers and embedded industrial end devices are incorporating this power and adding features that include embedded Web servers, email clients and Web services. These capabilities enable field devices, including sensors, motor controls and actuators, to communicate directly with controllers, enterprise systems and cloud applications.

It is now common to see dual-core CPUs in controllers, and several companies have announced quad-core-based controllers. These more powerful industrial controllers are becoming automation computing engines that are starting to collapse the typical five-level model and make automation systems more flexible and responsive.

The incorporation of higher-level functions directly into this new breed of powerful industrial controllers is starting to eliminate the need for middle-level software. Middle-level software and computers have served their purpose of buffering, synchronizing, translating and refining sensor and controller information. But they have also created a great number of middle-level computers, databases and software programs that are expensive and difficult to maintain.

The interim solution is a migration to more powerful computers and the virtualization of existing middle-level software. This migration and virtualization improve performance and centralize software maintenance and configuration control. Over time the functions of this middle-level software are being taken over by the new more powerful controllers. The new high level of communications and computing at end devices is opening the possibilities for holistic and adaptive automation to increase efficiency. This is a logical evolution in step with the Internet of Things trend and will lead to more responsive and efficient production.

See Automation.com articles about intelligent sensors.

 


Spatial computing/intelligent vision


Spatial computing enables computers to blend with the physical world in a natural way, seamlessly bringing together the virtual and physical worlds. Spatial computing brings people into the digitalization loop, empowering them to dramatically increase manufacturing operations efficiency and creating experiences and applications that were previously impossible. Spatial computing devices display the real world and, simultaneously, real-time operating parameters in a way that appears three-dimensional. The number of smart glasses and helmets has grown dramatically, and their progression from consumer- to industrial-grade is accelerating adoption.

Features include integrated audio for hands-free operation and communication with other workers and remote experts. These devices can also include multiple 360-degree cameras, Wi-Fi, Bluetooth and GPS that can be used for personnel tracking in hazardous and safety areas. There are versions integrated into industrial helmets for special and hazardous area requirements.

Workers can use this equipment to bring up assembly instructions, procedures and operating manuals displaying step-by-step instructions in the worker’s field of vision. In assembly areas, workers can be guided with pick-by-vision instructions including customer order information. Assembly of individual items can be confirmed with voice-controlled barcode scans, using the camera built into the glasses.

Maintenance personnel have hands-free access to manuals, repair guides, graphical plant diagrams and troubleshooting tips along with assistance on machinery procedures and remote experts, as well as receiving early warnings of safety risks. These solutions can enable plant personnel who need to look and hear equipment and processes to diagnose issues remotely, giving them information to prepare the proper tools and potential repair parts before physically going in the field.

Rather than a supervisor physically having to come to help a production line worker, augmented reality (AR) allows the supervisor to see exactly what the worker is seeing and provide help remotely. This allows organizations to multiply their experienced personnel and efficiently provide valuable mentoring to new people.

Further enhancements include adding QR codes or signs to machines, work cells and process equipment. These codes can be used to automatically bring up information on smart devices, making work simpler and safer.

Another viable technology is the use of wired and wireless industrial video cameras—some including audio—which can be used to keep track of machines and process vital signs remotely. Combined with image recognition software, videos can be used for real-time closed-loop quality monitoring and control.

A great example is a factory worker using a smartphone, tablet or smart glasses to simultaneously view a physical machine, real-time variables and technical manuals. Spatial computing is related to both AR and virtual reality (VR). AR overlays digital content onto the real world, typically using a phone or smart glasses. Mixed reality (MR) employs a blend of AR and VR to enhance the user’s understanding of operations, for example, showing a representation of the inside of a machine along with its real-time operating data.

Devices that employ spatial computing might also have speech recognition features to support voice commands, enabling hands-free operation. In addition, people can collaborate with remote experts who can see the same information and can advise.

While using robotic systems helps with general efficiency and productivity of an assembly plant, there are additional benefits that accompany the incorporation of a vision system with the robot. A robotic vision system consists of one or more cameras connected to a computer. The computer contains a processing software program that helps the robot interpret what it sees, for example identifying parts in assembly processes without requiring specific placement and performing real-time quality analysis.
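The interpretation step can be illustrated with a deliberately simple Python sketch that classifies a part by the pixel area of its thresholded camera image. Real vision systems use feature matching or learned models rather than raw area, and the part names and areas here are hypothetical:

```python
def identify_part(mask, known_parts, tolerance=0.1):
    """Toy vision step: classify a part by the area of its binary mask.

    mask        - 2D list of 0/1 pixels from a thresholded camera image
    known_parts - {name: expected area in pixels}
    Returns the best-matching part name, or None if no part's area is
    within the relative tolerance.
    """
    area = sum(sum(row) for row in mask)
    best, best_err = None, tolerance
    for name, expected in known_parts.items():
        err = abs(area - expected) / expected
        if err < best_err:
            best, best_err = name, err
    return best

# An 8-pixel blob from a (tiny) thresholded image
mask = [
    [0, 1, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 1, 0],
]
part = identify_part(mask, {"washer": 8, "bolt": 20})
# -> "washer"
```

Because the blob's area matches the expected washer area exactly and is far from the bolt's, the sketch identifies it as a washer without requiring specific part placement.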

See Automation.com articles about machine vision.

 


Connected worker technology


Employees can be empowered with mobile devices, giving them information and control capabilities that have traditionally been fixed in the control room so they can work more efficiently and effectively. Devices include smartphones, tablets and smart glasses incorporating front-facing high-definition cameras and audio and visual information. This capability has been available for some time, but costs have dropped significantly, driven by commercial and consumer products.

New technology is enabling remote monitoring capabilities to improve operational effectiveness. This presents users with opportunities and challenges to be evaluated for practical applications. The goal is to improve manufacturing or processing uptime and efficiency. Subject matter experts are becoming increasingly hard to find and companies need to find ways to use them more efficiently. The latest remote monitoring tools allow experts to analyze problems and abnormal situations and determine ways to improve and optimize operations without traveling to the site.

Worker productivity and responsiveness are being improved with technologies that directly connect workers to manufacturing systems, making them an informed, integral part of production in real time. Falling costs and rising performance of mobile computing and communications technology continue to expand the capabilities and value of workers in production. The connection of workers is being accelerated by a wide and expanding range of commercial off-the-shelf technologies, including voice and video headsets, smart glasses, and virtual reality devices and systems, which provide workers with productivity enhancers including:


  • Manuals anywhere
  • Equipment identification and lookup
  • Real-time superimposed data
  • Audiovisual linking to subject matter experts
  • Direct access to production availability information.


 See Automation.com articles about connected worker.

 


Remote expert services


Connectivity and edge processors empower suppliers to offer remote expert monitoring services. Experts and analytic software continuously monitor controllers and control systems for abnormal situations and advise site personnel of current problems or predictions of future problems. Control suppliers that offer these services have experts and software that can quickly detect issues with the controllers, components and software that they provide.

Since most plants have equipment from multiple suppliers, the value of this service may be limited if the provider does not monitor all equipment and applications. In some general equipment and process control applications, contract experts can detect and advise on plant production issues. Subject matter experts in specific manufacturing and process areas can be used on demand for special problems and issues. A big advantage of the services approach is that a third party has a remote, 24/7 operations center to constantly monitor your systems.

Some providers may collect performance analytics information to learn how machinery is performing and provide alerts when data falls outside of predefined parameters. This requires the development of rules with input from plant staff because they understand the plant operations. Alternatively, manufacturing companies can run an inference engine with rules developed by plant staff that understand the dynamics of operations.
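A minimal version of such a rule set can be sketched in Python: predefined limits, developed with input from plant staff, are evaluated against the latest readings, and out-of-band values raise alerts. The signal names and limits are illustrative assumptions, not from a real system:

```python
def evaluate_rules(rules, sample):
    """Tiny inference sketch: flag readings outside predefined limits.

    rules  - {signal: (low_limit, high_limit)} developed with plant staff
    sample - {signal: latest reading}
    Returns an alert message for every out-of-band reading.
    """
    alerts = []
    for signal, (low, high) in rules.items():
        value = sample.get(signal)
        if value is not None and not (low <= value <= high):
            alerts.append(f"{signal}={value} outside [{low}, {high}]")
    return alerts

# Hypothetical machinery limits and a live sample
rules = {"bearing_temp_C": (0, 85), "vibration_mm_s": (0, 7.1)}
alerts = evaluate_rules(rules, {"bearing_temp_C": 92.0,
                                "vibration_mm_s": 3.2})
# -> ["bearing_temp_C=92.0 outside [0, 85]"]
```

The same rule table could be run by a remote operations center or by an on-premises inference engine maintained by plant staff who understand the dynamics of operations.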

Ultimately, when most problems and issues are identified, someone needs to be onsite with the right tools, information and spare parts to get things working. Determining the best methods to achieve improved uptime and efficiency is the overall challenge.

 

See Automation.com articles about remote expert services.

 


Robotics


The cost and ease of use of robotics have changed dramatically, particularly with collaborative robots (cobots). More possibilities are being created with the growing trend of modular industrial robot components that can be used to assemble the optimal robot structures for different applications on an individual and flexible basis. In addition, easy-to-use software tools are allowing people and plants to directly define robot actions without programming.

The 2023 annual report from the International Federation of Robotics says that robot installations hit a new record level of 553,052 units and, for the second year in a row, annual installations exceeded the 500,000-unit mark, adding another 5% to the previous record figure of 526,144 units installed in 2021.

The major customer industries, automotive and electronics, installed substantially more robots than in 2021. Supply chain disruptions and the scarcity of inputs as well as different local or regional headwinds still hampered the completion of projects, but the problems were less severe than in the previous year.

The electronics industry was the largest customer of robots, a position it gained in 2020 and has maintained since, claiming 28% (+1 pp) of all robots newly installed in 2022. The automotive industry followed with 25% of installations (+3 pp), growing in both the car manufacturer and the parts supplier segments. The metal and machinery industry retained its third place (12%; -1 pp), followed by the plastic and chemical products industry (4%) and the food and beverage industry (3%). Note that for 17% of the robot installations (-3 pp), there is no information on the customer industry.

Cobots. The application of robotics and particularly the growing use of cobots has become a high-return-on-investment opportunity for manufacturers. A cobot is a high-impact automation tool that can improve manufacturing in companies of all sizes and improve worker safety. Cobots are a new breed of lightweight and inexpensive robots, with safety features specifically designed to enable people to work cooperatively with these devices in a production environment.

Cobots can sense both humans and obstacles and respond by automatically stopping before they cause harm or damage. With these robots, protective fences and cages are not required, enabling flexibility and lowering implementation costs. Cobots are particularly attractive investments, with a typical cost of less than US $40,000. Robot software provides simplified programming, allowing deployment without hiring specialized engineers. The programming process, which involves moving the robot arms and end effectors to the desired positions, is a physical form of the popular computer programming concept called “what you see is what you get” (WYSIWYG). It is designed to be intuitive for users and has been proven in many implementations to broaden the application of the technology.
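The teach-by-demonstration idea can be sketched in Python: the operator physically moves the arm, records waypoints with their gripper states, and replays the recorded sequence as the program. This is a hypothetical pendant API for illustration; real cobot SDKs differ:

```python
class TeachPendant:
    """Sketch of WYSIWYG cobot programming: record poses by moving
    the arm, then replay them as the program."""

    def __init__(self):
        self.waypoints = []

    def record(self, joints, gripper):
        """Store the current arm pose (joint angles) and gripper state."""
        self.waypoints.append((tuple(joints), gripper))

    def replay(self):
        """Return the recorded program, in teach order, for execution."""
        return list(self.waypoints)

# Operator teaches a simple pick motion by guiding the arm
pendant = TeachPendant()
pendant.record([0.0, -90.0, 45.0], gripper="open")    # move above part
pendant.record([0.0, -45.0, 45.0], gripper="closed")  # descend and pick
program = pendant.replay()
```

No code is written by the operator: the "program" is just the ordered list of demonstrated poses, which the controller then interpolates between at runtime.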

Integrated vision and end effectors. Coupling robots with vision systems and image recognition software expands robot use for more free-form applications. Robots can grab dissimilar parts in assembly settings, pack boxes in shipping facilities, organize bins, load machine tools, inspect parts and perform many other helpful tasks.

End effectors include devices for picking up a wide range of parts of various types. An end effector is the last link (or end) of the robot device at the end of a robotic arm, designed to interact with the environment. In addition, end effectors with built-in tools are being used to perform industrial applications including grinding, sanding, welding, riveting, screwing bolts to a specific torque, spray painting, machine part tending and material handling.

The number of innovative robotic end effector devices has increased significantly. It is worth noting that robot use is also accelerating in other applications, including restaurants, construction and health care, creating a broader array of end effectors.

See Automation.com articles about robotics.

 


Automated material flow


Driverless vehicles, personal robots and other innovations may still be in the future for the average consumer, but for industry, the technologies to automate material flow and increase productivity and efficiency are available now. These technologies transform operations in support of Lean production methods, eliminating numerous non-value-added human touches, each of which requires manual double checks and associated activities such as adding handwritten tags to pallets of material.

Material flow status is synchronized in real time with physical production activities for the most productive process flow, and it is coordinated with warehouse management system (WMS) and quality control software. Work centers are streamlined, with minimal material buffer quantities required, thanks to synchronized, just-in-time (JIT) delivery of materials and assemblies.

Robotics, mechatronics, vision and other technologies are creating opportunities for automated material handling to improve productivity through JIT material flow to machines and people. This is important across manufacturing, including automated machining, batch production and manual assembly.

Integrated into industrial digitalization, automated material handling systems are complete, integrated solutions combining automated material handling equipment, software and controls, designed so that they can automatically move, sort, store or transport goods, products or materials within a facility or warehouse without direct human intervention.

An automated material handling system typically includes components that streamline material handling processes and optimize operational efficiency. Standard components include conveyor systems, linear magnetic transport systems, robots, a WMS, autonomous mobile robots (AMRs), automated guided vehicles (AGVs) and more.

Leveraging systems to improve operations is a primary driver for automated production material flow. This enables improved operations, including stage by component (SBC) delivery of every component required in the production process to an operator or assembly person when they need it. One advance allows AGVs to be deployed with laser navigation accurate to a quarter of an inch, eliminating the need for in-floor guide wires and combining IT, engineering and operations systems to manage material flow for production from the shop floor to the enterprise systems. Overall goals include:


  • System-driven quality and compliance
  • Ensuring correct material/batch deliveries
  • Confirming material quality status before use
  • Delivering real-time materials traceability at the point of use
  • Providing end-of-order reconciliation based on actual consumption
  • Confirming clearance of foreign materials before starting the next order.
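The first few goals above amount to system-driven checks that run before material ever reaches a work center. A minimal sketch, with invented field names and status values, might look like this:

```python
def release_material(batch: dict, work_order: dict) -> tuple:
    """Hypothetical sketch of system-driven delivery checks: verify the batch is
    the material the order calls for, that quality control has released it, and
    that the quantity is sufficient, before it is delivered to a work center."""
    if batch["material_id"] != work_order["required_material"]:
        return (False, "wrong material for this order")
    if batch["quality_status"] != "released":
        return (False, f"batch {batch['batch_id']} not released by QC")
    if batch["quantity"] < work_order["required_quantity"]:
        return (False, "insufficient quantity")
    return (True, "deliver to work center")


# Example: a QC-released batch that matches the order is cleared for delivery.
batch = {"batch_id": "B-1001", "material_id": "RES-20",
         "quality_status": "released", "quantity": 50}
order = {"required_material": "RES-20", "required_quantity": 40}
ok, reason = release_material(batch, order)
```

Because every delivery passes through checks like these, the same records also support point-of-use traceability and end-of-order reconciliation based on actual consumption.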


Automated material flow also facilitates the Lean 5S method of workplace organization at manual workstations, where the location of everything in the workspace is defined and clearly marked, and material is delivered as required based on production plans.

 

See Automation.com articles about material handling.

This feature was originally published in AUTOMATION 2024: 9th Annual Industrial Automation & Control Trends Report.



About The Author


Bill Lydon is editor emeritus of Automation.com and ISA’s InTech magazine. He has more than 25 years of experience designing and applying automation and control technology, including computer-based machine tool controls, software for chiller and boiler plant optimization, and a new generation building automation system. Lydon was also a product manager for a multimillion-dollar controls and automation product line, and later cofounder and president of an industrial control software company.




