Power – IIOT Connection (https://www.iiotconnection.com): Connecting Innovations with Insight

POWER magazine and Chemical Engineering magazine announce Eastman Chemical as the Host Chemical Process Industries (CPI) Sponsor for the 5th annual Connected Plant Conference (Wed, 21 Apr 2021)

HOUSTON, TX, April 21 – The 5th Annual Connected Plant Conference is proud to announce Eastman Chemical as the host Chemical Process Industries (CPI) sponsor. The 2021 event will convene at The Renaissance Austin Hotel in Austin, Texas, August 30 – September 2. Hosted by Chemical Engineering and POWER magazines, the Connected Plant Conference will provide attendees with the latest digital monitoring, diagnostic, analytics, Industrial Internet of Things, and decision-support technology for the power generation and chemical process industries.

Attend this one-of-a-kind event to develop your digital roadmap, benchmark where you are compared to the rest of the industry, and get a better understanding of all the technology available right now.

Dorothy Lozowski, Editorial Director of Chemical Engineering magazine, said, “For the fifth year in a row, Chemical Engineering will co-host this digital transformation event. Attendees will learn how others are adopting the latest digital technologies for process, product and enterprise improvements. We are thrilled that our industry colleagues Eastman Chemical are supporting the Connected Plant Conference, showing their support of peer-to-peer networking and advancement of digital technology solutions."

The Advisory Board and editors of Chemical Engineering and POWER magazines are hard at work to bring vital content to this year's Connected Plant Conference. From the dynamic world of artificial intelligence and machine learning to the enabling technologies driving advanced connectivity and data analytics, leaders will come away with tools to increase efficiency, productivity, reliability, and resiliency.

Attendees will also have the opportunity to meet with leading digital solution providers in the networking arena to discuss specific plant challenges and the digital technologies available to solve them.

Make plans today for your team and you to join Eastman Chemical at the Connected Plant Conference in Austin, August 30 – September 1, 2021. Registration is open, so visit our website today to register and receive the lowest rate available.

About Chemical Engineering magazine
Chemical Engineering magazine was launched in 1902 and is the most widely respected global information source for the chemical process industries (CPI). Chemical Engineering has been the leading source for news, technology and analysis used by engineers, operators, plant managers, senior managers and consultants worldwide. This combination of technology, analysis and experience makes Chemical Engineering the primary publication for the most important and influential people in the industry.

About POWER magazine
POWER, the single global resource for print, media, and events in the power and energy industry, was established in 1882 and is the only industry publication that addresses all power generation and related technologies and fuels utilized throughout the world, providing news and information for this increasingly complex sector. The POWER brand is dedicated to providing its global audience with exclusive analyses of the latest trends, best practices, and insight on power generation and related projects through several platforms, including print, digital, and in-person events. POWER equips generation professionals and those who support them with the resources they need to make informed decisions that power the future.

About Access Intelligence
Access Intelligence is a privately held b-to-b media and information company headquartered in Rockville, MD, serving the marketing, media, PR, cable, healthcare management, defense, energy, infrastructure, engineering, satellite and aviation markets. Leading brands include AdExchanger, AdMonsters, Chemical Engineering, Cynopsis, Cablefax, Chief Marketer, Defense Daily, Event Marketer, LeadsCon, POWER, Via Satellite and P3C. Market-leading conferences and trade shows include LeadsCon, AdMonsters OPS and Publisher Summits, Experiential Marketing Summit, SATELLITE Conference and Exhibition, OR Manager Conference, LDC Gas Forums, Clean Gulf, Experience POWER and The P3C Conference & Expo.

Self-Tuning Artificial Intelligence Improves Plant Efficiency and Flexibility (Wed, 14 Apr 2021)

Flexible plant operations are highly desirable in today's power generation industry. Every plant owner desires increased ramp rates and the ability to operate at lower loads so their plants will remain "in the money" longer in today's competitive power markets. This goal, while laudable, remains elusive. The ADEX self-tuning artificial intelligence (AI) system allows plants to continuously optimize plant performance at any operating point rather than being constrained to a static "design point" commonly found in gas- and coal-fired plants. Better yet, no changes to the plant distributed control system (DCS) are required.

Renewable energy resources, coupled with internet technologies and emerging large-scale energy storage, are fundamentally changing modern electricity markets. Wind and photovoltaic resources have proved to be the most disruptive renewable energy technologies of the past few years and will continue to be so. The challenge for grid operators is managing these disparate electricity providers to match weather-dependent supply with consumers demanding more independence and flexibility to manage personal demand patterns. The solution is multi-layered. A new regulatory framework is needed to address accelerating market changes, and electricity generators must evolve their business models to remain competitive. Technology developers must provide regulators and generators with the necessary computer tools to ensure a reliable and efficient electricity supply system continues.

Not all electricity suppliers are equipped to compete in this new electricity marketplace, particularly those with significant coal-fired infrastructure. Initially designed to operate at baseload for months on end, coal plants are often disadvantaged in a competitive electricity market that values flexibility in plant operations, particularly in regions with sizeable amounts of renewable resources. Cycling most coal plants adversely affects equipment life, startup is time-consuming, and operating at low load for extended periods of load-following is challenging. Gas-fired plants, principally combined cycle plants (CCPs), feel the competitive heat from increased grid-connected renewable energy resources placed higher in the dispatch order. The common need of conventional steam plants and more modern CCPs is a plant control system designed for more efficient and flexible operation.

What does "flexible operation" mean to the power plant owner? As with any competitive market, the plant that can deliver electricity under any operating condition will be called upon more often than plants unable to respond to the modern grid's dynamic load change characteristics. Flexible plant operation is usually stated in terms of a plant's minimum continuous operating load, its ramp rates, and its ability to start quickly when grid imbalances necessitate, all while remaining in compliance with regulatory requirements. If a plant, coal or gas, cannot perform these three functions, then that plant is at a significant competitive disadvantage. As a result, regional system operators are forcing thermal power plants to run under new operating conditions, much closer to their constraints and outside the stable zones defined by their control systems (Figure 1).

1. Thermal plants are tuned for stable operation under a fixed set of operating conditions. These plants must now operate more flexibly to remain "in the money." Courtesy: ADEX

The Control Instability Problem

Thermal power plants under automatic control, like many other industrial plants, can suffer instability problems, particularly when power generation has to meet changes in demand. The main reason is that current automatic control systems are designed with fixed parameters, while the process dynamics are time-varying.

Several advanced process control (APC) systems are available on the market for power plants, based on various control methodologies such as fuzzy logic, neural networks, or model predictive control. Each applies its control actions through well-known proportional-integral-derivative (PID) controllers at the DCS level, which interface with the plant's actuators and sensors.

Each control methodology also shares a common paradigm: its parameters are set to control a model of the process that was developed under defined operating conditions. However, when the time-varying process dynamics diverge from the model dynamics, the control system and the PID parameters become detuned, and the control performance can become unstable. When control performance is erratic, flexible plant operation is limited, and dispatched hours are reduced.
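To see why fixed tuning breaks down, the toy simulation below runs the same fixed-parameter PI loop against a first-order-plus-dead-time process at two different effective process gains. It is a minimal sketch only; the process model, tuning values, and numbers are illustrative assumptions, not any plant's or vendor's control code.

```python
from collections import deque

def simulate(process_gain, kp=1.25, ki=0.25, setpoint=1.0,
             dt=0.1, dead_time=2.0, tau=5.0, steps=600):
    """Fixed-parameter PI loop around an assumed first-order-plus-dead-time process."""
    n_delay = round(dead_time / dt)
    delay = deque([0.0] * n_delay, maxlen=n_delay)   # transport-delay buffer
    y, integral, trace = 0.0, 0.0, []
    for _ in range(steps):
        error = setpoint - y
        integral += error * dt
        u = kp * error + ki * integral     # PI law tuned once, at the design point
        u_applied = delay[0]               # actuator effect reaches the process late
        delay.append(u)
        y += dt * (-y + process_gain * u_applied) / tau
        trace.append(y)
    return trace

def swing(trace):
    tail = trace[len(trace) // 2:]         # ignore the initial transient
    return max(tail) - min(tail)

# Tuned at the design point (gain 1.0) the loop settles; at a different operating
# point the effective process gain is higher and the same parameters oscillate.
print("design point peak-to-peak swing:", round(swing(simulate(1.0)), 3))
print("off-design   peak-to-peak swing:", round(swing(simulate(4.0)), 3))
```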

Some facilities have turned to AI as a means to improve plant response. The drawback to AI is the lengthy period of "deep learning," including "big data" collection under all possible operating scenarios before an AI controller can function independently. However, many of these technologies have proven to be more hype than hope, and the power industry has tested and rejected many AI-enabled solutions. The "control problem" now becomes correctly identifying the new dynamic relationship between control system input-output variables at all possible operating states, not just those that are predictable. For a modern steam plant, this is an impossible task.

Self-tuning AI: The ADEX Controller

Many electricity generators frustrated with poor performance in a competitive marketplace may be surprised to learn that their plants are mechanically capable of greater operating flexibility, and that their plant controls are the limiting factor. And if computing power or capability is not the issue, expensive hardware retrofits are not the answer either.

ADEX is a novel control technology that runs in parallel with the plant DCS, not in the cloud. ADEX auto-adjusts its parameters in real time to any of the infinite possible operating contexts and can guide process variables through desired trajectories, with no big data or machine learning systems required. All this capability relies on a new multivariable ADEX adaptive predictive controller that expands the stable zone of process operation (Figure 2) to include new, challenging optimal scenarios.

2. Self-tuning AI extends the stable zone of plant operation. Courtesy: ADEX

An overview of how the ADEX controller accomplishes this impressive task is shown in the ADEX controller block diagram (Figure 3).

3. A descriptive block diagram of an adaptive predictive expert control system is illustrated. The ADEX controller employs a self-tuning mechanism that adjusts control parameters in real time to compensate for changes in the operating condition. In addition, the controller predicts the evolution of the process variables and drives them into a stable optimal operating state, thus achieving optimized control performance, without the need for additional plant instrumentation and control (I&C). Courtesy: ADEX

The methodological principles that govern ADEX controller operation and performance may be described in three parts.

1. Optimal Process Guidance describes the forward path of the ADEX controller block diagram that determines the controller operation (Figure 3). First, the process control signal is calculated by an adaptive predictive (AP) model to make the predicted process output equal to the desired output. Second, this desired output belongs to a future desired output trajectory produced by a Driver Block that goes from the actual process output to the setpoint. Finally, the Driver Block optimizes a process performance criterion that determines Optimal Process Guidance when there are no prediction errors.

2. Self-Tuning Artificial Intelligence (ST AI) is provided by an adaptive mechanism shown in the lower feedback path of Figure 3. From the process input-output data and the prediction error, it adjusts in real time the AP model parameters so that the square of the prediction error tends towards zero in the gradient direction. Thus, the process output trajectory will converge towards the desired output trajectory resulting in process stability and optimized control performance.

3. Expert Control is represented by the Expert Block on top of Figure 3. An ADEX controller is configured for different process operation domains based on the operator knowledge of process dynamics in these domains. The Expert Block introduces rules to make the controller configuration always the appropriate one for the current process domain of operation. Thus, prior operator knowledge on the process dynamics improves the closed-loop performance, robustness, and stability.

In sum, this ADEX implementation of AI does not require predefined operating points. It adjusts key operational variables by itself in real time to accommodate the application's time-varying context.
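For a concrete picture of these three parts, the sketch below implements a single-loop version of the general idea: an adaptive predictive (AP) model, a driver block that generates the desired trajectory, and a gradient step on the squared prediction error for self-tuning. This is not ADEX's proprietary implementation; the first-order model form, gains, and test process are invented purely for demonstration.

```python
import numpy as np

class AdaptivePredictiveController:
    """Single-loop sketch: AP model + driver block + gradient self-tuning."""

    def __init__(self, lr=0.05):
        self.theta = np.array([0.5, 0.5])   # AP model: y[k+1] ~ a*y[k] + b*u[k]
        self.lr = lr                        # adaptation gain (gradient step size)

    def predict(self, y, u):
        return self.theta[0] * y + self.theta[1] * u

    def control(self, y, setpoint, alpha=0.7):
        # Driver block: next point on a smooth desired trajectory toward the setpoint.
        y_desired = alpha * y + (1 - alpha) * setpoint
        a, b = self.theta
        # Pick u so the AP model's one-step prediction equals the desired output.
        return (y_desired - a * y) / max(b, 1e-3)

    def adapt(self, y_prev, u_prev, y_measured):
        # Self-tuning mechanism: move the model parameters in the gradient
        # direction that drives the squared prediction error toward zero.
        error = y_measured - self.predict(y_prev, u_prev)
        self.theta += self.lr * error * np.array([y_prev, u_prev])

# Toy closed loop against an unknown process whose dynamics change mid-run.
ctrl, y, setpoint = AdaptivePredictiveController(), 0.0, 1.0
for k in range(200):
    u = ctrl.control(y, setpoint)
    a_true = 0.8 if k < 100 else 0.6        # the "plant" drifts at step 100
    y_next = a_true * y + 0.3 * u           # true (unknown) process response
    ctrl.adapt(y, u, y_next)                # re-tune from measured data
    y = y_next
print("final output:", round(y, 3), "learned model:", np.round(ctrl.theta, 3))
```

In this toy run, the controller keeps the output near the setpoint before and after the mid-run change in process dynamics because the model parameters are continuously re-estimated from measured data, which is the behavior the three principles above describe.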

Improving Plant Operating Flexibility

The ADEX ST AI system identifies the time-varying process dynamics in real time, which enables the controlled process output variables to stay on target within an extended operational stable zone, where the thermodynamic, mechanical, and other design features of the plant enable greater operating flexibility. The mismatch between current fixed-parameter control systems and PID controllers and the time-varying plant dynamics can cause instability, such as oscillations of the process output variables, which shrinks the plant's operational stable zone and restricts the flexible operation that could otherwise be achieved. Thus, the limiting factor for greater flexibility is not the plant's design capabilities, but the existing control system.

ADEX does not require replacement of a plant's control system, addition of new I&C, "big data," or long-term machine learning processes. ADEX obtains its data through the plant local area network (LAN) and control network, plus minimal DCS reprogramming, and it can be turned on or off by control room operators at any time. Using the same inputs as the existing plant control system, the ADEX controller produces a different behavior at the actuator, which yields different process performance and different results.

Figure 4 illustrates the ADEX controller's performance when installed on the superheat control system of a conventional steam plant. The superheat (SH) steam temperature (green) is shown with its setpoint (red) during two hours of variable load operation. The left-side graph illustrates the existing APC's performance when controlling the SH spray valve position (orange). The lack of anticipation and the PID's reaction to the error allow the process to oscillate, with significant temperature peaks and drops. The right-side graph illustrates the performance of the ADEX system's adaptive predictive technology when switched on. Note that a different valve behavior stabilizes the SH steam temperature, even at minimum load. The following two case studies illustrate the application of the ADEX ST AI.

4. The superheat spray valve performance with APC (left) is compared to ADEX self-tuning AI (right). Courtesy: ADEX

Case Study 1: Adding Operating Flexibility to a Coal-fired Power Plant

An ENEL three-unit 1.1-GW lignite-burning plant was struggling to deliver the load-following performance required by the system operator. Based on PID controllers, the existing control system produced unacceptable steam pressure and temperature oscillations, preventing the plant from accurately following the system operator's required load profile (Figure 5). The top graph shows how the existing APC limited the ramp rate to 1.5 MW/minute due to oscillations in SH steam pressure (grey) with a maximum amplitude of 237 psi (1,650 kPa) and in SH steam temperature (green) of up to 30F (16.6C).

After ADEX ST AI was applied, the ramp rate doubled to 3 MW/minute as shown in the bottom graph of Figure 5. The load delivered (in black) followed dispatch orders (in red) 73% more accurately than before. Steam pressure and temperature excursions were eliminated, increasing the stability of both processes by 45% and 47%, respectively. Thermal stress on the steam generator was significantly reduced, and its capability to cycle was enhanced. The time from minimum load to full load on the plant was halved from 3.5 hours to 1 hour 45 minutes. This performance improvement allowed the plant to be dispatched more often in a load-following mode.

5. The overall effect on control of steam pressure and temperature during load-following operation is shown before ADEX ST AI was activated (top), and after (bottom). Courtesy: ENEL

Case Study 2: Boosting the Efficiency of a Combined Cycle Plant

The 786-MW Amorebieta 2×1 CCP, located in the north of Spain, is owned by Bizkaia Energia. The plant uses two General Electric MS9001 GA DLN 2.0+ combustion turbines (CTs) and an Alstom DKYZZ3-2N41 steam turbine. The plant operates in automatic generation control (AGC) mode, cycling more than 30% of load an average of 150 times per day. However, ramp rates and plant minimum load were limited by SH temperature excursions and subjected the plant to excessive thermal stress. The left-hand side of Figure 6 shows the plant has experienced peaks of +7C (+12.6F) and temperature drops of up to –30C (–54F) in SH steam temperature during cycling operation.

6. The illustration compares the superheat steam temperature control using APC (left) and under ADEX control (right). Courtesy: Bizkaia Energia

Bizkaia Energia developed a game plan to enhance the plant's dispatchability by increasing the plant's ramp rate and reducing its minimum load through successful control of the SH temperature oscillations. The right-hand side of Figure 6 shows how, since ADEX Self-tuning AI was applied, SH temperatures have been held within ±1.5C (±2.7F) of the setpoint 100% of the time, even under constant cycling operation.

With steam temperature excursions eliminated, the average steam temperature at the turbine inlet was raised by 4.8C (8.64F), which resulted in a heat rate improvement of 0.12%. The stability provided by ADEX optimizers also allowed an increase in the steam temperature setpoint of 2.1C (3.78F) for an additional heat rate improvement of 0.05%. Further, close control of SH temperature has reduced the induced thermal stress on the plant's superheater headers, increasing component life expectancy and plant availability.

Additionally, the plant has enhanced its dispatchability by increasing its ramp rate from 15 MW/minute to 24 MW/minute (a 60% improvement), and reducing minimum plant load from 270 MW to 205 MW (24%) in 1×1 operation. Ramp rate improvement will also be reflected in the plant's increased dispatch rate, particularly under AGC. The left-hand side of Figure 7 illustrates superheated steam temperature excursions that occurred during load changes under the previous plant control system operation. The right-hand side graph shows the operation of the plant after the ADEX optimizer was installed. The result was an increase in plant capacity delivered under AGC with SH temperature under close control.

7. With steam SH temperature controlled by ADEX, Bizkaia Energia's Amorebieta plant is much more responsive when dispatched under AGC. Courtesy: Bizkaia Energia

Those higher ramp rates and lower minimum loads also produced more demanding combustion conditions in the CTs, affecting flame stability, NOx formation, and combustion dynamics. ADEX Self-tuning AI solved those problems at Bizkaia Energia by optimizing the fuel gas heaters, controlling gas temperature five times more accurately than before while following the Wobbe Index setpoint and meeting the demanding performance conditions imposed by flexible plant operation.

Conclusions

ADEX Self-tuning AI allows under-performing plants to improve thermal efficiency, minimize thermal stress on equipment, reduce startup time, and increase operating flexibility, for both coal-fired and combined cycle plants. Flexible plants, particularly responsive plants with excellent ramp rates, high turndown, and good load-following capabilities under AGC, are especially valuable to a system operator. Such a plant also moves up in the dispatch order, which produces a greater economic return on the owner's investment.

Jose Martinez is CEO of ADEX Group, Isaias Martin-Hoyo is COO of ADEX USA, and Ravi Krishnan is managing director of Krishnan & Associates. This article was written for ADEX in cooperation with Krishnan & Associates, a specialized energy industry marketing firm.

How to Put the Power Grid to Work to Prevent Wildfires (Mon, 01 Mar 2021)

While not a new occurrence, in recent years wildfires have wreaked havoc across the western U.S. We have also seen Mother Nature batter coastlines and landscapes around the globe, and the ever-mounting pace of these natural disasters serves as a warning about our own level of preparedness. As of now, the argument can be made that utility operators, governments, and even consumers are not doing enough to mitigate these risks. However, the technology to more efficiently mitigate and manage the risks of climate change, while promoting a more sustainable economy, already exists. It just needs to be used more effectively to promote sustainability, safety, and ultimately bottom lines.

One area where this is particularly a challenge is our power grid, which is increasingly threatened by changing weather patterns and has become more strained over time to meet the needs of growing populations with new consumption patterns. A lack of efficient inspection capabilities and other outdated asset management techniques means that new risks arise across the grid–from substations to customers. Even the power lines themselves pose a risk, as the right mix of dry and windy conditions can quite literally spark fires.

This is top of mind for utilities, as the industry has witnessed spikes in company-imposed blackouts in wildfire-prone areas like California. While this practice has proven to be effective, it causes significant headaches to customers and is costly to conduct, leaving many to desire a more efficient way to manage the grid during these events.

Data Can Help Prevent Disaster

One way to limit the need for public safety power shutoffs is through the effective use of data. For example, utility companies often rely on manual or in-person line inspections, but given the sheer scale of activity across thousands of miles of utility lines and distribution substations, it can be tedious and challenging to conduct thorough examinations. In recent years, the industry has turned to technology for help.

This is where integrating Internet of Things (IoT) sensors, imaging, light detection and ranging (LiDAR), weather predictions, and other technologies into the grid helps operators identify disturbances or anomalies, analyze activity in the grid, and even receive real-time alerts regarding line interference. This is critical to stopping potential sparks or flames before they start. As this data is collected over time, the effective use of the information can reduce the risk of wildfires through the ability to take action when and where it's needed–quickly and even preemptively.
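As a simple illustration of the kind of real-time alerting described above, the sketch below flags sensor readings that deviate sharply from a rolling baseline. It is a generic example; the simulated readings, window length, and threshold are assumptions, not any utility's or vendor's production logic.

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=24, threshold=3.0):
    """Return indices where a reading deviates sharply from its recent baseline."""
    alerts = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            alerts.append(i)        # e.g. a vibration spike from line contact
    return alerts

# Hourly conductor-vibration samples with one simulated interference event.
samples = [0.9, 1.1, 1.0, 1.05, 0.95, 1.0] * 8 + [4.8] + [1.0] * 5
print("alert at sample index:", detect_anomalies(samples))
```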

However, these sensors, robots, and other monitoring tools generate mountains of data. While it can better inform future decisions, the scale and quality of data generated can be challenging to manage and effectively glean insights from. Thus, operators must look for ways to "liberate" this data from its silos in order to power data-driven insights that mitigate risk and protect employees, as well as consumers.

Liberate Data to Mitigate Risk and Boost Efficiency

The abundance of data at the disposal of utility companies and most organizations nowadays is unprecedented. The collection and storage of data has never been cheaper, but to get the maximum value out of data, and operate in a safer and smarter way, operators must be able to analyze and gather insights from the data before an impactful event occurs.

While gleaning insights from data may seem easy, it can be a major challenge when the data lacks context. That is why operators have struggled to this day, and also why utility companies and other heavy-asset industries are investing in data management. Through these investments, utility companies are working to make their organizational data more readily available and accessible to stakeholders. This allows all the data stemming from a utility company's operation to be processed in milliseconds, so decisions can be made faster and with sound, data-based reasoning, which is critical amid a natural disaster such as a wildfire, where every second counts and a misstep can result in catastrophe.

The liberation and contextualization of data to mitigate the risk of natural disasters is just one use case for utility companies. For example, large grid operators can use data to monitor the condition of distributed transformer networks. With data-driven insights, grid experts and employees can proactively identify risks (without manual inspections) and schedule maintenance–decreasing the likelihood of minor failures, as well as major failures.

The Technology Is Ready and Waiting

Humanity's continued fight against Mother Nature has gone on for millennia, but we are no longer completely powerless in mitigating the impact and destruction of wildfires and other natural disasters. The proper use of data is our best defense.

Collecting it is no longer enough; power companies must proactively fuse their data, making it easily and quickly available to all stakeholders. This technology exists today, and is already being implemented by leading U.S. operators and other forward-looking companies around the world. To survive not only the energy transition, but also the effects of climate change on operations, data must be put to work.

Francois Laborie, PhD, is president of Cognite North America.

Focus on Sensors (Mon, 01 Mar 2021)

Oxygen measurement with Ex approvals & SIL2 certification

SICK

The new Zirkor200 oxygen-measurement analyzer (photo) adds features for integration into safety-related process controls. The user-friendly, extremely rugged and precise zirconium-dioxide analyzers are not only available for gas explosion-hazardous areas (Zirkor200 Ex-G), but for use in dust explosive atmospheres (Zirkor200 Ex-D) as well. The Zirkor200 now also features SIL2 certification for integration into safety-related process controls. Both explosion-proof variants are approved in accordance with ATEX and IECEx. The Zirkor200 Ex-G for Zone 1 works well primarily in the chemicals, petrochemicals, refineries and oil-and-gas industries. With the Zirkor200 Ex-D for Zone 21, the focus is on applications in the cement and power-generation industries, and in the fields of waste-processing and recycling. In the majority of these industries, the Zirkor200 with SIL2 option also enables safety-relevant measurements with only one system (1oo1; one out of one). The analyzers of the Zirkor200 series handle process-gas temperatures up to 1,600°C. – SICK Inc., Minneapolis, Minn.

www.sick.com

Free firmware upgrade adds MQTT IIoT support (Mon, 01 Mar 2021)

IDEC

This company has released a free firmware upgrade enabling new and existing MicroSmart FC6A Plus PLC CPUs (photo) to support the industry-standard MQTT protocol. The upgrade can be downloaded to the FC6A CPU, so it is easy for users to connect all types of field data to on-site and cloud-based brokers, and make the information readily available for users and analytical applications. Users can also send commands to the FC6A using MQTT. MQTT has emerged as the preferred industrial internet of things (IIoT) communications protocol because it uses a lightweight and efficient publish/subscribe methodology for secure messaging between devices and centralized brokers, making information easily available for all authorized applications. A large number of clients can publish data to the broker, subscribe to any broker data, or do both bi-directionally. – IDEC Corp., Sunnyvale, Calif.
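For readers unfamiliar with the pattern, the short sketch below shows the generic MQTT publish/subscribe flow described above using the open-source paho-mqtt Python client. The broker address, topic names, and payload fields are placeholders for illustration; this is not IDEC product code or the FC6A's actual configuration.

```python
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"           # on-site or cloud broker (placeholder)
DATA_TOPIC = "plant/line1/fc6a/levels"  # hypothetical topic naming scheme
CMD_TOPIC = "plant/line1/fc6a/commands"

def on_message(client, userdata, msg):
    # Commands published by other authorized clients arrive here (two-way messaging).
    print("command received:", msg.topic, msg.payload.decode())

client = mqtt.Client()                  # paho-mqtt 1.x-style constructor
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.subscribe(CMD_TOPIC, qos=1)

# Publish a field reading; any subscriber authorized for the topic receives it.
payload = json.dumps({"tank_level_pct": 72.4, "pump_running": True})
client.publish(DATA_TOPIC, payload, qos=1)

client.loop_forever()                   # service the network loop and callbacks
```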

www.idec.com/usa

How to Optimize Online Meeting Communication and Collaboration (Tue, 23 Feb 2021)

Are we prepared for what the future of work holds? According to a recent McKinsey report, that answer will depend on our ability to adapt to a hybrid workforce.

The report, which analyzed more than 2,000 activities across 800 occupations, found that hybrid models of remote work will accelerate and continue after the COVID-19 pandemic ends. Remote work will primarily be performed by a highly skilled, highly educated workforce in a few key industries. Within the utilities sector, an estimated 31% to 37% of employees have the effective potential to work remotely without diminishing productivity.

While the pandemic quickly unveiled the benefits of remote work, companies face ongoing challenges of workforce configuration, remote coaching, and collaborating from a distance. To be successful, these new models of work will require effective digital communication and collaboration at a distance. One element that offers space for significant improvement in productivity is how employees plan and conduct online meetings. In the 2020 State of Online Meetings Report by consulting firm Interaction Associates, nearly 46% of employees reported their online meetings are seldom or never effective. Unproductive meetings are often caused by unclear processes, poor attendance or engagement, and lack of explicit agreement building.

Today, energy sector companies have an opportunity to optimize employees' time, maintain productivity at a distance, and develop cultures where hybrid models can work. The prize? A boost in outcomes and employee retention, especially among millennials. This is especially important in the energy sector where 25% of workers are eligible to retire within the next five years and 50% are eligible in the next 10 years. To make hybrid arrangements work, here are three levers to pull to begin to improve productivity in online meetings.

Define Clear Meeting Objectives

For many employees, scheduling and attending online meetings is often done without detailed planning and purpose. Standing meetings to "review progress" or "connect" are common, as remote employees don't have the advantage of being in the same room. While the technology to support online meetings is widely used (78% of respondents indicated their company meetings were conducted via an online meeting platform), many employees are experiencing meeting overload. Without clear guidelines and planning, meeting invitations can quickly appear on calendars and drag down productivity. To get focused, ensure that every meeting has a defined purpose. Writing "Desired Outcome Statements" is a simple but powerful way to answer the question, "What will we leave this meeting with?"

Clarify Roles and Participation

Hybrid work models can be attractive due to reduced operational costs, higher employee productivity, and a greater selection of job candidates. However, remote work can also be frustrating if there is a lack of guidelines for how people are involved in conversations, decisions, and meetings. It's easy for meeting attendance to quickly swell in group size. To ensure there is high engagement and participation, the meeting leader should ensure that clear meeting roles exist (Figure 1). If necessary, assign others to take on a role (such as scribe or timekeeper).

1. To ensure there is high engagement and participation, the meeting leader should ensure that clear meeting roles exist. Courtesy: Interaction Associates

As affirmed in Incident Prevention, safety within utility operations is especially critical, and poor decisions are often formed by employee habits, which can quickly lead to errors. Similar to preventing safety incidents, clear procedures and employee habits in clarifying roles within meetings can boost communication and collaboration.

Capture and Articulate Action

In the 2020 State of Online Meetings Report, one key element stood out in translating conversation into results and achievement: having someone in the meeting capture key ideas and action items on a shared screen, and giving everyone access to the notes after the meeting.

If someone captured key ideas half the time, only 49% of respondents left the meeting with a clear understanding of decisions and action items. For those situations where key ideas were always captured, 89% of respondents left the meeting with clarity on next-step actions. If the scribe shared the key ideas with the group after the meeting, 92% of people completed action items on time. For those meetings where notes were seldom or never shared, only 35% of people completed action items. Capturing specifics and articulating desired actions moves work forward.

In summary, online meetings will continue to be a major component of how employees connect, deliberate, plan projects, and drive accountability. Implementing a skilled meeting culture where employees communicate clear meeting objectives, roles, and capture key ideas and decisions helps to propel initiatives–and companies–forward.

Chris Williams is the director of Business Operations for Interaction Associates, serving the top companies in the energy industry. Interaction Associates is best known for introducing the concept and practice of group facilitation to the business world in the early 1970s. For more than 50 years, IA has provided thousands of leaders and teams with practical, simple, and effective programs, tools, and techniques for leading, meeting, and working better across functions, viewpoints, and geographies.

Safety and Digitalization Big Parts of Sustainability (Wed, 17 Feb 2021)

Company leaders around the globe are more focused than ever on sustainability. The trend has been driven not only by an innate human desire to "do the right thing," but also because investors and environmentally conscious consumers are demanding that companies evaluate how their operations are affecting the world and make positive changes to reduce unwanted impacts.

"This is just by far the hottest discussion point at the senior management or executive level at energy companies, as well as chemical companies. They're investing as fast as possible to become ‘sustainable.' There are several reasons, but the biggest is investment pressure and stock price pressure," said Ron Beck, market strategy director at Aspen Technology Inc., a provider of enterprise asset performance management, monitoring, and optimization solutions.

AspenTech recently commissioned a study on sustainability, which was conducted by ARC Advisory Group. The analysts surveyed more than 200 energy and chemical industry professionals from around the world and found that 90% of their companies have sustainability initiatives in place. The researchers noted, in general, energy companies tended to focus on the transition to a lower-carbon future, while chemical companies often place more emphasis on producing sustainable products. Regardless of the objective, however, Beck said improving energy efficiency can have the biggest impact on most companies' sustainability metrics. "If you reduce your energy use, you increase your profitability," he said. "It's a net-net positive."

Somewhat surprisingly, when ranking sustainability goals, survey respondents said improving operational safety was their top priority, above such things as implementing net carbon reductions, reducing emissions, adopting cleaner energy sources, and reducing waste and water use. The report says, "Process safety incidents impact environment, personnel, brand, profitability, and the ability to grow a company." The study suggests leaders know very well that one major safety incident can limit a company's license to operate, and therefore, they consider safety of utmost importance to sustainability.

Another interesting finding from the study was that 75% of respondents said digital transformation is extremely important or very important for achieving sustainability goals. ARC's analysts said digital technology that "augments people and knowledge" is already actively being used "to improve both business and environmental performance and sustainability."

"One of the objectives of digitalization is to measure, and measure in real time," Beck told POWER. "Instead of taking all year to put together your annual sustainability report, if you actually want to take actions to avoid unpleasant surprises at the end of the year, executives need the data now. ‘Why is this plant performing worse than that one? What is it doing wrong? Let's investigate or let's send the team that's doing the best to teach everybody else what they're doing.' I mean, it's sometimes as simple as that," he said.

So, what's holding back sustainability initiatives? The majority of survey respondents identified a "lack of capital or resources" and "aging assets" as top barriers to meeting sustainability goals. The report says: "While most companies have a chief financial officer, chief legal officer, and other traditional leadership roles; there is not usually a chief sustainability officer. Since sustainability decisions are made by whoever controls the asset, those decisions are often made without considering the secondary effects of climate emissions. Furthermore, even at those companies with sustainability leadership, there is often a disconnect between the sustainability leadership and operational roles such as plant operations, engineering, or supply chain."

ARC recommended three actions companies could take to more effectively meet their sustainability objectives. First, rethink short-term goals in light of COVID-19 disruptions. However, it said leaders must not let the pandemic distract them from longer-term business and sustainability objectives. Secondly, make targeted investments in supply chain and process optimization technologies, as well as in predictive and prescriptive analytics. Lastly, invest in people and develop a culture of waste elimination, with the ultimate goal of efficiently delivering environmentally friendly products safely. That's where true sustainability resides.

Aaron Larson is POWER's executive editor.

Swarm CEO Sara Spangelo Sets Disruptive Pricing on New Satellite IoT Service (Wed, 10 Feb 2021)


Swarm co-founders Ben Longmier and Sara Spangelo holding the Swarm Tile, and a Swarm satellite. Photo: Swarm

Swarm's Internet of Things (IoT) network that connects sandwich-sized satellites to tiny, handheld hardware is now live, offering remote connectivity at the market-disrupting price of $5 per month per device. 

Swarm, which was founded in 2017 and garnered media attention in 2018 for launching satellites without FCC approval, is now an operational player in the satellite IoT market. The company announced its commercial availability on Tuesday, after it launched 36 satellites on the recent SpaceX rideshare mission. The company now has 81 satellites on orbit, 72 of which are commercial. 

The company is vertically integrated, and built and designed hardware, software, and protocols for its satellites and the user modem it calls the Swarm Tile. The Tile costs $119 per device and is a modem that can be embedded into any IoT device that operates in a remote location. 

The Tile communicates with satellites in Low-Earth Orbit (LEO) that weigh just 400 grams. Satellites pass the data to ground stations, pulling customer data into an API where it can be easily accessed. It's a two-way network, allowing a customer to send a command to the sensor. At this point, satellites pass over Swarm's ground stations about every three hours. When the company has launched its full constellation of 150 satellites, it will allow data to be transmitted at any time. 
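As a purely hypothetical illustration of that last step, the snippet below polls a stand-in REST endpoint for queued device messages. The URL, authentication scheme, and field names are invented for demonstration and are not Swarm's actual API.

```python
import requests

API_BASE = "https://api.example-satellite-iot.com/v1"   # placeholder URL
TOKEN = "YOUR_API_TOKEN"                                 # placeholder credential

def fetch_messages(device_id):
    """Return the latest messages for one device (assumed list-of-dicts response)."""
    resp = requests.get(
        f"{API_BASE}/messages",
        params={"deviceId": device_id, "count": 50},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

for msg in fetch_messages("tile-0001"):
    print(msg.get("receivedAt"), msg.get("payload"))
```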

Sara Spangelo, CEO and co-founder, told Via Satellite the Swarm pricing model is four to 20 times cheaper than similar satellite offerings today, and Swarm can transmit customer data more often because of its large number of satellites. Spangelo said this pricing allows customers who may pay $10,000 each month to another provider to spend $500 per month with Swarm. The company's goal with the pricing is to bring in new customers who haven't been able to afford remote IoT connectivity before. 

"For small companies that are bootstrapped, doing low-margin things like agriculture and logistics, this is an incredible benefit to their businesses – how they operate their price points, the risks they can take, the markets they can enter," Spangelo said. "It also is at the point where people that have only previously used cell [connectivity], but want to expand their business beyond where cellular coverage is, can start to think about it as affordable." 

The company has customers ranging from small and medium-sized businesses to Ford-owned Autonomic, which has a connected vehicle infrastructure called the Transportation Mobility Cloud. Another customer, SweetSense, which offers remote water and energy monitoring solutions, has a partnership with Swarm, whose pricing allows SweetSense to monitor more water supplies in Africa. 

Spangelo said Swarm has the highest demand in agriculture for water monitoring applications, and in logistics for tracking trucks and maritime assets. She also reports demand in energy monitoring, environmental tracking, and government uses, but emphasizes that the service is "vertical agnostic." 

"It’s not a moisture sensor or an asset tracker. It’s a modem, or a connectivity device that can be integrated or plugged into any sort of device," Spangelo said. "We do that on purpose. We have ideas of the verticals and use cases, but we’ve been very surprised what people come around with, ones we would have never expected."

Bayshore Networks and GE Digital Expand Partnership to Secure Industrial and Critical Infrastructure Networks (Mon, 08 Feb 2021)

GE Digital’s OpShield technology to be integrated into Bayshore Networks’ solutions

DURHAM, N.C., Feb. 8, 2021 /PRNewswire/ — Bayshore Networks and GE Digital today announced an expansion to their partnership to integrate their solutions to address the growing need to secure industrial and critical infrastructure networks. GE Digital’s OpShield technology will be integrated into Bayshore Networks’ advanced solutions providing sophisticated industrial cybersecurity and active prevention/protection for industrial equipment, including programmable logic controllers (PLCs), human machine interface (HMIs), and engineering workstations.

“We’re pleased to announce another way we will support organizations who need to protect operational technology (OT) environments, industrial processes, and plant operations,” said Steve Pavlosky, Director, Digital Product Management at GE Digital. “Being at the heart of an operation’s data visualization, control, and reporting, it is critically important to ensure companies are taking steps to protect this key element to their operations. The combination of Bayshore’s In-depth Policy Engine with GE Digital’s OpShield Management Console and Advanced Protocol technology addresses the fact that while companies may have threat analytics or detection solutions as part of a Cyber Security triad, they must have advanced prevention capabilities.”

GE Digital began working with Bayshore in 2019 to bring cybersecurity support to GE Proficy installations. With this extended partnership, Bayshore and GE Digital look forward to providing customers in all industries with software that includes Bayshore Networks’ advanced cybersecurity technology with GE Digital’s OpShield capabilities.

“Bayshore is tremendously excited to see the relationship with GE Digital expand to combine our joint technologies with the goal of launching OpShield NextGeneration as the premier detection/active prevention solution for the entire industrial marketplace as we jointly work to secure the world’s industrial and critical infrastructure networks,” said Kevin Senator, CEO of Bayshore Networks. “Together, we will support existing GE Digital customers as well as new customers with technology to protect their OT endpoints and networks from ever-changing and increasing cyber threats as well as advancing this combined technology to a broad range of control products from a variety of vendors. Bayshore’s advanced technology brings a whole new level of safety and resilience within the reach and control of plant operations everywhere regardless of PLC brand in use.”

This partnership combines GE Digital’s OpShield security technology with Bayshore’s Deep Content Inspection and Advanced Policy Learning and Enforcement, enabling Bayshore to create an integrated product line, to be called OpShield NextGeneration. Bayshore will be the exclusive provider of this combined technology to customers worldwide through GE Digital, Bayshore, and other sales channels. OpShield NextGeneration can protect most HMI and supervisory control and data acquisition (SCADA) systems from unauthorized and potentially high-risk or dangerous network activity such as unscheduled configuration changes, unscheduled maintenance events, indicators of reconnaissance and surveillance, Denial of Service (DoS) attacks, network spoofing and piggybacking.
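As a generic illustration of allowlist-style "active prevention" (not OpShield or Bayshore code), the sketch below forwards only pre-approved source/function pairs and blocks everything else, such as an unscheduled configuration write from an unapproved workstation. The addresses and protocol function names are simplified assumptions.

```python
# Pre-approved (source IP, protocol function) pairs for this network segment.
ALLOWED = {
    ("10.0.1.15", "read_holding_registers"),   # HMI polling a PLC
    ("10.0.1.20", "read_coils"),               # historian, read-only access
}

def enforce(packet):
    """Return True to forward the request, False to block and log it."""
    key = (packet["src_ip"], packet["function"])
    if key in ALLOWED:
        return True
    print(f"BLOCKED: {packet['src_ip']} attempted {packet['function']}")
    return False

# A configuration write from an unapproved workstation is dropped.
enforce({"src_ip": "10.0.9.99", "function": "write_multiple_registers"})
```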

“Industrial companies will now usually agree that they have hosts and applications which are no longer separated, or ‘air-gapped,’ for safe, isolated operations from the rest of the company or from outsiders and the internet,” said Sid Snitkin, Vice President Cybersecurity Advisory Services, ARC Advisory. “These types of systems are susceptible to certain OT network attacks. And with the influence of the pandemic, the industrial attack surface and the resulting cyber risk just continue to increase.”

“Bayshore understands industrial protocols and can easily retrofit into existing network deployments without having to change existing infrastructure, security practices, or even configuration changes to the equipment,” said Kevin Senator.

Bayshore Networks will begin offering the current OpShield product line to customers in late Q1, with an intended launch of OpShield NextGeneration in 2021 that covers most major PLC vendors with leading-edge active protection.

About Bayshore Networks
Bayshore Networks is the leading provider of active industrial cybersecurity protection solutions specifically designed for OT environments, automation engineers, and plant operators. The company created OTfuse®, NetWall™ and OTaccess™ to address the digital and physical security risks which can compromise the safety and availability of OT environments. Their solutions securely protect ICS systems, SCADA, industrial applications, networks, machines, and workers from cyber threats. Bayshore Networks is backed by ForgePoint Capital, Benhamou Global Ventures and Bayshore technology is in use by GE Digital, Kimberly Clark, AT&T, and companies in process manufacturing industries such as oil & gas, chemical, and water utilities, districts, and wastewater treatment sites.

For more information visit us at: www.bayshorenetworks.com.

About GE Digital
GE Digital transforms how our customers solve their toughest challenges by putting industrial data to work. Our mission is to bring simplicity, speed, and scale to digital transformation activities, with industrial software that delivers breakthrough business outcomes. GE Digital’s product portfolio – including grid optimization and analytics, asset and operations performance management, and manufacturing operations and automation – helps industrial companies in the utility, power generation, oil & gas, aviation, and manufacturing sectors change the way industry works. For more information, visit www.ge.com/digital.

SOURCE Bayshore Networks


The POWER Interview: AI, Big Data, and Efficiency
https://www.iiotconnection.com/the-power-interview-ai-big-data-and-efficiency/
February 7, 2021


The increased use of artificial intelligence (AI) and machine learning (ML) in the power generation sector is aimed at making electricity production both more efficient and more secure. Developing ways to more quickly analyze ever-larger amounts of data is driving innovation among the people responsible for the operation of power plants and generation equipment.

Many companies are involved in the research and development of technologies to support AI and ML. Beyond Limits, a California company launched in 2014, works in several markets, from energy–including the power generation and oil and gas exploration sectors–to manufacturing and industrial, as well as healthcare. The company says, “Our mission is to create automated solutions with human-like powers of reasoning that amplify the talents and capabilities of people. We specialize in complex challenges in extreme environments.”

[Photo: Stephen Kwan]

Stephen Kwan, the company’s director of product management for Power Generation/Grid Management, provided POWER with his insight into AI and ML as they apply to the electricity sector, with a look at what the future holds as the power generation landscape continues to change.

POWER: How can the power generation industry benefit from AI and machine learning?

Kwan: Machine learning (ML) and advanced artificial intelligence (AI) solutions are currently powering the development of smart applications that can make accurate decisions autonomously, based on learned historical data. Such technologies are being used extensively by the power generation industry to develop customer-centric solutions that understand how customer needs evolve, leveraging essential data such as historical knowledge, expertise, and best practices to make automatic recommendations when decisions about challenging scenarios have to be made in real time.

Globally, experienced power generation operators represent an ever-shrinking workforce whose deep domain know-how and expertise are critical to efficient, safe, and reliable operations. This subject matter expertise needs to be captured, digitized, and made accessible across the workforce in order to ensure the long-term continuity, efficiency, and reliability of operations across the power generation domain. Novel hybrid AI/ML approaches represent a necessary avenue for combining the value of data and domain knowledge to tackle global challenges facing the power generation industry.

POWER: What about the impact of AI and ML on distributed power generation?

Kwan: AI/ML approaches will add significant value to the ever-growing deployment of large-scale, geographically distributed energy resources. The optimal management of trade-offs between meeting demand, ensuring low risk and reliable operations, and maintaining the integrity of key assets will require AI/ML systems capable of building and supporting holistic models of specific power planning and scheduling, generation, and distribution processes.

Advanced AI/ML solutions can significantly improve the management of the integrity and health of key large-scale assets spanning the generation (e.g., turbines) and distribution (e.g., grids, smart meters) domains. This can be achieved by combining predictive models, both supervised (learning to recognize and predict specific events and patterns) and unsupervised (learning to detect anomalies and potential events of interest), that learn from a wealth of historical data on critical assets and their operations. Such systems are educated and informed by relevant domain expertise captured and digitized from skilled operators, engineers, and decision-makers.
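
As a rough illustration of the supervised-plus-unsupervised pairing Kwan describes, the Python sketch below trains a classifier on labeled historical events alongside an anomaly detector on the same sensor history. The feature names, synthetic data, and thresholds are illustrative assumptions, not part of any specific vendor's product.

    # Illustrative sketch: pairing a supervised event classifier with an
    # unsupervised anomaly detector on historical asset sensor data.
    # Column meanings and the synthetic data below are hypothetical.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier, IsolationForest

    rng = np.random.default_rng(42)

    # Hypothetical historical features: vibration, bearing temperature, load factor
    X_hist = rng.normal(size=(5000, 3))
    # Hypothetical labels: 1 where a known event (e.g., a trip) followed
    y_hist = (X_hist[:, 0] + X_hist[:, 1] > 2.5).astype(int)

    # Supervised model: learns to recognize labeled event patterns
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_hist, y_hist)

    # Unsupervised model: flags readings that deviate from normal operation
    iso = IsolationForest(contamination=0.01, random_state=0)
    iso.fit(X_hist)

    # Score a new batch of readings
    X_new = rng.normal(size=(10, 3))
    event_prob = clf.predict_proba(X_new)[:, 1]   # probability of a known event
    anomaly_flag = iso.predict(X_new)             # -1 = anomalous, 1 = normal

    for p, a in zip(event_prob, anomaly_flag):
        print(f"event probability={p:.2f}  anomalous={'yes' if a == -1 else 'no'}")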

Asset health and performance are constantly changing as time progresses. In order to maximize profits, it’s important to continually learn from, and adapt to, these changes. AI solutions can provide a dynamic assessment of key parameters, in combination with leveraged domain expertise, leading to better operation of valuable power generation assets. Power generation facilities are often optimized to run within a specific operating envelope, and most operators are trained to support those scenarios. With the proliferation of various energy resources, such as solar and wind, operators must adapt to running such equipment outside traditional operating ranges. AI can take into consideration the interactions of these unconventional operating set points, in combination with subject-matter expertise, to operate the assets more effectively and with lower risk.

POWER: Can machines used in power generation learn from their experiences? For example, could a machine perform more efficiently over time based on past experience?

Kwan: Advanced artificial intelligence solutions can empower software applications to help the power generation industry better analyze large data sets and identify patterns. Such AI systems continually monitor entire processes and learn over time based on historical experiences and data, helping power generation facilities better detect anomalies and make more precise predictions, thus improving overall operational efficiency.

Advanced AI approaches have the potential to revolutionize the management of critical assets across the power generation domain, combining historical and real-time operational datasets with deep subject matter expertise to provide recommendations on asset health management to maximize the efficiency, reliability, and safety of operations while meeting power generation demand and financial objectives.

An example would be leveraging AI solutions to help minimize unexpected system trips and other risks to asset health, integrity, and longevity. Many variables and factors spanning both historical and real-time operations, such as planning and scheduling targets, equipment constraints, and maintenance, must be monitored holistically in order to remediate and mitigate potential issues in a timely manner, reducing risks to generated power output targets, personnel, assets, and reliability of operations.

AI/ML solutions, trained with and exposed to data from both normal operations and, when applicable, circumstances representing anomalies or abnormal operations, can extract relevant patterns across historical and current data, predicting and detecting potential system trips and quantifying the risk of unexpected downtime. By digitizing relevant domain knowledge, such advanced solutions can then generate explainable and actionable recommendations to help mitigate and remediate potential risks identified by data-driven models targeting critical equipment and assets.
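
To make "quantifying the risk of unexpected downtime" concrete, here is a minimal, hypothetical sketch that turns a model's trip probability into an expected-loss figure and a simple recommendation. Every number in it (probability, outage hours, costs, threshold) is an assumption for illustration only.

    # Hypothetical sketch: turning a model's trip probability into a quantified
    # downtime risk and a simple, explainable recommendation.
    TRIP_PROBABILITY = 0.18          # e.g., output of a trained classifier (assumed)
    EXPECTED_OUTAGE_HOURS = 36       # typical restart time for this unit (assumed)
    LOST_MARGIN_PER_HOUR = 12_000    # $/h of lost generation margin (assumed)
    ACTION_THRESHOLD = 100_000       # $ of expected loss that triggers action (assumed)

    expected_loss = TRIP_PROBABILITY * EXPECTED_OUTAGE_HOURS * LOST_MARGIN_PER_HOUR

    if expected_loss > ACTION_THRESHOLD:
        print(f"Expected loss ${expected_loss:,.0f}: schedule inspection of the "
              "flagged subsystem before the next dispatch window.")
    else:
        print(f"Expected loss ${expected_loss:,.0f}: continue monitoring.")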

POWER: Are there specific challenges for power plant operators that AI and ML can help solve?

Kwan: A challenging scenario could be that a particular power generation facility may need to take drastic action if another facility in the grid goes down. In such an instance, machine learning and other AI approaches can be used to predict how far a facility can ramp up in support of the facility experiencing the loss. For example, if a power plant goes down due to a trip, the remaining generation facilities participating in the grid would need to make up the loss of many megawatt-hours. In that instance, it’s critical to know the maximum each facility can produce to make up for the loss.
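
One hedged way to sketch this ramp-up estimate in code is to regress historically achieved peak output on operating conditions and subtract current output to get the available headroom. The features, synthetic data, and numbers below are hypothetical and only illustrate the idea.

    # Illustrative sketch: estimating how far a unit can ramp up when another
    # facility trips, by regressing achievable peak output on operating conditions.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(7)

    # Hypothetical features: ambient temp (C), fuel quality index, hours since maintenance
    X = rng.uniform([0, 0.8, 0], [40, 1.2, 4000], size=(2000, 3))
    # Hypothetical target: historically achieved peak output (MW)
    y = 480 - 1.5 * X[:, 0] + 60 * X[:, 1] - 0.01 * X[:, 2] + rng.normal(0, 5, 2000)

    model = GradientBoostingRegressor().fit(X, y)

    current_conditions = np.array([[32.0, 0.95, 2500.0]])
    current_output_mw = 410.0
    predicted_peak_mw = model.predict(current_conditions)[0]
    print(f"Estimated ramp-up headroom: {predicted_peak_mw - current_output_mw:.0f} MW")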

Machine learning and other AI approaches continually learn from historical operating parameters and conditions, decreasing the odds of faltering in the execution necessary to achieve specific targets and goals. This fact remains true under all operating conditions regardless of the skillset, experience, or knowledge base of any one operator. In addition, such solutions digitalize and streamline how essential knowledge is passed between operators thereby decreasing misinterpretation or miscommunication.

POWER: There are certainly many uses for AI and ML in the power industry; what are some of the major ways the technology can support operations?

Kwan: Artificial intelligence can advance entire operations in the power industry by improving infrastructure and asset monitoring, power trading, and outage supervision, prediction, and planning. AI systems can also help better forecast scenarios of various natures including load and power generation. Such unparalleled insights could yield greater potential for power trading and increased risk mitigation.

Outside of uptime optimization for power generation equipment and facilities (e.g. turbines, distributed energy resources, etc.), advanced AI solutions also have the capacity to digitalize and democratize domain knowledge and operational know-how captured from skilled workforces to help solve the challenge of a shrinking global workforce. Improved asset integrity management, predictive maintenance, and anomaly detection with recommendations for remediation/mitigation lead to maximization of operational efficiency against risk, reliability, demand planning, and other similar constraints. This is accomplished via optimized placement and selection of sensors distributed across critical assets to maximize the accuracy of forecasting and detecting operational anomalies.

The issue of distributed energy resources can also be impacted by artificial intelligence. Optimizing operations in a safe and reliable manner without negatively affecting asset health and longevity (e.g., faster start-up and shutdown of gas turbines) can be the basis for minimizing cost, maximizing profit, and ensuring stability for grid equipment health and lifespan. Advanced AI software can act as a virtual sensor, inferring what a measurement would be in the absence of a real sensor or for a property that cannot be measured directly, thus providing support for better assessment and prediction of electric grid stability and potential brownouts or blackouts during urban planning processes.
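
A minimal soft-sensor sketch of the "virtual sensor" idea might look like the following, where an unmeasured quantity is regressed from nearby instrumented signals. The signal names and synthetic data are assumptions made for illustration.

    # Illustrative soft-sensor sketch: inferring an unmeasured quantity from
    # surrounding instrumented signals. All signals and data are hypothetical.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical measured signals: voltage, frequency deviation, line loading
    X = rng.normal(size=(3000, 3))
    # Hypothetical target that lacks a physical sensor, e.g., a stability margin
    y = 2.0 - 0.8 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 0.1, 3000)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
    soft_sensor = Ridge(alpha=1.0).fit(X_train, y_train)

    print(f"Held-out R^2: {soft_sensor.score(X_test, y_test):.3f}")
    print(f"Inferred value for latest reading: {soft_sensor.predict(X_test[:1])[0]:.2f}")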

POWER: How can the trend of decentralized power generation benefit from AI?

Kwan: The global energy market is transforming from highly centralized power distribution systems to more complex networks of decentralized systems, and the industry is having trouble integrating distributed energy resources and renewables during this transitional period. Intelligent asset management strategies and notification systems provided by AI solutions can increase grid reliability and provide better maintenance for an aging infrastructure (transmission and distribution systems).

The variability and uncertainty associated with decentralized systems further complicate the management of critical assets as opposed to smaller-scale, localized power generation facilities. AI/ML approaches will play a significant role in augmenting, monitoring, and managing capabilities across decentralized resources, leveraging a wealth of current and historical data streams while incorporating the deep industry know-how required to ensure the adoption and deployment of such AI-based systems.

Decentralized systems often operate in silos, affecting their ability to detect and account for perturbations that can be introduced by others. AI provides the basis to understand the effects of individual actions on the stability and reliability of the grid and overall generation capabilities.

POWER: How important is AI to smart grids? How important is AI to the integration of e-mobility (electric vehicles, etc.) to the grid?

Kwan: Artificial intelligence is very important to smart grids as they consist of smart meters/appliances, renewable and other energy-efficient resources, and a large number of other devices. These devices provide a gold mine of data that only intelligent systems can decipher and convert into valuable insights and forecasts. It is also extremely difficult to model those devices because it’s challenging to understand the effects of changes on the overall availability and reliability of the grid. Advanced AI solutions can provide the ability to model and understand changes that can affect the operations of the electric grid–such as the effect of increasing EV usage, for example.

POWER: How can AI be used in power trading, with regard to forecasts, etc.?

Kwan: Artificial intelligence solutions such as supervised learning models can predict the behavior of creditors or consumers with a high degree of accuracy. Algorithmic trading leverages reinforcement learning to reward or penalize trading bots based on how much money is made. Safeguards such as stop-loss rules must be put in place, of course; this is comparable to a self-driving car’s safety mechanism that prevents collisions with pedestrians or other vehicles. Advanced AI can reduce risks from current events and their impacts on the market, as models can be retrained with new market conditions front of mind. While human intervention and careful monitoring of deployed algorithms will still be required for the foreseeable future, the support of AI has the potential to help the industry ramp up technological advancements at a more accelerated pace.
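
The stop-loss safeguard Kwan mentions can be sketched as a thin wrapper that overrides whatever the trading model decides once a loss limit is breached. The threshold, prices, and the placeholder policy_decision input below are hypothetical.

    # Minimal sketch of a stop-loss safeguard wrapped around an automated
    # trading decision. 'policy_decision' stands in for whatever model drives the bot.
    STOP_LOSS_FRACTION = 0.05   # liquidate if the position loses 5% (assumed)

    def apply_stop_loss(entry_price, current_price, policy_decision):
        """Override the model's decision if the loss limit is breached."""
        loss = (entry_price - current_price) / entry_price
        if loss >= STOP_LOSS_FRACTION:
            return "CLOSE_POSITION"   # hard safeguard, regardless of the model
        return policy_decision

    # Example: the model wants to hold, but the price has fallen 6%
    print(apply_stop_loss(entry_price=50.0, current_price=47.0, policy_decision="HOLD"))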

The accuracy of electricity demand forecasts has a huge impact on power trading. Inaccurate forecasts can lead to losses for traders and (most importantly) affect power availability. For example, if forecasts are wrong and power generation units are sitting idle while power shortfalls exist in the grid, this can lead to very costly and disruptive mitigations. AI solutions can support complex forecasting models with a large number of inputs and simple updating capabilities to reflect dynamic changes, providing more accurate near real-time forecasts.
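
As a simple, hedged example of a demand-forecasting model with easily updated inputs, the sketch below fits a lag-feature regression on a synthetic hourly load series. A real deployment would use richer features (weather, calendar effects) and far more careful validation.

    # Illustrative sketch: a lag-feature regression for short-term load
    # forecasting. The synthetic hourly load series below is hypothetical.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(3)
    hours = np.arange(24 * 90)  # ~90 days of hourly data
    load = 1000 + 200 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 20, hours.size)

    # Features: load 24 h ago and 168 h ago; target: current load
    X = np.column_stack([load[144:-24], load[:-168]])
    y = load[168:]

    model = LinearRegression().fit(X, y)
    next_hour_features = np.array([[load[-24], load[-168]]])
    print(f"Forecast for the next hour: {model.predict(next_hour_features)[0]:.0f} MW")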

POWER: How important is AI to the design and "construction" of virtual power plants?

Kwan: A virtual power plant is a clustering of distributed, heterogeneous generation units and an integral part of the “Internet of Energy” (improved monitoring and control of the smart grid). Intelligent technological solutions must be applied before the plant construction phase to ensure the design meets the intended supply generation needs and to identify potential problems, the projected location and cause of those problems, and probable actions to minimize adverse impacts. Intelligent electronic devices are used to control the flow of power and operate equipment while making local decisions. AI solutions can enable the analysis and interpretation of “what-if” scenarios enabled by the design and construction of virtual power plants, enhancing learning capabilities that can ultimately be transferred to real-world operations in the form of predictive models and decision-making advisors. AI models can supplement first-principles-based models when developing a virtual power plant, especially in cases where first-principles methods are too slow or too complex.

POWER: Machine learning and AI in power generation rely on digitalization. As the use of data becomes more important, what steps need to be taken to support AI and machine learning while still accounting for cybersecurity?

Kwan: Given that AI requires a lot of data, it’s vital to involve subject matter experts when creating the machine learning/AI models so as to understand the minimum required data set. It’s also important to follow IT best practices: ensure data is isolated from critical infrastructure like the control network, confirm that only least-privilege access is given to users or applications, and implement logging and audits per industry security measures.

In a typical power generation facility, the control system resides on an isolated and dedicated network. Meanwhile, business users are typically on separate networks that are more “open” to cybersecurity threats. All AI/ML applications require good, clean data for optimal functionality. As such, any data needed by AI/ML solutions that resides on the control network must be pushed to business networks in a safe and reliable manner. AI/ML applications should avoid reaching across isolated networks to access necessary data. This requires good IT practices, employing products that permit only one-way data flow (e.g., data diodes).
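
A minimal sketch of the push-only pattern, assuming a hypothetical collector address on the business network: the control-network process emits readings outward over connectionless UDP and never listens for inbound traffic. This mimics, but does not replace, a hardware data diode.

    # Illustrative push-only sketch: readings leave the control network; nothing
    # is accepted inbound. Host, port, and payload are hypothetical.
    import json
    import socket
    import time

    BUSINESS_HISTORIAN = ("10.20.30.40", 5005)   # assumed collector on the business network

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # send-only, no bind/listen

    def push_reading(tag, value):
        payload = json.dumps({"tag": tag, "value": value, "ts": time.time()})
        sock.sendto(payload.encode("utf-8"), BUSINESS_HISTORIAN)

    push_reading("GT1.exhaust_temp_C", 512.4)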

By preventing network traffic into the control network and only allowing data to be pushed out of it, this approach decreases the chance of cybersecurity incidents on the control network. Industry best practices for ensuring data integrity should be deployed to ensure data is not changed; if changes are necessary, they must be auditable. Lastly, any recommendations from the AI/ML application should be explainable so that subject matter experts can verify the results when appropriate.
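
Auditability of changes can be sketched with a hash-chained, append-only log, so later tampering with any record is detectable. The field names and the example change below are hypothetical.

    # Minimal sketch of an append-only, hash-chained audit record for data changes.
    import hashlib
    import json
    import time

    audit_log = []

    def append_audit(entry):
        prev_hash = audit_log[-1]["hash"] if audit_log else "0" * 64
        record = {"entry": entry, "ts": time.time(), "prev": prev_hash}
        # Hash covers the entry, timestamp, and previous hash, chaining the records
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode("utf-8")
        ).hexdigest()
        audit_log.append(record)

    def verify_chain():
        prev = "0" * 64
        for rec in audit_log:
            body = {k: rec[k] for k in ("entry", "ts", "prev")}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode("utf-8")
            ).hexdigest()
            if rec["prev"] != prev or rec["hash"] != expected:
                return False
            prev = rec["hash"]
        return True

    append_audit({"tag": "GT1.setpoint", "old": 410, "new": 420, "user": "ops_engineer"})
    print("audit chain intact:", verify_chain())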

POWER: What do you see as the future of AI and machine learning for power generation / utilities?

Kwan: Integration and adoption of machine learning and other artificial intelligence solutions are already increasing exponentially. At this pace, the outlook for advanced technologies points toward AI approaches becoming the gold standard for the future growth of the power generation/utilities industry.

Darrell Proctor is associate editor for POWER (@POWERmagazine).

