A
Paperitalo
Publication
Vol 1 Issue 8
May 2019
 
Article1

Big Data
  
Last year I was at a paper mill, sitting in front of a monitor looking at control logic and tags during a startup. I was able to force bits to override interlocks and enable equipment to run. I was able to make logic changes. I was able to loop test instrumentation. It was a fine day.

The next day I sat in front of the same monitor, unable to do any of those things. The workstation could not connect to any of the controllers. None of the workstations could connect. This is a frightening condition at any time, and during a startup it creates additional anxiety. We looked at switches, network cables, servers, network interface cards, processor free time, memory usage, and anything else we could think of.

Nothing seemed to explain our inability to communicate until we received some critical information. Approximately 2,000 tags had been added to a data historian. That additional load rendered the whole network inoperable.
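To see why a few thousand tags can matter, here is a back-of-the-envelope sketch in Python. Every number in it is an assumption chosen for illustration, not a figure from that mill, and it assumes each tag is polled individually even though real historians often batch their reads.

    # Rough estimate of the added network traffic from new historian tags.
    # All numbers are illustrative assumptions, not measurements from any mill.
    new_tags = 2000          # tags added to the historian
    scan_period_s = 1.0      # assumed polling interval per tag, in seconds
    bytes_per_read = 200     # assumed request-plus-response size per tag read

    reads_per_second = new_tags / scan_period_s
    added_bits_per_second = reads_per_second * bytes_per_read * 8

    print(f"Added packet rate: {reads_per_second:,.0f} reads/s")           # 2,000 reads/s
    print(f"Added bandwidth:   {added_bits_per_second / 1e6:.1f} Mbit/s")  # 3.2 Mbit/s

A few megabits per second may be tolerable on a modern plant LAN, but on older control network segments it is often the packet rate that pushes controllers and switches past their limits, which is why vendor loading limits should be checked before the tags are added.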

The age of Big Data provides big benefits.  Having data that we can easily access and analyze can unlock answers to the many mysteries of our processes.  However, Big Data without big design and execution can be perilous.

In my previous articles on Industry 4.0, IIoT, and Digitalization, I described the enabling technologies that make Big Data possible. As stated in those articles, our current age does not necessarily mean that there is a lot more data in our mills. It does mean it can be much easier and cheaper to get that data where you want it. Instead of the data being isolated to a field device or a control network server, we are bringing that data up into higher levels of the network hierarchy. That means the volume of data on our networks is indeed growing dramatically.

This Big Data impact is being seen in nearly every industry. The article "What's next for big data in process manufacturing" in the May-June 2018 issue of InTech included a study from McKinsey and Company. It reported that in 2010 the communication and media industry generated 776 petabytes of data. The banking industry did about the same, with 773 petabytes. Beating them both was government, with 911 petabytes. That sure seems like a lot, until you look at our industry. Manufacturing came in at 1,812 petabytes, leaving second place nowhere close. And that was nearly a decade ago.

Since then, the flow rate of data has only gone up. Because our industry is the biggest player in the game, there are a lot of applications on the market to use this data. Many of us are becoming more familiar with the more advanced features of Excel, but there are also applications custom-made to turn our process data into results. There are many places to put the data.

No matter where you put it, it won't matter if the data can't get there.  It also won't matter if the data is useless. Before investing in the destination, make sure the source doesn't require an enabling investment.
  • It is critical to have sufficient testing, calibration, and maintenance of instrumentation.  This is the origin of our data. We need to know that those measurements are telling us the truth.
  • It is challenging to keep PID loops tuned. It requires actuators that work with minimal stiction and hysteresis, as well as routine analysis of PID performance (a simple screening sketch follows this list). If these loops are oscillating or sluggish, our destination will be filled with data that is noisy or that reflects an underperforming process with lots of loops in manual mode. Analyzing data like this is not likely to produce beneficial results.
  • The network needs to handle the load in order to get the data where you want it without impacting operations.  If you can't see what you are doing when you are making the product, analyzing downtime data won't help profitability.
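As a starting point for the loop analysis mentioned above, here is a minimal Python sketch. It assumes you can export a loop's control error (setpoint minus measurement) at a fixed sample rate, and it flags sustained oscillation from the autocorrelation of the error, one common screening approach; the threshold and example signals are illustrative only.

    import numpy as np

    def looks_oscillatory(error, threshold=0.5):
        """Screen a loop's error signal (setpoint minus measurement) for cycling.

        A deep negative trough followed by a strong positive peak in the
        normalized autocorrelation is a simple sign of sustained oscillation.
        """
        e = np.asarray(error, dtype=float)
        e = e - e.mean()
        if not e.any():                      # flat-lined signal, nothing to assess
            return False
        acf = np.correlate(e, e, mode="full")[len(e) - 1:]
        acf = acf / acf[0]                   # normalize so lag 0 equals 1
        trough = acf.argmin()
        return acf[trough] < -threshold and acf[trough:].max() > threshold

    # Illustrative check: a noisy 60-sample-period cycle versus plain noise.
    t = np.arange(600)
    cycling = np.sin(2 * np.pi * t / 60) + 0.1 * np.random.randn(t.size)
    print(looks_oscillatory(cycling))               # True
    print(looks_oscillatory(np.random.randn(600)))  # False (typically)

A crude screen like this is no substitute for a full loop-performance package, but run across a mill's loops it can produce a short list of candidates to retune before their data is trusted downstream.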
All of this is the foundation that you build the house on. A weak foundation will make the most elaborate house crumble. The infrastructure needs to be in place before more data is sent to historians and applications.

In the same way that it is tempting to focus all of our attention on the finished product at the reel and not pay enough attention to the fiberline that enables us to make a good sheet, Big Data tends to make us focus on the destination instead of the source. When investing in Big Data, keep an eye on the foundation so that we are not running our process blindly.


Pat Dixon is Southwest Region Engineering Manager for Global Process Automation (GPA), a control system integration firm.


Pat and his colleague Bill Medcalf will be presenting a tutorial, "The Internet of Things/Machine Learning," at the IEEE Pulp and Paper conference on June 27. The tutorial is intended to help the user base know what questions to ask and what concerns to address before investing in IoT-based technologies. The registration link is https://pulppaper.org/tutorials/
 

Article3

Smart IoT Investments in Little Things Lead to Big Payoffs
 

The idea of automating homes and businesses to make them smarter isn't new. For decades, pundits and vendors have touted everything from automated light switches to connected thermostats and smart refrigerators. At this year's CES, a wide range of smart objects made their debut, including a smart dispenser that gives treats to dogs when no one's around, smart toilets, a kitchen touchscreen that responds to voice commands via Google Assistant, and a self-cleaning litter box.

As innovative as these home-friendly products are, they don't provide the value and impact delivered by smart objects for business. Emerging technologies such as artificial intelligence (AI), blockchain and the internet of things (IoT) have redefined the business landscape, and the IoT evolution has pushed the smart business model to a tipping point.

Matthew Lieberman is Marketing Leader at PwC and an innovative executive at the crossroads of marketing, media, and technology.  



Article4
6 Steps to a More Secure IoT
  
Vishal Salvi is Senior Vice President, Chief Information Security Officer and Head of Cyber Security Practice at Infosys.    

  Article5
Rittal puts Industry 4.0 theory into practice



Rittal is building a smart factory with a view to creating the world's most advanced production plant for compact and small enclosures.

With a €250 million investment, the Haiger plant will cover 24,000 m² of floor space and will soon house more than 100 high-tech machines. Around 9,000 AX compact and KX small enclosures will be manufactured every day, processing approximately 35,000 metric tons of steel annually.

Read the entire article here.

Author Bethan Grylls

  Article7
  Industry 4.0 to create more smart factories


 
The application of disruptive technologies in the manufacturing sector, such as the industrial Internet of Things (IIoT), advanced robotics, digital twinning, and blockchain, holds promise for creating the future smart factory.

This is according to a report by data analytics company GlobalData, which notes that the use of advanced technologies, data exchange techniques, and flexible automation will play a vital role in addressing much of the inefficiency in the traditional manufacturing sector.

With the advent of Industry 4.0, the global manufacturing sector is expected to see enhanced human-machine interaction, driving interconnectivity, information transparency, and autonomous decision-making.

Read the entire article here.                              

Author Sibahle Malinga, ITWeb journalist    

Coming up next month...
  • IoT cybersecurity innovates as more businesses adopt devices
  • Vodafone: How 5G will impact IoT's future
  • We Need to Push the AI Button to Unlock IoT
  • and much more

onlyPulpandPaperJobs.com


The only career center focused on meeting the needs of employers and job seekers in the Pulp and Paper Industry worldwide.




for valuable articles that can help you with your career decisions

Try our Paperitalo Supplier Directory


If you are looking for goods and services in the Pulp and Paper Industry, you need to check out the Paperitalo Directory!