Imagine purchasing some Windex from an online marketplace, only to discover that what you were actually sold was a crude mixture of rubbing alcohol and blue dye.
Would you buy from that marketplace again? Not likely.
For online retailers, there are few surer ways to lose consumers’ trust and their business than by unwittingly trafficking in fakes. Unfortunately, the problem of pirated and counterfeit goods is reaching a fever pitch in the U.S., and manufacturers face mounting pressure to evolve their methods of keeping counterfeits out of their supply chains. This problem highlights the need for next-generation solutions, and points to the emerging role of smart sensors.
Elevated Scrutiny for Manufacturers
On April 3, President Donald Trump signed a presidential memorandum aimed at stamping out trafficking in counterfeit and pirated goods. It outlines the federal government’s plans to double down on its efforts to curb the spread of imitation products.
Additionally, the memorandum commissions a Report on the State of Counterfeit and Pirated Goods Trafficking, to be prepared and issued within the next seven months. That report could be accompanied by more stringent enforcement actions, along with regulatory and policy changes that could hold third-party intermediaries like Amazon and eBay, and the manufacturers in their supply chains, liable for fraud.
White House National Trade Council Director Peter Navarro called the memorandum a “warning shot across the bow” for e-commerce giants like Amazon. But manufacturers should view it the same way.
Countering Counterfeits With Strategic Solutions
Faced with increased government scrutiny and pressure from online marketplaces, today’s manufacturers must ensure the legitimacy of their entire output.
While tracking each and every item produced might not have been justifiable from a cost perspective even a few years ago, the landscape has changed. With fraud running rampant (the market for brand counterfeiting is projected to reach $1.82 trillion globally by 2020), the onus is on manufacturers to make every product provably legitimate. The alternative is potentially massive product recalls alongside the looming threat of federal regulations and fines.
With 210 days before the federal report is due, it pays to be proactive.
Here are the three steps I’d suggest manufacturers take during that time to get ahead of the problem:
• Evaluate existing quality control processes: If ever there were a time for manufacturers to take a close look at how they’re curbing fraud in their supply chains, it’s now, before potential enforcement actions force the point. By convening an internal team of decision makers to proactively identify weak points in existing processes, manufacturers can take a critical first step toward evolving their fraud prevention tactics.
• Look into smart sensor technology: Fortunately for manufacturers, tech providers are developing increasingly sophisticated tools to fight the rise of counterfeiting. Specifically, intelligence-driven sensors are emerging as the manufacturing industry’s best bet for eliminating fraud. These embedded sensors have the potential to ensure the legitimacy of every item of a manufacturer’s output. And because they’re being developed to be both tiny (smaller than a pencil tip) and cost-efficient, they are poised to become the industry standard across manufacturers of all sizes.
• Closely follow government actions and align with law enforcement: As President Trump’s memorandum makes clear, the federal government will take a central role in monitoring fraud, and potentially hold both marketplaces and manufacturers accountable. Because of this, manufacturers should seek out opportunities to work proactively with government agencies to keep fraud at bay. While not every manufacturer will be able to launch large-scale fraud prevention initiatives like Alibaba’s, these types of efforts will serve manufacturers well.
By approaching the problem of counterfeiting with a proactive eye, manufacturers can take a critical step toward defending their output — and their brand identity — against an increasingly sophisticated and pervasive threat.
The British government has not yet decided whether to allow China's Huawei to supply parts for the U.K.'s new 5G wireless network, Digital Secretary Jeremy Wright said Thursday, as he condemned leaks from private government discussions on the issue.
Wright said government authorities and U.K. intelligence agencies are still assessing how best to strike the 'difficult balance between security and prosperity.'
He told lawmakers in the House of Commons that 'there has not been a final decision made on this subject.' The United States is lobbying allies to keep Huawei out of all 5G networks, saying that the Chinese government can force the company to give it backdoor access to data on its networks.
Huawei officials have denied that the company is a security risk, saying they have no links to the Chinese government and operate like any other international company.
Wright said it would be unrealistic to try to remove all Chinese equipment from U.K. telecom systems. 'Huawei is a significant player in this market; there are very few others,' he said.
Wright also warned lawmakers against leaking details of meetings of the National Security Council, after the Daily Telegraph newspaper reported Wednesday that the council had approved Huawei's involvement in 'non-core' parts of the 5G network.
'Officials, including the security and intelligence agencies, need to feel that they can give advice to ministers which ministers will treat seriously and keep private,' Wright said. 'If they do not feel that, they will not give us that advice and government will be worse as a result.'
Labour Party lawmaker Jo Platt said the government should launch a full inquiry into the leak, which comes amid a Brexit-fueled breakdown in government discipline. With Prime Minister Theresa May weakened by her failure to take Britain out of the European Union, many ministers are positioning themselves to try to replace her.
Platt said suggestions that a minister leaked the information as part of Conservative leadership jockeying were 'truly shocking.'
'Critical issues of national security should be handled with utmost care, not used as political ammunition in a Tory Party civil war,' she said.
The terms “horizontal integration” and “vertical integration” are well known from a number of contexts. From the operational perspective, a horizontally integrated company concentrates its strategies around its core competencies and establishes partnerships to build out an end-to-end value chain. A vertically integrated company, on the other hand, keeps as much of its value chain in-house as it can—from product development to manufacturing, marketing, sales, and distribution.
In the world of business growth strategy, horizontal integration refers to the acquisition of companies that address the same customer base with different but complementary products or services. In this manner, the acquiring company can increase market share, diversify its product offerings, and more. A vertical growth strategy, conversely, involves acquiring companies that bring new capabilities to the table in order to reduce manufacturing costs, protect access to important supplies, react faster to new market opportunities, and more.
When it comes to production, horizontal integration has come to refer to well-integrated processes at the production-floor level as well, while vertical integration means that the production floor is tightly coordinated with higher-level business processes such as procurement and quality control.
In this article we explore how Industry 4.0 has further elevated the significance of horizontal and vertical integration, making them the very backbone on which the Smart Factory is built.
Horizontal/Vertical Integration As the Backbone of Industry 4.0
When it comes to horizontal integration, Industry 4.0 envisions connected networks of cyber-physical and enterprise systems that introduce new levels of automation, flexibility, and operational efficiency into production processes. This horizontal integration takes place at several levels:
• On the production floor: Always-connected machines and production units each become an object with well-defined properties within the production network. They constantly communicate their performance status and, together, respond autonomously to dynamic production requirements. The ultimate objective is smart production floors that can cost-effectively produce lot sizes of one and minimize costly downtime through predictive maintenance.
• Across multiple production facilities: If an enterprise has distributed production facilities, Industry 4.0 promotes horizontal integration across plant-level Manufacturing Execution Systems (MES). In this scenario, production facility data (inventory levels, unexpected delays, and so on) are shared seamlessly across the entire enterprise and, where possible, production tasks are shifted automatically among facilities in order to respond quickly and efficiently to production variables.
• Across the entire supply chain: Industry 4.0 calls for data transparency and high levels of automated collaboration across the upstream supply and logistics chain that feeds the production processes, as well as across the downstream chain that brings the finished products to market. Third-party suppliers and service providers must be securely but tightly integrated horizontally into the enterprise’s production and logistics control systems.
Vertical integration in Industry 4.0 aims to tie together all logical layers within the organization, from the field layer (i.e., the production floor) up through R&D, quality assurance, product management, IT, sales, and marketing. Data flows freely and transparently up and down these layers so that both strategic and tactical decisions are data-driven. The vertically integrated Industry 4.0 enterprise gains a crucial competitive edge by being able to respond appropriately and with agility to changing market signals and new opportunities.
The Challenges of Horizontal/Vertical Integration in Industry 4.0
The horizontal and vertical integration aspirations of Industry 4.0 are quite clear and easy to understand. But, as is often the case, the challenges to achieving this vision are considerable.
Breaking Down Silos
Industry 4.0 levels of horizontal and vertical integration require breaking down data and knowledge silos, which is no easy task. It starts with the production floor itself, where equipment and production units from a range of vendors provide varying levels of automation, are equipped with different sensors, and use different communications protocols. Put simply, they often do not “speak the same language,” and a meta-network needs to be established to resolve these communications disparities.
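To illustrate what resolving such disparities can look like in software, here is a minimal sketch of an adapter layer that normalizes machine readings from two hypothetical vendor formats into one common schema. The field names and payload shapes are invented for the example; a real meta-network would sit on top of industrial protocols such as OPC UA or MQTT.

```python
# Minimal sketch: normalize sensor payloads from two hypothetical
# vendor formats into a single common schema. All field names invented.

def from_vendor_a(payload):
    # Vendor A reports temperature in Fahrenheit under "temp_f"
    return {
        "machine_id": payload["id"],
        "temperature_c": (payload["temp_f"] - 32) * 5 / 9,
        "status": payload["state"].lower(),
    }

def from_vendor_b(payload):
    # Vendor B reports Celsius directly under "temperature"
    return {
        "machine_id": payload["machine"],
        "temperature_c": payload["temperature"],
        "status": payload["status"].lower(),
    }

ADAPTERS = {"vendor_a": from_vendor_a, "vendor_b": from_vendor_b}

def normalize(vendor, payload):
    """Route a raw payload through the adapter for its vendor."""
    return ADAPTERS[vendor](payload)
```

With adapters like these registered for each equipment vendor, higher-level systems only ever see the common schema, which is the essence of getting heterogeneous machines to "speak the same language."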
Data Security and Privacy
Horizontal integration in Industry 4.0 requires the sharing of data outside the organization with suppliers, subcontractors, partners, and, in many cases, customers as well. This level of transparency is empowering in terms of production agility and flexibility, but it raises the challenge of ensuring that the data of all stakeholders are kept secure and made available strictly on a need-to-know basis.
Scaling IT Systems and Infrastructure
Industry 4.0 drastically increases the volume and velocity of data being collected and analyzed in order to support enhanced levels of horizontal and vertical integration. In many instances, IT systems and infrastructure will be forced to undergo a fundamental change in order to support the enterprise’s journey towards digital transformation. Industry 4.0 implementations are often a compelling catalyst for moving enterprise databases and workloads to the cloud, where they are more easily accessible to a wide range of stakeholders.
Strong Orchestration
As the organization’s IT systems and production processes become more integrated and more complex, enterprises will likely need to adopt strong orchestration platforms that can provide end-to-end visibility and actionable insights across diverse, distributed systems and entities. These platforms typically aggregate structured and unstructured data from existing enterprise systems in order to extract domain-specific actionable insights, and should provide end-to-end product-analytics solutions for both manufacturing companies and their suppliers.
Industry 4.0 combines cutting-edge data analytics, machine learning, and artificial intelligence technologies to streamline and customize manufacturing processes. At its very core is the vision of horizontally integrating the production processes themselves so that they can be self-learning, self-healing, and agile. For Industry 4.0, horizontal integration also means developing a seamless, data-centric, collaborative network across the organization’s entire supply chain. Vertical integration does the same for the organization’s own business units, ensuring an unprecedented level of alignment between production processes and core business activities such as ICT, sales, marketing, logistics, engineering, and more.
The measurable benefits of such integration include lower production costs and the enhanced ability to cost-effectively manufacture small customized batches—all without detracting from the highest quality standards.
NEWTON, Mass., USA - Aircuity, the leading global provider of measurably better environments, is excited to announce several new products and additional features to its analytics platform. Based on two years of experience with its new cloud-based platform, these innovations are squarely aimed at the fast-growing commercial building health & wellness market. Aircuity recently completed shipment of its solution to two of the most iconic U.S. commercial office projects and is extremely active internationally in the same space thanks to its best-in-class life cycle cost and most accurate Indoor Environmental Quality (IEQ) sensing approach.
'With our 2.0 software platform complete, these newly released innovations will allow us to deliver an even stronger value proposition to both current and new clients, starting with a more compelling first cost and improved ROI,' says Aircuity CEO Dan Diehl. 'Fundamentally, we have been selling and delivering improved environments for a decade, and this will enable mass market acceptance of a premium solution.'
These new products and features address the fast-growing building wellness movement (and corresponding standards like WELL™ and Fitwel®), which is built on recent academic findings linking environmental quality to improved cognitive function. Luca Mazzei, Aircuity's Chief Growth Officer, states: 'Healthy, safe and more productive buildings are the fastest growing segment in the intelligent buildings category and we see increasing adoption in the global market.'
The solution enhancements include:
• MD100 Tubing: This is a plenum-rated, more cost-effective MicroDuct tube for multi-parameter sensing in healthy building applications. It allows for the precise capture and measurement of volatile organic compounds and particles, in addition to carbon dioxide and carbon monoxide levels, as required by a growing number of certifications like WELL.
• Architectural Series Wall Probe: This new wall probe is designed to be modern and aesthetically pleasing in commercial offices and other public gathering facilities.
• Increased number of sensed locations on each sensor suite: For healthy building applications, this allows for significant cost reductions, reducing first cost and increasing ROI.
MILWAUKEE, April 29, 2019 — Industrial workers can now more easily use the data from their devices to predict production problems and improve processes with their existing automation and control skill set. The new FactoryTalk Analytics LogixAI module, formerly known as Project Sherlock, uses artificial intelligence (AI) to detect production anomalies and alert workers so they can investigate or intervene, as necessary.
Many existing analytics technologies demand deep expertise in both data science and industrial processes. But this add-on module for ControlLogix controllers eases that burden by doing the job of a data scientist. It fits directly into a control chassis and streams controller data over the backplane to build predictive models. It can continuously monitor a production operation, detecting anomalies against its derived understanding.
'The FactoryTalk Analytics LogixAI module makes predictive analytics more accessible to help more workers make better production decisions,' said Jonathan Wise, product manager, Rockwell Automation. 'The module learns your ControlLogix application and tells operators and technicians when things are changing in unexpected ways. This can help them get ahead of product quality issues and protect process integrity.'
For example, the module can help operators spot performance deviations in equipment like mixers that could impact product quality or lead to downtime. It can also be used as a virtual sensor. Instead of workers taking a reading, like the humidity of a prepared food product, the module can analyze variables from line assets like sprayers, dryers and burners to predict the measurement virtually.
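To make the virtual-sensor idea concrete, here is a minimal sketch (not Rockwell's implementation) of the underlying pattern: fit a model that predicts a quality measurement from an upstream process variable, then flag readings that drift from the prediction. The data, the dryer-temperature predictor, and the tolerance are all invented for the example.

```python
# Invented example data: dryer temperature (°C) vs. measured humidity (%)
temps = [78.0, 80.0, 81.0, 82.0]
humidity = [12.8, 12.1, 11.9, 11.4]

# Closed-form simple linear regression: humidity ≈ slope * temp + intercept
n = len(temps)
mean_t = sum(temps) / n
mean_h = sum(humidity) / n
slope = (sum((t - mean_t) * (h - mean_h) for t, h in zip(temps, humidity))
         / sum((t - mean_t) ** 2 for t in temps))
intercept = mean_h - slope * mean_t

def predict_humidity(temp):
    """Virtual sensor: estimate product humidity from dryer temperature."""
    return slope * temp + intercept

def is_anomalous(measured, temp, tol=1.0):
    """Flag a measured value that deviates from the model by more than tol."""
    return abs(measured - predict_humidity(temp)) > tol
```

A production system would of course use many variables, far more data, and continuously retrained models, but the principle is the same: the model stands in for a physical instrument and doubles as an anomaly detector.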
Workers can then be notified of problems by configuring alarms on a human machine interface (HMI) or dashboard. Future features of the module will go further, helping workers focus their problem-solving or automate the optimization of a process.
The FactoryTalk Analytics LogixAI module is the latest addition to the FactoryTalk Analytics portfolio from Rockwell Automation. The portfolio consists of FactoryTalk Analytics for Devices, which learns about an automation system's structure to tell workers about problems with individual devices. The LogixAI module expands on this by learning about an automation system's application and helping identify anomalies with its overall function.
Both products work individually, but each will benefit the other in future iterations. The FactoryTalk Analytics platform aggregates multiple sources of data, so workers can discover new insights. FactoryTalk Analytics for Devices and the LogixAI module will both be data sources for the platform going forward.
TOKYO, JAPAN (April 25, 2019) – SoftBank Corp.’s HAPSMobile and Alphabet's Loon have formed a long-term strategic relationship to advance the use of high altitude vehicles, such as balloons and unmanned aircraft systems (UAS), to bring connectivity to more people, places, and things worldwide. As part of the new relationship and HAPSMobile’s financial and investment strategy, HAPSMobile has decided to invest $125 million USD in Loon. Loon has obtained the right to invest the same amount in HAPSMobile in the future. Additionally, to strengthen the relationship, the companies are actively exploring commercial collaborations to accelerate the deployment of high altitude network connectivity solutions, with a focus on expanding mobile internet penetration, providing internet of things (IoT) applications, and aiding in the deployment of 5G.
Specifically, the companies have entered into formal negotiations on a number of areas of potential technical and commercial collaboration, including:
A wholesale business that would allow HAPSMobile to utilize Loon’s fully-functioning vehicle and technology. Likewise, Loon would be able to utilize HAPSMobile’s aircraft, which is currently in development, upon its completion.
A jointly developed communications payload that is adaptable to multiple flight vehicles and various ITU compliant frequency bands.
A common gateway or ground station that could be deployed globally and used by both Loon and HAPSMobile to provide connectivity over their respective platforms.
Adapting and optimizing Loon’s fleet management system and temporospatial SDN for use by HAPSMobile.
Forming an alliance to promote the use of high altitude communications solutions with regulators and officials worldwide.
Enabling flight vehicles from each party to connect to and share the same network connectivity in the air.
With the deployment of such technology, people will be reachable in areas where connectivity is lacking, such as mountainous terrain, remote islands, and developing countries.
Each company brings important and complementary strengths to the table. HAPSMobile, SoftBank’s subsidiary, is a joint venture with AeroVironment, Inc. HAPSMobile has completed development of HAWK 30, its aircraft-type stratospheric telecommunications platform. SoftBank is one of Japan’s telecommunications carriers and has considerable experience in network planning, operations, and bringing connectivity solutions to market.
Loon brings technical leadership through an established, fully-functioning high altitude vehicle and communications system that has to date flown over 30 million kilometers and connected hundreds of thousands of users worldwide. In addition, Loon has nearly a decade of experience developing, launching, flying, and managing a high altitude platform.
High altitude network connectivity platforms operate in the stratosphere, above ground infrastructure but below satellites, allowing for near-ubiquitous coverage that avoids ground clutter and significant latency issues. These strengths make such vehicles a promising solution for extending mobile coverage to those who need it, as well as for IoT and 5G use-cases. HAPSMobile and Loon, combining the strengths and technology of each party, seek to provide next generation global connectivity and revolutionize the world's mobile networks.
It may be the sixth year for the Brooklyn 5G Summit, but in the minds of several speakers, 2019 is also Year Zero for 6G. The annual summit, hosted by Nokia and NYU Wireless, is a four-day event that covers all things 5G, including deployments, lessons learned, and what comes next.
This year, that meant initial analysis of terahertz waves, the frequencies that some researchers believe will make up a key component of the next next generation of wireless. In back-to-back talks, Gerhard Fettweis, a professor at TU Dresden, and Ted Rappaport, the founder and director of NYU Wireless, talked up the potential of terahertz waves.
As a quick primer on the electromagnetic spectrum: terahertz waves (despite what the name implies) occupy the 300 gigahertz to 3 terahertz band of spectrum. That means their frequencies are higher than the highest frequencies that will be used by 5G, which are known as millimeter waves and fall between 30 and 300 GHz.
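The band names follow from the frequency-to-wavelength relationship λ = c/f, which can be checked in a couple of lines of code:

```python
C = 299_792_458  # speed of light in m/s

def wavelength_mm(freq_hz):
    """Wavelength in millimeters for a given frequency in hertz (λ = c/f)."""
    return C / freq_hz * 1000

# Millimeter waves (30-300 GHz) span roughly 10 mm down to 1 mm...
mmwave_band = (wavelength_mm(30e9), wavelength_mm(300e9))

# ...while "terahertz" waves (300 GHz-3 THz) span roughly 1 mm down to 0.1 mm,
# which is why the name slightly overstates their frequency range.
thz_band = (wavelength_mm(300e9), wavelength_mm(3e12))
```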
In his talk, Fettweis discussed the potential of terahertz waves and 6G to solve some of the problems of 5G. He pointed to the trend established by previous generations of wireless: While 1G gave us mobile telephony, 2G improved on it and addressed some of its predecessor’s shortcomings. 3G and 4G did the same with mobile data. Now that we’re moving on to 5G, which is expected to support many new applications like the Internet of Things and AR/VR, Fettweis said it is only natural that 6G will, like 2G and 4G, correct the flaws of the previous generation.
As to what, exactly, terahertz waves will correct—that’s still largely unknown. Service providers around the world are only now rolling out their mobile 5G networks, and it will take time to identify the shortcomings. Even so, the physical properties of terahertz waves point to some general ways in which they could help.
Terahertz waves, as mentioned, have shorter wavelengths and higher frequencies than millimeter waves. That means terahertz waves should be able to carry more data more quickly, though they will not be able to propagate as far. In general, the introduction of terahertz waves into mobile networks could therefore address any areas in which 5G isn’t able to deliver high enough data throughput or low enough latency. During his talk, Fettweis shared the results of tests in which terahertz waves were able to transmit 1 terabit per second of data for a grand total of 20 meters (admittedly, not very far at all).
If those results seem less than impressive, they didn’t dissuade Rappaport, who gave a very serious talk on the future of terahertz waves as they relate to 6G and, dare I say it, 7G. Rappaport, who was one of the pioneering researchers into millimeter waves and played a large role in proving they would be viable for 5G networks, suggested that with these frequencies, as well as additional improvements in cellular technology, we’ll someday see thousand-dollar smartphones with the computational power of the human brain.
Of course, it’s all highly speculative at this point, but if past trends continue, we can expect to see service providers harnessing terahertz waves for communications in areas with many devices or large amounts of data a decade from now. And that will all be thanks to the fundamental research that’s just getting underway today.
As the world races to deploy speedy 5G mobile networks on the ground, some companies remain focused on floating cell towers in the sky. During the final session of the sixth annual Brooklyn 5G Summit on Thursday, Silicon Valley and telecom leaders discussed whether aerial drones and balloons could finally begin providing commercial mobile phone and Internet service from the air.
That same day, Alphabet subsidiary Loon, a balloon-focused graduate of the Google X research lab, unveiled a formal partnership with SoftBank’s HAPSMobile to develop both solar-powered balloons and drones to expand mobile Internet coverage and aid in deploying 5G networks. No high-altitude network connectivity services have taken off commercially so far, but some Brooklyn 5G Summit speakers were confident that it could happen soon.
“The benefit is in our hands in terms of truly leveraging 5G in combination with the significant paradigm shift when it comes to UAS—drones—and also satellites,” said Volker Ziegler, CTO at Nokia Bell Labs.
Nobody expects the high-flying Loon balloons and HAPSMobile’s drones to compete directly with ground-based 5G networks in the near future. Until recently, it hasn’t been easy to develop a balloon or drone platform that is cost-effective enough to even consider using for telecommunications, said Salvatore Candido, principal engineer at Alphabet and CTO of Loon. But such high-flying platforms may help fill the gaps when coverage is lacking in rural or otherwise under-served communities. (Even rural parts of the United States may miss out under current 5G network deployment plans.)
Fleets of balloons and drones could also provide coverage on a temporary basis, such as during a major pre-planned event like the Super Bowl or in the wake of a natural disaster. Nokia previously partnered with Alphabet’s Loon when the latter deployed its experimental balloon fleet to provide basic Internet service to 200,000 people in Puerto Rico after the U.S. island territory was left devastated by Hurricane Maria in 2017. The balloons carried LTE technology from Nokia as part of a bigger coalition involving AT&T and T-Mobile.
“There’s a billion people in the world who don’t have enough connectivity, whether that’s temporarily because of a hurricane or just because of where they live,” Candido said. “I think all these new technologies coming together makes it possible to create networks that might begin to cover huge numbers of those people.”
Loon has not yet begun deploying 5G equipment on its balloons—though the partnership with SoftBank’s HAPSMobile suggests that might someday be possible. But the advent of terrestrial 5G networks could also make it easier for companies to deploy Internet drones or Internet balloons. Nokia’s Ziegler pointed out that 5G offers advantages over 4G LTE when implementing a relay system that bounces the signal around between groups of balloons or drones to extend coverage well beyond the ground station where the signal originates.
The availability of 5G network technology could also make it easier, from an air traffic control standpoint, to track and manage a large group of drones, said Giuseppe Loianno, an assistant professor of electrical and computer engineering at New York University and director of the Agile Robotics and Perception Lab.
When the time comes, it will be important for telecommunications companies to create demand for high-flying mobile phone and Internet services by showing what they can do for communities or customers, said Dallas Brooks, director of the Raspet Flight Research Laboratory at Mississippi State University and associate director of the ASSURE FAA UAS Center of Excellence. He invited Brooklyn 5G Summit attendees to collaborate with him and other universities participating in the Federal Aviation Administration’s research and testing program for integrating drones into U.S. national airspace.
Loon may be among the first to take that advice with its balloons—even if they won’t deliver 5G service in the beginning. The company’s stratospheric balloons have already won their first commercial contract with Telkom Kenya to provide mobile phone service for some of Kenya’s almost 50 million citizens. But Loon certainly won’t be alone in trying to make such projects work in the 5G era. “There is no shortage of people trying to create pseudosatellites in the stratosphere,” Candido said.
Gone are the days when cyberattacks were a passing concern for technology manufacturing companies. As smart machines replace legacy equipment, the volume of cyberattacks is growing fast, increasing the risk of production slowdowns, product defects, and lower productivity.
Hackers know that many manufacturers, particularly those that run 24/7 production lines or operate in a just-in-time manufacturing environment, cannot tolerate a lengthy disruption without adverse business effects. This vulnerability has resulted in a sharp increase in ransomware attacks, which use malicious software to hold a system hostage until a ransom is paid. Essentially, it is a form of extortion.
The Chubb Cyber Index reveals that ransomware attacks against manufacturers exceed similar attacks against all other industry segments, including healthcare—a traditional target. Hackers believe that a hospital is more inclined to pay a ransom in order to restore operations for patient safety, and they expect manufacturers to do the same to keep the factory humming.
Similarly, Verizon’s 2018 Data Breach Industry Report, which cites cyber espionage as a growing threat, showed that data breaches affecting manufacturers had also increased. On top of extorting a ransom, hackers are increasingly interested in stealing a company’s research and development data, proprietary product blueprints, and intellectual property to sell on the Dark Web.
Upsides and Downsides
The elevated threat of a cyberattack puts technology manufacturers in a new and difficult position: a previously low-risk industry now has a high-risk profile. This is largely the result of the industry’s embrace of the Industrial Internet of Things (IIoT)—the internet-enabled connections between operational technology (OT) and information technology (IT).
By leveraging smart machines, technology manufacturers can make higher-quality products, boost productivity, and gain real-time insights into the supply chain to shift production where needed. These and other benefits are attainable thanks to sensor-generated machine data that travels from OT systems to IT systems, where the data is analyzed for business purposes. While these sensors deliver great benefits to the manufacturer, they also offer a new avenue for hackers to exploit, providing a new opportunity for data to be stolen or compromised.
Despite the very serious risks posed, the rewards of the IIoT have made it an integral part of efficient production, and its use will continue to increase. For that reason, technology manufacturers must harden the connections between their OT and IT systems to reduce unauthorized network intrusions. But how?
The first step in this process is to conduct a technology audit of the IT and OT systems to determine which assets are connected to the network. For example, it's not uncommon to find an old printer on the network. In the past, a stray printer on the network wasn’t much of a cyber risk, but now that the IT and OT systems share that same network, a hacker can potentially exploit the printer’s antiquated operating system to gain entry onto the network and into the OT systems.
An audit will also ferret out evidence of unauthorized wireless local area networks within and around the plant’s perimeter. It’s advisable for audits to follow the cybersecurity standards, guidelines, and best practices of a recognized framework, such as the one provided by the National Institute of Standards and Technology (NIST).
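As an illustration of the inventory step such an audit produces, flagging out-of-date assets can be as simple as comparing each device's operating-system end-of-life date against today's date. The device list and field names below are hypothetical, a minimal sketch rather than a real discovery tool:

```python
from datetime import date

# Hypothetical asset inventory; in practice these records would come from a
# network discovery scan, not a hard-coded list.
INVENTORY = [
    {"host": "plc-07",     "segment": "OT", "os": "VxWorks 6.9", "os_eol": date(2020, 12, 31)},
    {"host": "printer-2f", "segment": "IT", "os": "Embedded XP", "os_eol": date(2014, 4, 8)},
    {"host": "hmi-01",     "segment": "OT", "os": "Win10 IoT",   "os_eol": date(2029, 1, 9)},
]

def flag_risky_assets(inventory, today):
    """Return assets whose operating system is already past end-of-life."""
    return [asset for asset in inventory if asset["os_eol"] < today]

for asset in flag_risky_assets(INVENTORY, date(2019, 6, 1)):
    print(f"{asset['host']} ({asset['segment']}): {asset['os']} is past end-of-life")
```

An antiquated printer like the one above would be flagged immediately, prompting its removal or isolation before a hacker finds it.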
Segment the Network
Think of an office building with a locked front door but many unlocked rooms inside it. Once a burglar gets through the front door, he has access to the rest of the building. Network segmentation locks all the doors, so only those with keys can enter the various sections of the network. With defenses against unauthorized network access in place, more sensitive data can be segmented behind additional “doors” locked with higher levels of security.
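The locked-doors idea can be sketched as a deny-by-default allowlist of permitted segment-to-segment flows. The segment names and ports below are illustrative assumptions, not a real firewall configuration:

```python
# Allowlist of permitted (source_segment, dest_segment, port) flows.
# Anything not listed is denied -- every "door" is locked by default.
ALLOWED_FLOWS = {
    ("IT", "DMZ", 443),         # office workstations -> web proxy
    ("OT", "HISTORIAN", 4840),  # plant floor -> OPC UA data historian
}

def is_allowed(src, dst, port):
    """Deny-by-default check: only explicitly allowed flows may pass."""
    return (src, dst, port) in ALLOWED_FLOWS

print(is_allowed("IT", "DMZ", 443))  # an allowed door
print(is_allowed("IT", "OT", 502))   # denied: office machines cannot reach PLCs
```

The point of the sketch is the default: a flow is blocked unless someone deliberately handed out a key for it.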
Hiring a third-party penetration testing firm is another smart tactic. These firms employ technicians who will try to defeat security measures and hack into the network, and the lessons learned from these exercises can be used to further bolster defenses.
Keep in mind that vendors add value to a business, but they also put it at increased risk. Leverage software and other due diligence methods to vet and pre-qualify third- and fourth-party vendors before they enter the business ecosystem.
Last but not least, given the role that social engineering plays in hackers’ strategies, it’s incumbent upon all companies to train their employees to detect and report evidence of phishing attacks and other types of malicious activity. Some manufacturers also run mock phishing attacks to find specific vulnerabilities and sharpen their training programs—a smart tactic, given that the Chubb Cyber Index notes that more than 30 percent of cyber claims in 2018 involved phishing attacks.
A bad night's sleep, or a night without enough sleep, may not just trigger weight gain; it can actually kill you or leave you badly injured. A Harvard Medical School study — one of many on the subject — found that insomnia is to blame for 274,000 workplace injuries and errors each year, costing companies billions.
That's not really surprising. According to the CDC's National Institute for Occupational Safety and Health (NIOSH), 38 percent of American workers get less than the doctor-recommended seven hours of sleep a night. It makes sense, given the pressure-packed lifestyle of many Americans — long commutes, overtime at work, night shifts, moonlighting, and plain old family pressures.
But workers' sleep problems are employers' problems, too — especially in the manufacturing sector. According to the National Safety Council, 63 percent of workers in the manufacturing sector reported feeling tired at work, while a massive 55 percent of employers reported finding workers asleep on the job. Meanwhile, 73 percent of employers in the manufacturing sector report that fatigue causes decreases in productivity, and 44 percent said fatigue was a contributing factor to incidents on the job. Overall, 13 percent of all workplace injuries can be attributed to fatigue, the study said.
Those injuries, according to the Harvard study, cost companies as much as $31 billion a year. But those are just the “direct” costs of dealing with injuries; they don't take into account productivity lost to “presenteeism,” where workers show up but are too tired to do their jobs productively.
A 2018 Japanese study shows that while companies lost an average of $520 per employee per year to absenteeism, they lost $3,055 per employee annually to presenteeism. That study covered losses from employees suffering from health problems, but it shows just how extensive the hidden costs of non- or low-productivity employees can be, and the same applies to losses from workers who don't sleep enough.
If insufficient sleep is as much a problem for employers as it is for workers, it stands to reason that employers would want to implement policies that encourage workers to get a good night's sleep. There are several things they can do:
According to the National Safety Council, numerous on-the-job risk factors encourage fatigue. Among them: shift work, especially overnight, rotating, or irregular shifts; overtime, where workers spend more than 50 hours a week on the job; long shifts; particularly physically demanding work; and long commutes.
Many employees — and employers — report being subject to more than one of these risk factors. “While a conclusion cannot be drawn in a definable fashion that two or more risk factors multiply risk, it is common sense that the more risk factors individuals have, the higher the probability that their work quality, productivity and safety will be affected,” the Council said, adding that, when possible, employers can and should “look for ways to design working days to minimize the number of concurrent fatigue risks. When concurrent fatigue risks are unavoidable, employees should be given additional recovery time.”
Alleviating risk factors, however, doesn't always work. If an employee lives an hour's travel from the facility, there's little the company can do to change that. Still, organizations need to make sure their employees are getting a good night's sleep: if risk factors can't be eliminated, maybe they can be reduced. For example, an organization can provide lodging for employees who work late, for the occasional times they can't leave because of a problem or an emergency. Organizations can also structure work schedules to give workers consistent, set hours, keeping their circadian rhythms stable.
Companies can also provide incentives for employees to get the seven hours of sleep scientists say they need. Sleep can be “gamified” through apps and contests that award perks to workers who “win” by getting a good night's sleep.
Organizations should also encourage the use of technology that promotes sleep. As a holiday gift, for example, companies can give employees a wearable sleep headband, noise-masking sleepbuds, or another sensor- or technology-based device that improves sleep.
An employee who loses a finger on the production line, a worker who, because of fatigue, misses an important step in the production process—the direct losses mount up, resulting in higher health care costs and money and time lost to poor-quality production. Then there are the indirect costs, like lawsuits from customers who didn't get their merchandise on time, the civil suit by the injured employee, and the eventual loss of customers and talent. These are nightmare scenarios, and insufficient sleep can be a significant factor in all of them. Companies need to protect themselves, and they can do so by helping their workers sleep better.
According to the statistics, 47.3 percent of internet users made online purchases in 2018, 6 percent more than in 2013. And this figure may well grow, since online shopping is more convenient and less time-consuming: people don’t have to spend time traveling to a store, or even abroad.
Given the growing number of fraudulent schemes on the web, and fly-by-night sites that disappear after taking people’s money, the security of transactions has become paramount for customers. As a result, it is becoming harder and harder for e-commerce sites to stay afloat.
If your range of items is diverse and your pricing attractive, but sales leave much to be desired, it’s time to change the situation. Try putting the following tips into practice:
Add customer reviews.
Psychologists have long discussed the power of social proof. In his book “Influence: The Psychology of Persuasion,” Robert Cialdini details how the principle of social proof influences people when they make decisions.
The figures bear this out:
· Customer reviews can increase page views by 10 percent within 2 days.
· 63 percent of consumers prefer to buy goods on sites with ratings and reviews.
· On average, a user reads 4 to 7 reviews before paying for a purchase.
· 96 percent of retailers believe that reviews contribute to better sales.
· Positive reviews increase the probability of purchase by 55 percent.
· 67 percent of users trust reviews as a source of information.
And here is a bit of advice: favor sign-in through social networks. That way, potential customers will see that the reviews are posted by real users, and that the e-commerce site is trustworthy.
Let consumers know your team by sight.
Because some e-commerce sites are created simply to take money and vanish (once the transactions are complete, such shops cease to exist), it’s advisable to give future clients information about your team (links to their social network profiles are welcome). The more information people have about an e-commerce site and its team, the more trustworthy it looks.
Display a safety data sheet.
To start, let us define the notion of a safety data sheet (SDS) for a site. An SDS is information through which a site shows users that it is safe to share data with it. In a way, it’s comparable to presenting documents: before entrusting your safety to a person introducing himself as a police officer, you have the right to ask him to verify his identity. That way, you can see for yourself that this person really works for law enforcement and is worth trusting.
The presence of a safety data sheet proves that:
· the e-commerce site uses HTTPS encryption
· the site belongs to a registered company (not a fake one)
· the domain name holder has confirmed ownership of the site
Provide clients with a transparent privacy policy
Users naturally want to know what information a site stores about them. Nowadays, that determines whether or not they will be spammed with ads, and whether their privacy is at risk.
Thus, it is recommended to store as little personal information about an e-commerce site’s customers as possible, and of course this should be highlighted in the privacy policy statement.
Make your e-commerce site safe
A website’s security shapes the security of its users. When a website’s team takes all reasonable precautions to guard the site against possible threats, the site is far more likely to succeed.
Here’s what to do:
· protect your corporate network with a VPN service (so hackers can’t steal corporate information or monitor your users’ activity);
· educate your team about possible threats (knowledge is power);
· regularly update the devices and software that run your site (regular updates keep the site resistant to new viruses and malware);
· don’t forget about backups (if the e-commerce site fails to withstand a hacking attack, the data can be restored).
Choose a reliable method of payment
While cryptocurrency payments are arguably the most secure option these days, they’re not comprehensible to all users. For that reason, it is better to offer more traditional payment methods: credit and debit cards, PayPal, and so on.
In addition, it’s recommended to add an extra layer of protection to users’ transactions.
In this way, the combination of security measures and transparency on your e-commerce site will attract more buyers and grow your profit.
Reshoring is a hot topic in the United States. Politically charged, the practice promises job growth for workers, profits for manufacturers, and a rebirth of the ‘Made in the United States’ movement. Small- to medium-sized manufacturers can prepare for a manufacturing resurgence driven by reshoring and regulatory changes.
Reshoring is the relocation of a business operation that was moved overseas back to the country it originated in. In this case, United States manufacturers are bringing operations home from other manufacturing economies, such as those in Asia. But what’s driving this change?
An impending regulation from the International Maritime Organization (IMO) mandates a reduction in fuel emissions, in a bid to minimize the environmental impact of ships. The regulation takes effect in January 2020 and is expected to immediately increase demand for higher-quality shipping fuels.
By increasing the cost of shipping goods across oceans, the regulation means it will become more cost-effective for goods normally made in China (which accounts for 20 percent of the world’s manufacturing output) to be sourced from homegrown or near-shore manufacturing facilities. Sounds great, doesn’t it?
News headlines suggest that manufacturing is returning home. And yet a renaissance of America’s industry was never going to be so simple, especially for small and medium-sized manufacturers. Back in the 1970s, when the United States manufactured 18 percent of the world’s total goods, the industry looked a lot different from the sector we know today. While automation and robotics were present in some factories, the technology was certainly not commonplace. In fact, one argument for why Asia has stormed ahead in the global manufacturing race is the region’s aggressive deployment of automation.
China, for instance, is the world’s biggest market for industrial robotics, with 2016 sales close to the combined volume of Europe and both North and South America. The United States is not averse to automation, but its use has been limited to large-scale facilities. As the manufacturing renaissance begins, this will need to change.
Small- to medium-sized manufacturers have surely heard of the smart factory movement, and they could be forgiven for thinking the investment is too expensive. Fortunately, that’s not the case. Investing doesn’t require a complete systems overhaul; instead, industrial parts suppliers are enabling manufacturers to make small, incremental changes to automate production.
Consider an example. A manufacturer of peripheral products for automotive production, such as electrical control and security products, may already use a SCARA robot to assemble circuit boards. Yet the facility may not have a programmable logic controller (PLC) able to coordinate the robot and any associated automation, such as a conveyor, in the most effective manner.
By investing in new technology to complement automation, the manufacturer could reap extensive production rewards. In this case, the PLC system could enable complete synchronization of the conveyor and SCARA robot, allowing circuit boards to be assembled without pauses in production.
Should the IMO’s fuel emission regulation increase costs for offshore imports, improved productivity in U.S. facilities is likely to be extremely important. America cannot build entirely new production facilities to replace the huge volume of manufactured goods it currently imports from Asia. Still, manufacturers can prepare to increase their capacity by using automation.
When large organizations are hit with larger invoices from their overseas suppliers, America’s small and medium-sized manufacturers need to become the go-to suppliers. For this to succeed, these businesses must have access to technology that lets them manufacture as efficiently and quickly as their overseas counterparts.
No enterprise wants to be a dinosaur when it comes to innovation, and today AI is on the frontlines. With an estimated 80 percent of enterprises now using AI in some form, the shift to AI looks as widespread as the transition from typewriters to PCs.
Despite the hype, enterprises ‘sense’ the challenge: in a recent study, 91 percent of companies foresaw significant barriers to AI adoption, including a lack of IT infrastructure and a shortage of AI experts to guide the transition.
Nevertheless, few organizations truly understand what lies ahead of them, and what it really takes to transition out of the AI Jurassic era. Let’s look more closely at the underlying realities of AI adoption that your internal AI group or consultant will never tell you.
The Use Case: Turning a Traditional Enterprise Into an AI-Enabled Organization
To paint a picture, let’s consider a hypothetical company, Global Heavy Industry Corporation (GHIC). Its goal is to reduce costs and improve quality in its production facilities via a corporate-wide deployment of AI.
The company makes industrial machinery, relying on skilled workers to assemble complex machines from parts, with a series of control checkpoints to maintain production quality. At this point, the process is entirely manual.
With the recent rise in AI awareness, coupled with competitive pressure from lower-cost producers, GHIC has set an aggressive roadmap for bringing vision-based AI into its factories by leveraging its existing security camera infrastructure.
The first step? Collecting pertinent data for their models.
Myth No. 1: All the Data I Need for My AI Is Freely Available
The first hurdle GHIC faces is gathering and preparing data for its visual AI. Data is AI’s DNA: neural networks and deep learning architectures depend on deriving a function that maps input data to output data.
The effectiveness of this mapping function hinges on both the quality and quantity of the data provided. In general, a much bigger training set has been shown to enable more effective features in the network, resulting in better performance. In essence, large quantities of high-quality data lead to better AI.
But how do companies go about producing and preparing this data? Collecting and labeling (or annotating) data is commonly the most time-consuming and expensive step in data preparation. Labeling enables a system to recognize categories or objects of interest in the data, and defines the outcome the algorithm should predict once deployed.
In many cases, internal annotation is the only option for enterprises, due to privacy or quality concerns: data may not be allowed to leave the facility, or may require extremely accurate tagging from an expert.
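To make "labeling" concrete, here is a toy sketch of what annotated training data for GHIC's visual quality checks might look like, along with a held-out validation split. The file names, labels, and field names are invented for illustration:

```python
import random

# Hypothetical labeled records for a visual quality-control model: each
# camera frame is annotated with the inspection outcome it should predict.
labeled = [
    {"image": f"checkpoint3/frame_{i:04d}.jpg",
     "label": "defect" if i % 5 == 0 else "ok"}
    for i in range(100)
]

def train_val_split(records, val_fraction=0.2, seed=42):
    """Shuffle and split so validation data stays untouched during training."""
    rng = random.Random(seed)
    shuffled = records[:]
    rng.shuffle(shuffled)
    n_val = int(len(shuffled) * val_fraction)
    return shuffled[n_val:], shuffled[:n_val]

train, val = train_val_split(labeled)
print(len(train), len(val))  # 80 20
```

The split matters because the quality of the mapping function can only be judged on examples the model has never seen.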
Myth No. 2: I Can Easily Hire AI Experts to Build an Internal AI Solution
Once the data is prepared, the second task is to build the initial implementation of the AI system. This is where the next set of challenges lies for GHIC. While there is a plethora of AI tools for developers, AI expertise is nearly impossible to find. By some estimates, there are only around 300,000 AI experts worldwide (22,000 of them PhD-qualified).
Clearly, the demand for AI talent outweighs the supply. Since accelerating AI training is unfeasible — it still requires four years to earn a Ph.D. — the only viable option is to lower the bar to entry by introducing software frameworks that sidestep the need for in-depth knowledge of the field. Otherwise, organizations risk waiting forever for adequate AI talent.
Myth No. 3: I Have a PoC, Building a Final AI Solution Is Just ‘a Bit More Work’
If GHIC finds the internal or external AI resources to implement a Proof of Concept (PoC), it may assume it is only steps away from deploying a final solution.
The truth is, AI adoption requires a multi-step approach. For many organizations, the first step is a PoC. After many years of working in AI, I have seen countless PoCs fall short of production. To prevent wasted time and money, organizations need to set a timeline and define criteria in advance that will determine whether the technology goes into production. A simple benchmark such as “if the PoC delivers X at Y functionality, then we will launch it here and here” would go a long way toward helping enterprises define an actual deployment scenario.
Myth No. 4: When I Get a Good Performance From My AI, I Don’t Need to Touch It Anymore
Let’s assume GHIC gets past all the obstacles above and successfully implements AI. In the long haul, GHIC will be challenged by constantly emerging use cases and changing conditions, and by the need to adapt its AI promptly and inexpensively.
Successful organizations look beyond today and ask how their AI solution can scale over the long run. As AI systems grow more complex, data storage and management, retraining cost and time, and overall AI lifecycle management tools are required to ensure an AI project doesn’t become a mess, or worse, ineffective.
Beyond AI Myths: AI Is Not a One-Off, It Is Here to Stay
GHIC has learned the hard way that AI is not a simple, one-off project. On the contrary, it can become a long, costly endeavor.
To implement AI effectively, enterprises will need to build internal teams mixing engineering, R&D, and product that work closely together in building, testing, and delivering the application, and that will oversee maintaining and iterating on it in the future.
And new tools are allowing many more organizations to adopt AI. By taking back control of their AI strategy, enterprise teams will be able to swiftly build AI solutions, then deploy and evolve them over the AI lifecycle.
Google affiliate Wing Aviation has received federal approval enabling it to make commercial deliveries by drone.
This is the first time a company has gotten a federal air carrier certification for drone deliveries.
The approval from the Federal Aviation Administration means that Wing can operate commercial drone flights in part of Virginia, which it plans to begin later this year.
The FAA stated Tuesday that the company satisfied the agency's safety requirements by engaging in a pilot program in Virginia with the Mid-Atlantic Aviation Partnership and Virginia Tech, and by conducting thousands of flights in Australia over the past several years.
'This is a critical step forward for the safe testing and integration of drones into our economy,' Transportation Secretary Elaine Chao said in a statement. Wing said the approval 'means that we can begin a commercial service delivering goods from local businesses to homes in the United States.'
The company did not name any businesses that would take part in commercial deliveries. It said it plans to spend the next few months demonstrating its technology and answering questions from people and businesses in Blacksburg and Christiansburg, Virginia. Wing said it will 'solicit feedback with the goal of launching a delivery trial later this year.'
Wing said that to win FAA certification it had to show that one of its drone deliveries would pose less hazard to pedestrians than the same trip made in a car. The company said its drones have flown more than 70,000 test flights and made around 3,000 deliveries to customers in Australia.
The company touts a number of benefits of deliveries by electric drones. It says medicine and food can be delivered faster, that drones will be especially useful to consumers who need help getting around, and that they can reduce traffic and emissions.
Drone usage in the U.S. continues to grow swiftly in industries such as utilities, pipelines, and agriculture. But drones have faced more limitations in delivering retail packages and food because of federal regulations that bar most flights over crowds of people and beyond the operator's sight without a waiver from the FAA.
The federal government recently estimated that about 110,000 commercial drones were operating in the U.S., and that number is expected to climb to about 450,000 by 2022. Amazon is working on drone delivery, a project dear to CEO Jeff Bezos. Delivery companies including UPS and DHL have also conducted tests.
Manufacturing facilities are well known for running lean and mean operations. Everyone from CEO-level administrators to floor-level plant managers is searching for ways to raise efficiency, cut waste, and streamline operations. Commonly, much attention is focused on keeping productivity levels high on the plant floor, but this sometimes comes at the expense of administrative office functions, which are overlooked or left stagnant.
Perhaps the leading time and money waster in office administration is the old way of processing employee time and attendance records — especially for bigger facilities whose employees can number in the hundreds, if not thousands.
While many companies have invested heavily in Enterprise Resource Planning systems for time and attendance, in many cases the data that drives these systems is curiously still handwritten on paper and collected piece by piece. Accuracy depends on the employee’s handwriting being legible and timesheets not being lost in transit.
Once the paper timesheets are collected and delivered to accounting, someone must manually key them into other systems for processing. This generally requires a manager to sit at a desk for hours, reviewing time off and entering totals. The process inevitably produces human error—from employees clocking in more than once to forgetting to clock out—which means a large part of each payroll processing day is spent resolving errors. And if a time and attendance issue goes unnoticed until it’s too late, correcting it may not even be possible.
Overall, it’s a time-consuming manual process that is neither efficient nor accurate and leaves virtually no chance of determining the actual costs to produce a product.
The Solution
Today’s cloud-based time and attendance technology enables managers to review and process information through web-based software applications. Besides offering flexibility and convenience, cloud-based technology enables significant time savings, greater efficiency, and information that updates in real time and is available 24/7.
With the cloud, the challenges mentioned above disappear and administrative functions become automated. Through new employee time clocks that use biometrics or proximity badges, employees can clock in and out from different locations while their time data flows into one central online database. Plant managers welcome this innovation because it gives them the flexibility to track their employees’ time and attendance at their convenience from any Internet connection.
Other benefits of cloud-based time and attendance technology include instant access to updated information — including seeing when an employee clocks in at the facility from anywhere in the world with an Internet connection — and time records that integrate easily with major payroll software programs.
Let’s take a closer look at some of the challenges that cloud-based employee tracking software addresses.
Time and Attendance — A fully automated solution that captures “who,” “what,” and “when” through better management, collection, and processing of employee time at the workplace. It eliminates complicated paper clocking systems that result in inaccurate time reports, administrative errors, and increased labor costs. The system also updates automatically, eliminating the legal risk of non-compliance. It is proven to maximize efficiency, save time and money, and pay for itself in as little as a couple of months.
Time Allocation System — A Time Allocation System (TAS) monitors and records employee working time against specific jobs, projects, or tasks by letting employees log in to the exact duty they’re performing. This is a separate function from the Time and Attendance solution described above. TAS data lets managers and supervisors see which employees and departments are using their workday most efficiently, ensuring projects stay on budget and that workers are fairly compensated.
Access Control System — The bigger the factory, the harder it is to know whether the right people are in the right places performing the right duties. An Access Control System allows managers and supervisors to regulate and record the movement of staff, sub-contractors, and visitors in real time, at multiple entry points. It also synchronizes seamlessly with Time and Attendance and TAS.
Employee Self Service — An Employee Self Service system gives on-site and remote workers the flexibility to access their schedules and request time off online, which their managers can quickly approve or deny. Giving employees control of their own time management allows HR and supervisors to focus on their other administrative duties. Real-time data is captured in the cloud and can be accessed by employees and managers alike.
Clocking-In Options — Today’s manufacturing facility needs convenient timekeeping systems for a modern workforce. Options include web-based clocking, where workers register on the database and manage their time via the web; a biometric time clock that records information at the touch of a finger on a biometric scanner; text-alert clocking that lets employees text their attendance time to a virtual mobile phone number; landline clocking, ideal for mobile and remote workforces; mobile clocking that lets employees clock in or out and onto tasks or projects from a mobile device; and a desktop biometric reader that provides the full functionality of a clocking terminal at a fraction of the cost.
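Whatever the clocking method, the back end reduces to pairing clock-in and clock-out events into worked hours and flagging the anomalies described earlier, such as duplicate punches and missing clock-outs. A simplified sketch, with an invented event format:

```python
from datetime import datetime

def summarize_shift(events):
    """Pair in/out punches into worked hours; flag duplicates and missing outs."""
    hours, anomalies, open_in = 0.0, [], None
    for kind, stamp in events:
        t = datetime.fromisoformat(stamp)
        if kind == "in":
            if open_in is not None:
                anomalies.append(f"duplicate clock-in at {stamp}")
            else:
                open_in = t
        else:  # "out"
            if open_in is None:
                anomalies.append(f"clock-out without clock-in at {stamp}")
            else:
                hours += (t - open_in).total_seconds() / 3600
                open_in = None
    if open_in is not None:
        anomalies.append("missing clock-out")
    return hours, anomalies

# An employee who double-punched in the morning and forgot to clock out:
events = [("in", "2019-05-01T08:00"), ("in", "2019-05-01T08:02"),
          ("out", "2019-05-01T12:00"), ("in", "2019-05-01T12:30")]
print(summarize_shift(events))
```

An automated system runs checks like these the moment an event arrives, which is exactly what makes same-day correction possible instead of a payroll-day scramble.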
To sum it up…
A state-of-the-art time and attendance system is now a necessity for every major industry, including manufacturing, energy, retail, government, healthcare, and pharmaceuticals.
These highly sophisticated, comprehensive solutions are straightforward to integrate and use, and they provide easily accessible management and reporting tools. They also appeal to today’s younger employees, who generally want the ability to work remotely and manage their own schedules.
An Arizona-based semiconductor supplier will buy GlobalFoundries' computer chip manufacturing plant in the Hudson Valley, creating 150 new jobs and preserving hundreds of others as part of a $720 million expansion plan, Gov. Andrew Cuomo announced Monday.
The Democrat said in a statement that Phoenix-based ON Semiconductor will establish its first 300mm chip manufacturing plant at GlobalFoundries' Fab 10 facility in East Fishkill, 60 miles (96 kilometers) north of New York City.
ON Semiconductor has committed to creating 150 new jobs in East Fishkill and maintaining a minimum of 950 existing jobs at the facility, Cuomo said. 'We're glad ON Semiconductor is choosing to deepen its roots here and its commitment to New York State,' he said. The deal calls for GlobalFoundries to transfer manufacturing from Fab 10 to its other facilities, including Fab 8 in Malta, New York, 100 miles (161 kilometers) north of East Fishkill.
The state has offered ON Semiconductor up to $17.5 million in grants for the purchase of the Fab 10 property, plus another $22.5 million in tax credits over 10 years if the company meets its job creation and investment commitments.
'The increased level of capability and capacity enabled by this expansion is our answer to meet critical market needs,' Keith Jackson, president and CEO of ON Semiconductor, said in a statement.
After a three-year transition period, ON Semiconductor will take ownership of the East Fishkill facility, which GlobalFoundries purchased from IBM in a $1.5 billion deal announced in 2014. Under the terms of the deal announced Monday, ON Semiconductor will pay $100 million at the signing of the agreement with GlobalFoundries and $330 million at the end of 2022.
Last summer, GlobalFoundries said it was cutting about 400 jobs at its Fab 8 plant in Malta as it restructured operations and moved away from cutting-edge chip technology. GlobalFoundries is owned by the government-backed Mubadala Development Company in Abu Dhabi, United Arab Emirates.
The high-tech industry deal was the second major economic development win announced by the Cuomo administration in the last four days. On Thursday, the third-term governor announced that Netflix was expanding its corporate offices in Manhattan and building six sound stages in Brooklyn, where thousands of production jobs will be located.
That announcement came two months after Amazon backed out of a deal that could have brought more than 25,000 new jobs to Queens. The internet retailer ditched New York on Valentine's Day, citing fierce backlash from some local politicians opposed to the $3 billion in city and state tax breaks offered to Amazon.
For facility operations managers, initiating a 5S lean manufacturing program can reduce waste and boost productivity through better workplace organization. With each S (Sort, Set in order, Shine, Standardize, and Sustain) representing a separate stage, the process can seem daunting, but it does not have to be.
“You don’t have to get the 5S process perfect right away,” says Tina Huff, Group Product Manager at Avery Products Corporation, a commercial and industrial label manufacturer. “Just get started and keep improving over time.”
Because nothing is set in stone, using industrial labels for organization and identification that are designed to be applied, removed, and customized can help at every stage of the process.
Sort
The first 5S stage, Sort, purges clutter and unneeded items from the work area. This leaves only the parts, tools, machines, and supplies essential for daily use on the manufacturing or warehouse floor. In this stage, temporary labels can be applied to items as the facility’s staff sorts through them.
“One method is to mark items as TBD, and then wait a month or so,” says Huff. “When you use an item, consider how it’s used or who uses it to determine its permanent home. If you haven’t used it in that time, consider discarding it.”
For best results, use labels that won’t fall off during the Sort stage but that won’t cause damage or leave behind residue when you need to remove them.
Set in Order
The second stage, Set in order, positions parts, people, tools, and equipment in the most efficient, ergonomic locations, so operators do not waste time or effort searching for needed items.
To increase workplace productivity, identification labels can be used to identify and classify parts, tools, and equipment, so items are readily accessible and everything has a home. Racks, shelves, and cabinets also make sense to label, as do smaller portable items like bins, totes, and toolboxes, which help with organization.
Color-coded labels can add another level of organization. With labels printed on a laser or inkjet printer, it is simple to include color, icons, or even photos. This makes it easier to quickly identify items and determine where they belong.
Shine
The third stage, Shine, keeps the workplace free of clutter, grime, and malfunction, helping to prevent serious work breakdowns or slowdowns. In this stage, cleaning and inspecting while cleaning are critical. However, labels that are not durable can become torn or otherwise unreadable, a particular problem with barcode labels.
“During inspection, it’s good to replace any damaged or inaccurate labels,” says Huff. Also, keep track of machine maintenance with inspection labels so routine maintenance isn’t forgotten. This helps prevent costly downtime and improves safety by reducing malfunction-related accidents. Be sure there's a process in place so whoever is responsible knows which machines to inspect and how frequently. A good inspection label includes areas for marking the date and the person who conducted the inspection.
To perform the Shine stage reliably, however, it is important to use ID labels designed to withstand daily wear and tear, dirt, grease, oil, chemicals, and washdowns while maintaining good barcode readability.
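The inspection-tracking idea above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's software: the machine names, dates, and intervals are made up for the example.

```python
# Hypothetical sketch: record the last inspection date per machine (as an
# inspection label would) and flag any machine whose routine check is overdue.
from datetime import date, timedelta

last_inspected = {"press-01": date(2024, 1, 10), "lathe-02": date(2024, 3, 1)}
interval = {"press-01": timedelta(days=30), "lathe-02": timedelta(days=90)}

def overdue(today):
    """Return machines whose next scheduled inspection date has passed."""
    return [m for m, d in last_inspected.items() if d + interval[m] < today]

print(overdue(date(2024, 3, 15)))  # ['press-01']
```

A schedule like this, kept alongside the physical inspection labels, makes it obvious who needs to check which machine and when.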
Standardize
The fourth stage, Standardize, codifies the most effective work methods into clear standards. Schedules, checklists, and standard operating procedures are an integral part of this process.
Without standard operating procedures and clear processes, a facility will not run consistently and properly, even if everything is labeled accurately. Wherever possible, it is useful to place procedures and checklists on labels posted near work areas.
Sustain
The fifth stage, Sustain, trains and maintains company standards and procedures until they become habit and are followed consistently. Because 5S is a continuous process, organizations will reorganize or improve processes throughout the year, as well as accommodate changes in data, format, and regulation.
In this case, being able to easily remove old labels and print custom, updated ones is important. Otherwise, employees can spend time scraping old labels off, using heat guns or even razor blades. They may hesitate to update labels if the old ones are difficult to remove, and so make do with substandard labeling.
The ideal solution is industrial labels for organization and identification that adhere easily, are resilient enough to endure harsh conditions, yet remove cleanly when necessary, leaving no trace.
Available in a variety of sizes, they print easily on standard laser or inkjet printers, enabling custom, do-it-yourself labels for signs and identification. Avery’s free online Design and Print Software allows customizable printing using OSHA/ANSI-compliant and 5S templates. Employees can create and print their own informal, official, or compliant labels from pre-designed templates, or build them step by step at their desks. The process is intuitive, since it resembles creating an office document.
To accommodate warehouse settings, the software’s barcode generator makes it easy to add text, graphics, serialized numbers or barcodes in a few steps. The combination of a bright white label material with superior ink/toner anchorage further enables accurate barcode scanning, even at extended distances.
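The serialized-numbering idea is simple enough to sketch generically. The snippet below is an illustration only, not the software's actual API; the prefix, starting number, and padding width are assumptions chosen for the example.

```python
# Hypothetical sketch: generate zero-padded serialized label text for a batch
# of bins or totes, the kind of sequence a label barcode generator produces.

def serialized_labels(prefix, start, count, width=4):
    """Return a list of serial label strings like 'BIN-0001'."""
    return [f"{prefix}-{n:0{width}d}" for n in range(start, start + count)]

print(serialized_labels("BIN", 1, 3))  # ['BIN-0001', 'BIN-0002', 'BIN-0003']
```

Fixed-width, zero-padded serials keep labels sortable and make barcode lookups unambiguous as the facility grows.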
With all the equipment, supplies, racks etc. used in industrial settings, the ability to conveniently print new 5S sign and identification labels in minutes, while being able to cleanly remove the old, will go a long way toward keeping facilities efficiently up to date without the hassle, mess, or cost.
Led by Katsumasa Tanaka, a senior climate risk researcher at the National Institute for Environmental Studies in Japan, the study examined global scenarios for transitioning from coal to gas using a novel approach that, for the first time, applied metrics developed for climate impact assessments to the coal-gas debate. Focusing on the world's largest power generators — China, Germany, India, and the United States — the study evaluated the impacts of the many direct and indirect emissions of such a shift on timescales ranging from a few decades to a century.
'Many previous studies were somewhat ambivalent about the climate benefits of the coal-to-gas shift,' said Tanaka. 'Our study makes a stronger case for the climate benefits that would result from this energy transition, because we carefully chose metrics to evaluate the climate impacts in light of recent advances in understanding metrics.'
'Given the current political situation, we deliver a much-needed message to help facilitate the energy shift away from coal under the Paris Agreement,' Tanaka said. 'However, natural gas is not an end goal; we consider it a bridge fuel on the road to greener forms of energy in the long run as we move toward decarbonization.'
Concerns about methane leakage from natural gas have been hotly debated, particularly in the United States, given the increasing use of fracking over the past decade. Recent scientific efforts have advanced understanding of the degree of methane leakage in the United States, but the potential effects of methane leakage remain highly uncertain in the rest of the world.
'Our conclusion that the benefits of natural gas outweigh the possible risks is robust under a broad range of methane leakage, and under uncertainties in emissions data and metrics,' Tanaka said.
This research was partially supported by the Environment Research and Technology Development Fund (2-1702) of the Environmental Restoration and Conservation Agency in Japan, with additional support from the Institute for Advanced Sustainability Studies in Germany and the Research Council of Norway.
Multiple Metrics to Simultaneously Examine Short- and Long-Term Climate Impacts
Emissions metrics, or indicators used to evaluate the climate impacts of different emission types, are useful tools for gaining insight into climate impacts without the need for climate model runs.
These metrics work like weighting factors when calculating CO2-equivalent emissions from the emissions of a variety of greenhouse gases. However, the resulting climate impacts estimated through CO2-equivalent emissions are sensitive to the specific metrics chosen.
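The weighting-factor idea can be made concrete with a small sketch. The global warming potential (GWP) values for methane are the IPCC AR5 figures (28 over 100 years, 84 over 20 years); the plant emission amounts are invented purely for illustration and do not come from the study.

```python
# Illustrative sketch of CO2-equivalent accounting: emissions of each gas are
# weighted by a metric (here, GWP at a chosen time horizon) and summed.

GWP = {
    "CO2": {"20yr": 1.0, "100yr": 1.0},
    "CH4": {"20yr": 84.0, "100yr": 28.0},  # methane is far more potent short-term
}

def co2_equivalent(emissions, horizon):
    """Sum emissions (tonnes) weighted by the metric for the given horizon."""
    return sum(GWP[gas][horizon] * tonnes for gas, tonnes in emissions.items())

gas_plant = {"CO2": 1000.0, "CH4": 5.0}  # tonnes, hypothetical
print(co2_equivalent(gas_plant, "100yr"))  # 1140.0
print(co2_equivalent(gas_plant, "20yr"))   # 1420.0
```

The same physical emissions score quite differently under the two horizons, which is exactly why the choice of metric can flip a coal-versus-gas comparison.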
'Because the outcome can strongly depend on which metrics are chosen and applied, there is a need for careful reflection about the meaning and implications of each specific choice,' said Francesco Cherubini, a professor at the Norwegian University of Science and Technology. 'Each emission type elicits a different climate system response. The diverging outcomes in previous studies may well stem from the type of metric that was chosen.'
The study combined a number of metrics to address both short- and long-term climate impacts in parallel. It found that natural gas power plants have smaller short- and long-term impacts than coal power plants, even when high potential methane leakage rates, a full array of greenhouse gases and air pollutants, and uncertainty issues are considered.
Regional Differences
To ensure that possible regional disparities were taken into account, the study compared global metrics with regional metrics to examine impacts more precisely.
'We considered a suite of so-called short-lived climate pollutants (SLCPs), such as SOx, NOx, and black carbon, that can be emitted from these plants,' said Bill Collins, a professor at the University of Reading, in the United Kingdom. 'This required a regional analysis because climate impacts from SLCPs depend on where they are emitted, due to their short lifetimes in the atmosphere.'
Future Directions and Policy Relevance
The study by Tanaka and coauthors is part of an emerging body of literature that reaffirms the need to phase out coal in order to curb rising global temperatures and slow or reverse the negative impacts of climate change.
Future work could consider supply chains and trade within and across nations and other environmental factors, in addition to improving the consistency of metrics for comparing climate impacts.
'Air quality is not part of our analysis, but including it would likely strengthen our conclusion,' said Tanaka. 'Other environmental effects, such as drinking water contamination and induced seismic activity, could also add important dimensions to the debate.'
When financing industrial or manufacturing equipment, there are a number of pitfalls to avoid. Some of the “gotchas” in equipment financing and leasing contracts are outright swindles, while others are more subtle. Understanding the potential financing risks can help you avoid a costly and unpleasant experience when acquiring equipment.
One pitfall to be aware of is “interim” or “pro-rata” rent on prefunding. Pro-rata rent means payments made ahead of a lease or finance commencement date. Pro-rata rent works much like rent on an apartment, which is due on the first of the month: a renter who moved in on the 15th of the month would be charged for half a month of rent.
Imagine an order for $2 million worth of production line equipment. Sellers in most cases will not start work before receiving the first progress payment (often 25-50 percent of total cost). Also assume it will take 15 months between the first progress payment and the delivery, installation, and inspection of a completed production line. Finally, assume approval for an 84-month term with payments of $28,000 monthly.
When a lender makes a payment in advance of delivery, the industry term is “pre-funding.” Many equipment leases stipulate that pro-rata payments must be made for the period between pre-funding and the delivery, installation, and inspection of the completed order. In the above scenario, the interim period could span 15 months. During that time, 15 payments of $28,000 may be imposed, and those payments do not reduce the principal balance of the equipment lease or finance contract. In the above example, pro-rata rent could represent $420,000 in unplanned finance charges. This is why equipment lease companies will gladly offer to pre-fund progress payments to an equipment dealer: those pro-rata payments are almost pure profit.
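The interim-rent arithmetic in this scenario is worth laying out explicitly. The figures below come straight from the example in the text ($28,000 monthly payment, 15 months of interim, a 25 percent progress advance on a $2 million purchase).

```python
# Sketch of the interim ("pro-rata") rent arithmetic from the example above.
monthly_payment = 28_000
interim_months = 15

interim_charges = monthly_payment * interim_months
print(interim_charges)  # 420000 -- none of which reduces principal

# Negotiating interim on only the 25% progress advance ($500,000 of the
# $2 million) scales the charge proportionally.
negotiated = interim_charges * 500_000 / 2_000_000
print(negotiated)  # 105000.0
```

Seen this way, negotiating interim down to the advance amount alone saves $315,000 before any other concession.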
Large interim payments are a common occurrence in “non-bank” equipment financing. Knowing the dangers of interim payments in advance gives a company options to minimize or negotiate away unnecessary finance charges. For example, one can negotiate up front that interim rent is to be paid only on the advance amount (i.e., on a 25 percent progress payment toward a $2 million equipment purchase, pro-rata rent can be negotiated to be paid on the $500,000 advance rather than the entire $2 million). Alternatively, short-term credit lines might be a way to fund progress payments. In some cases, qualified buyers can negotiate with equipment lenders to have pro-rata payments removed or significantly reduced.
Another lure to be mindful of involves equipment financing or equipment leases with a quarterly payment. These transactions can also carry “pro-rata” language in their contracts, which is commonly abused by unscrupulous lenders. Some lenders create lease commencement dates every business day of the year; this allows them to collect up to 89 days of interim rental payments regardless of the delivery date of the equipment. Going back to the example above with $2 million worth of equipment, slipping an extra 89 days of “rent” into a contract allows the leasing company to collect an additional $83,066 in payments without creating any actual value whatsoever.
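The $83,066 figure can be reproduced from the same $28,000 monthly payment. A 30-day-month convention (an assumption, but one consistent with the figure in the text) converts the monthly payment to a daily rate.

```python
# Sketch of the 89-day quarterly "interim rent" arithmetic from the example.
monthly_payment = 28_000
daily_rate = monthly_payment / 30      # assumed 30-day-month convention

extra_rent = int(daily_rate * 89)      # 89 days of interim rent, whole dollars
print(extra_rent)  # 83066
```

Because the commencement date, not the delivery date, starts the clock, the lender pockets nearly three months of extra rent on a payment schedule the buyer never benefits from.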
Evergreen lease clauses can also become a problem for many businesses. Many equipment lease contracts are “lease to own,” meaning ownership transfers immediately upon the last payment. Other lease contracts are written as a “lease with an option to own”: after making the final payment, the company may purchase the equipment or return it. However, hidden deep within some contracts is language stipulating that notice of intent to purchase must be given between 90 and 180 days prior to the end of the lease; failure to provide such notice can trigger an automatic 12-month extension of the lease. Back to our $2 million production line: that extension can represent an additional $336,000 in payments. Equipment lease companies often do not notify customers of upcoming lease expirations. There have been reports of companies that made years of additional payments because they were unaware that their leases had “rolled over.”
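The evergreen-clause cost and the notice window it hinges on are easy to sketch. The monthly payment and 12-month extension come from the example in the text; the lease-end date is a made-up placeholder.

```python
# Sketch of the evergreen-clause arithmetic: missing the purchase-notice
# window triggers an automatic 12-month extension at the same payment.
from datetime import date, timedelta

monthly_payment = 28_000
rollover_cost = monthly_payment * 12   # cost of one automatic extension
print(rollover_cost)  # 336000

lease_end = date(2026, 6, 30)          # hypothetical lease expiration
notice_opens = lease_end - timedelta(days=180)
notice_closes = lease_end - timedelta(days=90)
print(notice_opens, notice_closes)     # the window in which to give notice
```

Putting those two dates on a calendar at signing time is a cheap defense against a six-figure rollover.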
All of the above financing and leasing traps can often be avoided by carefully reading any equipment financing or leasing contract. It is wise to have an attorney review contracts prior to signing; this is especially true for equipment purchases exceeding $150,000. A further recommendation is to have an attorney who specializes in equipment leasing review the contracts, as they will be familiar with the common problems companies run into when financing equipment. These simple steps can save a company thousands when financing equipment.
In view of recent fires and loss of life in refurbished buildings, Cintec™ has been acutely aware of the danger of fire and has consistently developed fire-resistant remedial anchors and reinforcement.
In recent years, fire has been a growing concern when buildings are retrofitted, restored, or modernized. On June 14th, 2017, a fire broke out in the 24-storey Grenfell Tower, widely reported as the worst residential fire in the U.K. since the Second World War, attributed to failures in design.
As fears over fire safety spread through the country, Prime Minister Theresa May ordered a public inquiry. As the final report of the review of building regulations and fire safety notes, England is by no means alone in needing to improve building safety; building regulations are a global concern.
Under the terms of the EC Construction Products Directive, resistance to fire is one of the essential requirements for which performance tests are required. In Europe, thin-joint mortars have become popular, so there are a range of situations where fixings are made using organic polymers, either as the tie body or in the form of resin glues. Such ties are not inherently fire-resistant and could fail, shortening the life of a cavity wall in a fire or leading to the collapse of cladding, endangering escaping occupants and firefighters.
As a leader in functional preservation, Cintec™ has long been an advocate for the restoration of safe buildings through fire-resistant anchors. As long ago as 1993, fire tests were carried out by the internationally recognized Building Research Establishment in accordance with BS476, ISO, and CEN standards. The fire rig was developed to measure the performance of Cintec's™ anchors in a fire situation while subjected to a mechanical load such as might result from wind suction or fire-induced thermal movement.
Cintec's™ remedial anchors survived a two-hour test without failure of any of the samples. Each sample reached several hundred degrees in the part of the anchor nearest the fire face. This suggests the anchor system is suitable for repair work on buildings requiring a fire rating of up to two hours.
Cintec™ anchors were later, unintentionally, put to the test at the Fullers Brewery in London. The Cintec™ anchor system had been used extensively to repair and restore the brewery's facade. A severe fire followed, destroying large sections of the building. Despite the brickwork being subjected to exceptionally high temperatures, tests revealed that the cementitious Cintec™ anchors did not fail, performing to their original design. They retained their integrity and could be reused for the repair work. Had the anchors been an epoxy or resin type, they would have melted, releasing potentially harmful fumes in the process, and could easily have been pulled out, allowing the wall to collapse. One could say that the Cintec™ anchors, having survived the fire, are in fact fireproof.
In New York City, according to the NYC Buildings Department, adhesive anchors are not permitted to support fire-resistance-rated construction unless their use meets the conditions set forth in the acceptance criteria. Post-installed anchors in masonry must be installed in accordance with the NYC Construction Codes, which specify the masonry substrate type and condition as well as proof of pull tests.
Howard Zimmerman Architects, the well-known New York firm, had concerns about the lack of fire ratings for resin-based anchor systems in high-rise apartments near Central Park, New York. After analyzing fire test data and performance tests on a building severely damaged by fire, it was determined that the Cintec™ system was the best anchor to address the project engineers' concerns.
Cintec™ anchors are the cementitious, fire-resistant alternative to resin anchors. Because Cintec's™ anchors are based on proven restoration materials, cementitious grout and stainless steel, they readily provide the fire rating typically absent in other systems.