
Fictron Industrial Supplies Sdn Bhd
No. 7 & 7A,
Jalan Tiara, Tiara Square,
Taman Perindustrian Sime UEP,
47600 Subang Jaya,
Selangor, Malaysia.
+603-8023 9829
+603-8023 7089
Fictron Industrial Automation Pte Ltd

140 Paya Lebar Road, #03-01,
AZ @ Paya Lebar 409015,
Singapore.
+65 31388976
sg.sales@fictron.com

Latest News

NEC Face Recognition Technology Ranks First in NIST Accuracy Testing

Oct 11, 2019
NEC Corp claims that its face recognition technology achieved the highest matching accuracy in the Face Recognition Vendor Test (FRVT) 2018 conducted by the U.S. National Institute of Standards and Technology (NIST), with an error rate of 0.5% when registering 12 million people.
 
NEC’s technology was rated No. 1 in NIST testing for the fifth time, following its top placement in the face recognition testing for video in 2017. The strength of NEC’s technology is evident in the test results, which placed the company dramatically ahead of the runner-up.
 
In the past few years, the rising convenience of biometric authentication technology, improved security awareness, and the remarkable development of artificial intelligence (AI) have driven companies in many countries to begin adopting biometric authentication technology. In particular, the use of face recognition technology is expanding rapidly across an array of fields throughout the world. Face recognition technologies are now in use in areas that require high reliability, convenience, and long-term use, such as identity verification for national infrastructure, transaction settlements, bank account opening, and passport verification.
 
Forty-nine organizations, including companies from the United States, China, Russia, Europe, and Japan, participated in NIST’s FRVT 2018, where face recognition accuracy was evaluated. These tests are among the most rigorous and fair benchmarks implemented by NIST, as each organization is required to submit, and be evaluated on, programs developed during the same period. By performing multi-stage matching, NEC achieved an impressive search speed of 230 million matches per second. On top of that, by leveraging its deep learning methods to considerably reduce the identification error rate, NEC accurately matched images of a subject taken over a 10-year interval with an error rate four times lower than the runner-up’s.
 

How to Detect a Cyber Attack Against Your Company

Oct 11, 2019
If your manufacturing facility were targeted by a cyber criminal, would you be able to recognize the threat? Or, if an employee were doing something malicious, like diverting payments into a personal account, would you be able to detect the activity? Fast detection is key to successfully containing any fallout from an information breach. To respond quickly to a cyber attack, you must first have the right mechanisms in place to uncover the threat.
 
Install & Update Anti-virus & Other Cybersecurity Programs
 
If you have not already installed anti-virus, anti-malware, and anti-spyware software on every device at your manufacturing facility, now is the time. Install, use, and continuously update these cybersecurity measures on every computer, tablet, and smartphone.
 
These mechanisms can help shield your company’s important data and information from malware, the catch-all term for malicious code. Written with the purpose of stealing from or causing harm to information systems, malware includes viruses, spyware, and ransomware. Malicious code can not only steal your computer memory; it can also enable a cyber criminal to record your computer actions and access sensitive information.
 
To get the most from your anti-malware programs, set the software to automatically check for updates at least once daily, or in real time, if available. Configure it to run a complete scan after each daily update.
 
A typical set of business anti-malware settings might include:
- Running anti-virus programs daily or nightly, such as at midnight
- Scheduling a virus scan to run about half an hour later (12:30 a.m.)
- Following up by running anti-spyware software a couple of hours later, such as at 2:30 a.m.
- Running a full system scan shortly afterward (3:00 a.m.)
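As a sketch, the staggered schedule above can be expressed in a few lines of Python. The task names and minute offsets are illustrative stand-ins; in practice the scheduling would be done in your anti-malware console or the operating system's task scheduler:

```python
from datetime import datetime, timedelta

# Illustrative staggered schedule mirroring the example above: each task
# starts after the previous one has had time to finish, so only one
# activity runs at any given time.
def staggered_schedule(anchor="00:00", offsets_min=(0, 30, 150, 180)):
    """Return (task, start time) pairs offset from a daily anchor time."""
    base = datetime.strptime(anchor, "%H:%M")
    tasks = ("signature update", "virus scan", "anti-spyware scan", "full scan")
    return [(task, (base + timedelta(minutes=m)).strftime("%H:%M"))
            for task, m in zip(tasks, offsets_min)]

for task, start in staggered_schedule():
    print(start, task)   # 00:00, 00:30, 02:30, 03:00, one task per slot
```

Adjusting the anchor time shifts the whole sequence while preserving the gaps between tasks.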
 
This example assumes that a facility normally has a running, high-speed Internet connection for all devices. The timing of your updates and scans may differ, but you need to run them daily. Be sure to schedule them so that only one activity takes place at any given time. For home-based employees or employees’ personal devices, ensure that they have copies of, or access to, the same anti-virus and anti-spyware software, and ask them to run regular updates per the previous example.
 
It's vital that all employees understand why running anti-virus, anti-malware, and anti-spyware software is essential to protecting company information and assets. Employees must also realize how early detection could potentially save the company from serious consequences associated with a cybersecurity incident or breach.
 
For redundancy, it's a good idea to use two anti-virus solutions from different vendors. Using anti-malware protection from two different providers can increase your chances of detecting a virus. Routers, firewalls, and Intrusion Detection and Prevention Systems (IDPS) usually have some anti-virus capabilities, but you don’t want to rely on them exclusively to protect your network.
 
Understand that anti-virus solutions can only find known viruses. If a new virus is developed and deployed, your anti-virus may not be able to recognize it. It is essential to keep your anti-virus solutions up to date in order to detect the latest viruses.
 
Maintain & Monitor Detection Logs
 
Most malware protection and detection hardware or software is built with logging capability. Check your user manual for instructions on how to:
- Use your logs to identify suspicious activity
- Maintain regular log records that are valuable in an investigation
- Back up logs regularly and save them for at least a year (although some types of information may need to be stored for longer)
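As a toy illustration of the kind of red flag a log review can surface, the Python sketch below counts detections per host in a hypothetical log. The line format is an assumption for the example; real anti-malware logs vary by vendor:

```python
from collections import Counter

# Hypothetical detection-log lines: "timestamp  host  event".
SAMPLE_LOG = """\
2019-10-01T02:31:00 ws-07 virus-detected
2019-10-01T02:31:05 ws-07 virus-detected
2019-10-02T02:30:40 ws-07 virus-detected
2019-10-02T02:32:10 ws-12 virus-detected
"""

def flag_hosts(log_text, threshold=3):
    """Return hosts whose virus-detection count reaches the threshold."""
    counts = Counter(line.split()[1]
                     for line in log_text.splitlines()
                     if line.endswith("virus-detected"))
    return {host: n for host, n in counts.items() if n >= threshold}

print(flag_hosts(SAMPLE_LOG))   # {'ws-07': 3} - one machine keeps triggering
```

A recurring hit count on a single machine, as here, is exactly the kind of trend worth escalating.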
 
For added assurance, consider engaging a cybersecurity professional to review your logs for red-flag trends, such as an unusually large amount of time spent on a social media site or a high frequency of viruses consistently found on a single computer. Such activity may indicate a serious information security problem that requires stronger protection.
 

A Machine Learning Classifier Can Spot Serial Hijackers Before They Strike

Oct 11, 2019
How would you feel if, whenever you had to send very sensitive information somewhere, you relied on a chain of people playing the telephone game to get that information where it needed to go? Sounds like a terrible idea, right? Well, too bad, because that is how the Internet works.
 
Data is routed through the Internet’s many metaphorical tubes using what's called the Border Gateway Protocol (BGP). Any data moving over the Internet needs a physical path of networks and routers to make it from A to B. BGP is the protocol that moves information along those paths - though the downside, just as in a game of telephone, is that each junction in the path knows only what it has been told by its immediate neighbor.
 
Since a junction in a route knows only where the data it’s transmitting just came from and where it’s headed next, it’s relatively simple for someone to step in and divert the data. At these junctions, autonomous systems establish BGP connections. Like a party pooper intentionally ruining a game of telephone by whispering a totally different phrase than the one told to them, a hacker can insert their own autonomous system to reroute information. The worst offenders are serial hijackers, who repeatedly divert data to skim information or enable distributed denial-of-service (DDoS) attacks. In 1998, hackers testified before the U.S. Congress that a dedicated attacker could take down the Internet in 30 minutes by exploiting BGP.
 
Over the years, serial hijackers have proven difficult to stop. One recent example was Bitcanal, a Portuguese web hosting firm that spent years helping serial hijackers carry out their attacks. It took years of coordinated effort from legitimate service providers to shut down Bitcanal, and meanwhile, many other serial hijackers still roam the Web. What’s worse, serial hijackers must, as the name suggests, launch several attacks before it becomes clear that they're bad-faith actors.
 
“BGP [hacking] is one way to sniff at traffic, or steal traffic,” says Cecilia Testart, a graduate student at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL). “Given that the Internet is becoming more and more critical, we should try and prevent these attacks.”
 
Testart is the lead author on a paper posted today [PDF] by researchers at CSAIL and the Center for Applied Internet Data Analysis (CAIDA). They suggest that machine learning can be used to proactively stop serial hijackers. Serial hijackers, the researchers argue, exhibit characteristic traits that make them stand out from ordinary network providers. They show that machine learning could identify serial hijackers far faster than the standard method of flagging them only after multiple attacks.
 
The team used a machine learning technique called an extremely-randomized trees (extra-trees) classifier. In one test, the classifier flagged 934 of the 19,103 autonomous systems it examined as potential serial hijackers. You can think of an extra-trees classifier as a forest of decision trees, where each tree casts a vote of confidence - for instance, on whether someone is a serial hijacker - based on a randomized subset of the available information.
 
The resulting forest represents a consensus. If most trees have reached the conclusion that someone is a serial hijacker using the limited information available to them, then you probably have one on your hands. Testart says extra-trees classifiers and other forest classifiers do not have the same bias toward a set of training data that a technique such as deep learning may have. Because the available data on known serial hijackers is so small, deep learning techniques might have skewed toward uncovering only hijackers most similar to known attackers, missing those that differ more.
 
Of course, for individual trees to cast a vote, they have to know what they are looking for. The research group identified several ways in which serial hijackers differ from legitimate network providers that routinely route Internet traffic. For example, legitimate providers tend to be online more consistently, as they are providing Internet service to genuine customers. Serial hijackers, by contrast, tend to be online only while they are skimming data.
 
Serial hijackers also usually have more diversified Internet Protocol (IP) blocks - essentially the street addresses of the Internet. Testart explains that an institution like MIT normally has a block of consecutive IP addresses that it uses. Hijackers, however, tend to pick up small strings of IP addresses as other users abandon them. One user with a bizarre assortment of IP blocks, therefore, is more likely to be a serial hijacker.
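A minimal sketch of the idea using scikit-learn's ExtraTreesClassifier, with synthetic stand-ins for the two traits just described (fraction of time online and number of distinct IP blocks). The numbers are invented for illustration and are not the paper's actual features or data:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(0)

# Legitimate networks: almost always online, a few contiguous IP blocks.
legit = np.column_stack([rng.uniform(0.9, 1.0, 200), rng.integers(1, 5, 200)])
# Hijacker-like networks: intermittently online, many scattered IP blocks.
hijack = np.column_stack([rng.uniform(0.1, 0.6, 20), rng.integers(10, 40, 20)])

X = np.vstack([legit, hijack])
y = np.array([0] * 200 + [1] * 20)

# Each extremely-randomized tree picks its split thresholds at random;
# predict_proba averages the trees' votes into a consensus score.
clf = ExtraTreesClassifier(n_estimators=100, random_state=0).fit(X, y)
score = clf.predict_proba([[0.3, 25]])[0][1]   # fraction of trees voting "hijacker"
print(round(score, 2))
```

A consensus score like this, tracked over time, is one way the reputation idea Testart describes could work in practice.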
 
These rules aren't set in stone. Testart notes there are times when a recognized network provider could go offline - for example, during an earthquake or blackout. Fat-finger errors can also lead to typos and misconfigurations that make a legitimate provider look suspicious at first glance. Testart says there is still plenty to be done with the work the research team has published so far. She suggests that an extra-trees classifier like the one the group developed could give network operators a sort of reputation score, so that serial hijackers would see their reputations drop quickly as they went about their nefarious business.
 
The other alternative is to update BGP and turn the telephone game into something more secure. But Testart doesn’t think that’s likely. “The Internet is a huge network,” she says. “It’s running on infrastructure set up many years ago. If you update a major protocol, you need to update all that infrastructure.” Think about the headache of trying to get every network provider in the world to agree to change a protocol - it is far easier to build a tool that can sniff out serial hijackers.
 

Will the U.S. Issue Licenses for Supply of Non-Sensitive Goods to Huawei?

Oct 10, 2019
The United States will soon issue licenses authorizing some U.S. companies to supply non-sensitive goods to China’s Huawei, the New York Times said on Wednesday, as high-level officials from the two countries meet this week to continue trade talks.
 
Huawei Technologies Co Ltd, the world’s biggest telecoms gear maker, has been on a U.S. trade blacklist since May, when trade talks between Washington and Beijing broke down. The United States says the company can spy on users, which Huawei denies. The blacklisting prevents Huawei from buying parts and components from U.S. companies without U.S. government approval, constraining its access to essential technologies such as Google Mobile Services.
 
U.S. companies can seek a license to have particular products exempted from the ban. The U.S. Commerce Department has received more than 130 applications from companies for licenses to sell U.S. goods to Huawei, Reuters reported in August. Government officials advised U.S. companies to apply for licenses following U.S. President Donald Trump’s pledge of relief, saying exports to Huawei of non-sensitive items that are readily replaced by foreign competitors would be permitted.
 
Trump's administration gave the green light last week to start approving licenses for some American companies to bypass the curbs, the New York Times said (nyti.ms/35pED2e), citing people familiar with the matter. A U.S. Commerce Department spokesman told Reuters that no official guidance had been given to the department on the matter as of Wednesday afternoon.
 

Samsung Develops 12-Layer 3D-TSV Chip Packaging Technology

Oct 10, 2019
Samsung Electronics, a global leader in advanced semiconductor technology, today announced that it has developed the industry’s first 12-layer 3D-TSV (Through Silicon Via) technology. Samsung’s new innovation is regarded as one of the most challenging packaging technologies for mass production of high-performance chips, as it requires pinpoint accuracy to vertically interconnect 12 DRAM chips through a three-dimensional configuration of a little over 60,000 TSV holes, each of which is one-twentieth the thickness of a single strand of human hair.
 
The thickness of the package (720μm) remains the same as that of current 8-layer High Bandwidth Memory-2 (HBM2) products, which is a tremendous advancement in component design. This will help customers release next-generation, high-capacity products with better performance without having to change their system configuration designs. Additionally, the 3D packaging technology features a shorter data transmission time between chips than existing wire bonding technology, resulting in considerably faster speed and lower power consumption.
 
“Packaging technology that secures all of the intricacies of ultra-performance memory is becoming tremendously important, with the wide variety of new-age applications such as artificial intelligence (AI) and high-performance computing (HPC),” said Hong-Joo Baek, executive vice president of TSP (Test & System Package) at Samsung Electronics.
 
“As Moore’s law scaling reaches its limit, the role of 3D-TSV technology is expected to become even more critical. We want to be at the forefront of this state-of-the-art chip packaging technology.”
 
Building on its 12-layer 3D-TSV technology, Samsung will offer the ultimate DRAM performance for applications that are data-intensive and extremely high-speed. Also, by increasing the number of stacked layers from eight to 12, Samsung will soon be able to mass produce 24-gigabyte (GB) High Bandwidth Memory, which provides three times the capacity of the 8GB High Bandwidth Memory on the market today.
 
Samsung will be able to meet the aggressively expanding market demand for high-capacity HBM solutions with its cutting-edge 12-layer 3D TSV technology and it hopes to solidify its leadership in the premium semiconductor market.
 

How to Protect Your Business from Cyber Attacks

Oct 10, 2019
How do you guard your business from cyber attacks? Mitigating these threats takes more than a single anti-virus upgrade; it requires ongoing vigilance. But shielding your systems doesn’t have to be complicated. Here is how to get started.
 
Limit Access to Your User Data & Information
 
Controlling access to your valuable user data lessens the chance of human error, which is the number-one information security threat. If a worker leaves your company or transfers to another company location, take protective action right away, including deleting passwords and accounts from all systems and collecting company ID badges and entry keys. An ounce of access prevention can equal a pound of protection when it comes to limiting the impact of a disgruntled ex-employee.
 
Install Surge Protectors & Uninterruptible Power Supplies
 
Uninterruptible power supplies (UPS) can provide just enough battery life and time to save your data in the event of a power disruption. Check to make sure the UPS type and size meets your standards and requirements. Every computer and networked device should be plugged into a UPS. For less-sensitive electronics and non-networked equipment, standard surge protectors should suffice. Make sure to test and replace each UPS and surge protector as recommended by the manufacturer.
 
Patch Your Operating Systems & Software Regularly
 
Any new app can open the door to a cyber attack if you don’t regularly patch and update all software on every device used by your employees. Always check for updates when purchasing a new computer or installing a new software system. Be aware that software vendors aren't required to provide security updates for unsupported products. For example, Microsoft® will stop supporting Windows 7 in January of 2020, so if you haven't upgraded yet, now is the time. Don't put off downloading operating system updates; they often include new or enhanced security features.
 
Install & Activate Software and Hardware Firewalls
 
Firewalls can deter malicious hackers and block employees from browsing inappropriate websites. Install and update firewall systems on every employee computer, smartphone, and networked device. Include off-site employees, even if you use a cloud service provider (CSP) or a virtual private network (VPN). You may also want to install an intrusion detection/prevention system (IDPS) to provide a greater level of protection.
 
Secure All Wireless Access Points & Networks
 
For secure wireless networking, use these router best practices:
- Change the administrative password on new devices
- Set the wireless access point so that it does not broadcast its service set identifier (SSID)
- Set your router to use WiFi Protected Access 2 (WPA2) with the Advanced Encryption Standard (AES) for encryption
- Avoid using WEP (Wired Equivalent Privacy)
 
For guest WiFi access, use a separate network from your business account.
 
Set up Web & Email Filters
 
Use email and web browser filters to deter hackers and keep spam from clogging employee inboxes. You can also subscribe to “blacklist” services to block users from browsing risky websites that pose malware threats. Caution your employees against visiting sites that are frequently associated with cybersecurity threats, such as pornographic websites or social media. This might seem like a no-brainer, but it only takes one employee visiting the wrong website to inadvertently download malware.
 
Use Encryption for Sensitive Business Information
 
Use full-disk encryption to protect all your computers, tablets, and smartphones. Save a copy of your encryption password or key in a secure location separate from your stored backups. Email recipients typically need the same encryption capability in order to decrypt an encrypted document. Never send the password or key in the same email as the encrypted document; give it to the recipient via phone or some other method.
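As an illustration of the document-encryption step (not full-disk encryption), here is a minimal sketch using the Python `cryptography` package's Fernet recipe; the document contents are invented for the example:

```python
from cryptography.fernet import Fernet

# Generate a symmetric key; share it with the recipient by phone or
# another channel - never in the same email as the encrypted document.
key = Fernet.generate_key()
f = Fernet(key)

token = f.encrypt(b"Q3 supplier pricing sheet")   # ciphertext safe to email
print(f.decrypt(token))                           # b'Q3 supplier pricing sheet'
```

Only someone holding the same key can decrypt the token, which is why the key must travel by a separate channel.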
 
Dispose of Old Computers & Media Safely
 
Before donating or discarding old computers, you must wipe all important hard drive information. Remove any sensitive business or personal data from old CDs, flash drives, and other media, then destroy these items or take them to a company that will shred them for you. Destroy sensitive paper information with a crosscut shredder or an incinerator.
 
Train Your Employees
 
Cyber-vigilant employees are your best protection against information security threats.
 
Every employee should know:
- What business and personal use is permitted for emails
- How to treat business information at the office or at home
- What to do if a cybersecurity incident occurs
 
Train any new employee to protect valuable data and get them to sign your information policy. Use newsletters and/or ongoing training to boost your culture of cybersecurity. Now that we've covered the key steps to protect your valuable data and information, we will show you how to install mechanisms for detecting and recognizing a cyber attack in part three of our series on “Cybersecurity for Manufacturers” from the MEP National Network.
 

Google Set to Release 5G Smartphone Ahead of Apple

Oct 10, 2019
Google has begun test production of a 5G smartphone that it could unveil as soon as next week as part of its push into branded hardware, which aims to steal a march on Apple and tie consumers to its search and cloud-computing services. As well as working on the new 5G smartphone, which Google might announce at its Oct. 15 launch event for new products, the search giant will show off two new 4G Pixel smartphones, as expected, and possibly a new smartwatch and notebook too.
 
Such hi-tech gadgets are central to Google's strategy to draw consumers with its own-branded hardware and thereby integrate them ever-closer with the company's better-known search engine and artificial intelligence-driven software. ''The two [Pixel 4] smartphones are already proceeding into mass production and will be ready to ship after Google's [new products] unveiling next week,'' one of a few sources close to the situation said. ''Google is also working on a version with 5G technology, which is in test production.''
 
The marketing splash generated by Google's move into 5G smartphones, which would be the first by a U.S. company, presents a direct challenge to Apple and other leading hardware and handset makers such as Samsung Electronics and Huawei Technologies. It may also throw down the gauntlet to Microsoft, Google's biggest U.S. rival in search and software, which last week announced a return to the fiercely competitive smartphone market with its own folding model, after exiting in 2016.
 
Google has made no secret of its aim to move into hardware, but the apparent acceleration of its investment plans, combined with the company's deep pockets, will raise the pressure on competitors. Alphabet, Google's parent, has around $117 billion in cash, while Apple has $102 billion.
 
''Bringing together software and hardware ... does have a lot of synergistic value ... and the main way to do that today for our core [search and software] products is by using hardware,'' Google CEO Sundar Pichai told analysts in a February conference call. ''As we scale up our hardware efforts ... you can definitely glimpse the future.'' As a relative newcomer to the global smartphone market, Google's Pixel phone series has less than a 0.5% market share. Nonetheless, that share is growing rapidly: this year the company aims to ship as many as 10 million phones, sources said, more than twice last year's level.
 
''Although Google's smartphone shipments are still small, it's one of the clients that still has healthy volume growth,'' another source said. ''It offers better prices for suppliers too.'' Google's 5G model is currently being test produced in China. But to avoid U.S. tariffs from the China trade dispute, and to source cheaper labor, all Google production meant for the U.S. will eventually be moved outside China. A leading option is a Vietnamese facility due to be ready by the end of the year, as the Nikkei Asian Review first reported.
 
Google may ultimately decide to release its new 5G model in spring next year, the sources said, together with the release of a budget Pixel phone. But that release date would still put Google ahead of Apple in the race to produce the next generation of 5G phones. The Google 5G handset will include a Snapdragon 855 mobile platform, made by U.S. chipmaker Qualcomm, the sources added. For now, the new flagship Pixel 4 series will sport advanced organic light-emitting diode screens, like Apple's premium iPhone 11 Pro and 11 Pro Max models launched in September.
 

Spooked by Libra, EU Pledges to Regulate Digital Currencies

Oct 9, 2019
The European Union’s finance commissioner pledged on Tuesday to propose new rules to regulate virtual currencies, in reaction to Facebook’s (FB.O) plans to introduce Libra, which the EU considers a risk to financial stability. France and Germany have said that Libra, whose scale would dwarf cryptocurrencies such as bitcoin, could threaten their monetary sovereignty.
 
“Europe needs a common approach on crypto-assets such as Libra,” Valdis Dombrovskis told EU lawmakers in a confirmation hearing. “I intend to propose new legislation on this.”
 
The EU doesn't have specific regulations on cryptocurrencies, which, until Libra was unveiled in June, had been considered a marginal issue by most decision-makers because only a fraction of bitcoins or other digital coins are converted into euros. Dombrovskis had resisted regulating digital currencies in the five years he has served so far. His change of heart, he told lawmakers, stemmed from Facebook’s plans for Libra, a digital currency that “could have systemic effects on financial stability.”
 
The EU is now also pressing the G20 for global action on “stablecoins,” according to an EU document circulated last week. Facebook’s planned Libra is the best-known of the stablecoins - cryptocurrencies backed by assets such as conventional money deposits, short-term government securities, or gold.
 
Libra’s scale would likely pose issues, Dombrovskis said, given that Facebook’s millions of users in Europe would be able to pay with the new digital currency. An EU Commission official said there was no timetable yet for proposing the new rules. Dombrovskis said crypto regulation should focus on safeguarding financial stability, protecting consumers, and tackling the risk of money laundering with crypto-assets, which can easily cross borders.
 

How to Make Your First Robot Integration a Success

Oct 9, 2019
All over the United States, small and medium-sized manufacturers (SMMs) are seriously considering integrating industrial robots into their facilities. There's a growing awareness that increasingly flexible and affordable robotics systems can help current workers in numerous ways, taking on repetitive tasks, freeing up staff for higher-level work, and improving productivity overall.
 
To serve this growing need, many robotics systems integrators have come online promising complete packages that guide manufacturers from initial assessment to fully realized industrial automation. But deferring to these experts can feel a bit intimidating to manufacturers who rely on established processes they've developed internally.
 
So how does a manufacturer contemplating a first robot integration project participate fully so that the project is a success on their terms? Here are four suggestions to guide you during the process.
 
1. Be honest about the level of support you need.
 
You might have in-house robotics expertise or want to use your first robot integration project as a chance to learn. That can be a good idea, but don't forget to consider how much time it will take. While every small manufacturer is different, one thing is common across the sector: everyone is always strapped for bandwidth. Even if you have all the capabilities required to implement your own robotics cell, if you don't have the time to dedicate to the project, it will not be successful and could delay your ability to recoup your ROI. In that case, using a reputable systems integrator may be the best way to go.
 
2. Empower your existing experts.
 
Your existing processes work — and they work well because you have good people who own and administer them. Ensure these individuals are closely engaged with the implementation project so the new, automated process can build on the success of the existing process, while improving on deficient areas. Small details like occasional process inconsistencies can throw a big wrench in an automation project. The team members in charge of the manual process will be able to help head those issues off at the pass.
 
3. Identify a robotics champion at your company.
 
In a way, the work really starts after a robot implementation is complete — i.e., when your team starts to work with the new equipment. To ensure rapid ROI and ongoing success, identify an in-house champion who will work alongside the implementation team and learn the system. Make sure, too, that this person has real cross-departmental authority and can broker cooperation between engineering and production, which will be critical to success.
 
4. Keep it simple.
 
Introducing a robot or robots into your facility is a significant change. There are quite a few variables to any project at any scale — e.g., appropriately converting a manual cell, training key personnel, and minimizing the impact on production. Start your robot implementation simply, and keep that principle in mind as you evolve how you use robots in your facility.
 

How the Model T Can Influence Your Current Industrial Automation Needs

Oct 9, 2019
Among the biggest challenges for any successful business is understanding when it’s time to change. After all, conventional wisdom says “if it ain’t broke, don’t fix it.” But with technology changing at such a swift pace, those who stand still will soon be left behind. The last time the world saw technological advancements at this pace, Henry Ford was just figuring out the assembly line. By looking back at Ford’s adoption of the ‘new’ technology of his time, we may learn how to read the signs of today’s technological trends so we can be ready to invest in AI and automation at the most advantageous time for our manufacturing, warehousing, and distribution systems.
 
Find a New Market
 
Henry Ford was not a newcomer to the car business when he began producing the Model T in 1908. He had already been part of several automotive companies before the Ford Motor Company was established, and had built several other vehicles, including his Quadricycle and the 999. But he dreamed of a vehicle for 'the great multitude,' and so the Model T was born. Unfortunately, the original Model T was still too expensive for most Americans. When Ford began churning the cars out via assembly line, however, their price dropped significantly. In 1909, when workers were still using traditional methods to piece the cars together, a Model T was priced at $825, and fewer than 11,000 were produced. But in 1916, three years after Ford started using assembly line production, the Ford Motor Company manufactured over 500,000 Model Ts and sold each one for $345.
 
Automation can lead you to consider opportunities in ways you have not before. A shift in production capabilities and costs allows you to reconsider your market from a new perspective. Greater productivity and efficiency mean a lower per-unit cost, which changes how competitive you can be within your market.
 
Help Your Workers
 
Henry Ford measurably improved the lives of his workers. At the beginning of assembly-line production at the Ford Motor Co., most workers put in 9-hour days for about $12 a week. But the schedule was awkward, the work was hard, and turnover was high. So Ford changed the work periods to three 8-hour shifts and doubled worker pay through a bonus structure. This, in turn, reduced his labor turnover and allowed for smooth, uninterrupted production of his cars.
 
Despite fears, current technology does not take away jobs. In fact, automation may help increase the number of skilled, high-paying jobs within the manufacturing sector, since an automated shop requires better-trained, higher-skilled workers. This can be a win-win for an existing company and its current workforce if management offers educational reimbursement to employees who want to retrain: your employees gain 21st-century skills that allow them to remain with the company in a better-paid position, and you retain good employees with a proven track record of reliability.
 
Decrease Waste, Increase Efficiency
 
One of the top benefits of Henry Ford's assembly line was its increased efficiency. The Model T went through 84 individual assembly processes using interchangeable parts that were mass-produced elsewhere and assembled into a single car by workers trained to do one particular job. While this seems like common sense today, it was a revolutionary idea in 1913, and one that increased worker productivity to such a level that the time to put a single Model T together dropped from over 12 hours in 1908 to 93 minutes in 1914.
 
Automation can deliver the same kinds of efficiency gains for your company. When Factory Five Racing, Inc. wanted to reduce the time it took to produce its hot rod trim kits, it turned to robotic automation. The change allowed the company to produce a higher-quality trim kit consisting of four sets of panels (four trunks, four hoods, and eight doors) in 24 minutes, down from 7.5 hours. Increased efficiency improves throughput, reduces environmental impact through lower energy use, and cuts costs.
 
Be Flexible
 
Henry Ford famously said, "Any customer can have a car painted any color that he wants so long as it is black." While this doesn't suggest a flexible business outlook, Ford's assembly line was remarkable in its flexibility. At the height of Model T production, there were eleven different bodies built upon the basic Model T chassis, including a racer, a snowmobile, a police wagon, and a woody wagon, and each one could be custom-fitted with any of thousands of accessories.
 
Flexible automation allows your plant to switch over to new processes without a full retrofit. This is made easier by thinking ahead about what challenges might occur down the road and preparing for them during the original design phase. For example, choosing robots with an open interface will allow for the later connection of third-party equipment for customized processes. This is where working with an experienced robotics/AI design firm can pay substantial dividends later.
 
Another famous saying is “The more things change, the more they stay the same.”  While Henry Ford’s assembly line and basic business practices have long been replaced by modern lean manufacturing and Industry 4.0, we can still apply Ford’s reasons for improving processes to today’s changing industrial landscape.
 

Global Tablet Shipments Expected to Drop for Next Five Years

Oct 9, 2019
Global Tablet Shipments Expected to Drop for Next Five Years
View Full Size
Global tablet shipments are likely to slip significantly year-over-year in 2019 amid decreasing demand for brand and education tablets. Sales of white-box models have been drastically undermined by brand-name devices, and demand for small-size white-box tablets will continue to slide over the next five years, according to Digitimes Research's 5-year forecast report on tablets.
 
Apple's iPad series, which accounts for the biggest portion of brand-name tablet shipments, may not receive upgrades as frequently as before in the next few years, while iPad price cuts are having only a limited effect on stimulating demand, Digitimes Research noted.
 
Global tablet shipments will struggle to remain above 130 million units in 2020. After 2020, the global tablet market will become an area of rigid demand, with shipments shrinking 2-3% every year; by 2024, it will be challenging to keep volumes above 120 million units.
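The compound decline behind those figures is easy to check. A short Python sketch, using the ~130-million-unit 2020 base and 2-3% annual decline cited above (the code itself is just illustrative arithmetic, not part of the report):

```python
# Project tablet shipments forward under the report's assumed
# 2-3% annual decline, starting from ~130 million units in 2020.
def project(start_units_m, annual_decline, years):
    """Compound an annual fractional decline over a number of years."""
    return start_units_m * (1 - annual_decline) ** years

# 2020 to 2024 is four years of shrinkage.
steep = project(130, 0.03, 4)  # 3% decline per year
mild = project(130, 0.02, 4)   # 2% decline per year
print(f"2024 estimate: {steep:.1f}M to {mild:.1f}M units")
```

Both endpoints of that range land below 120 million units by 2024, consistent with the report's expectation.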
 
White-box tablet shipments will remain weak in 2020, as most makers' key products are small-size tablets that have been cannibalized by large-screen smartphones and Amazon's inexpensive tablets. Since many white-box tablet makers have begun shifting to non-tablet products and customized tablets, more makers are likely to quit the regular tablet business in the next few years.
 
Shipments of tablets with displays above 10 inches or using an in-cell touch solution will jump considerably in 2020, as Apple has replaced its inexpensive 9.7-inch iPad with a new 10.2-inch one, while panel makers have been keenly promoting their in-cell touch solutions, with advantages in production and pricing, as replacements for GFF ones.
 
Microsoft's Windows-based tablet shipments are anticipated to grow substantially in 2019 and will see their shipment share rise to 5.2% by 2020, trailing closely behind Lenovo.
 

Transforming Online Learning With Artificial Intelligence

Oct 8, 2019
Transforming Online Learning With Artificial Intelligence
View Full Size
As higher education costs continue to rise, students bear the ultimate burden of finding the right school, major, and delivery format to maximize post-graduation success. Unlike previous generations, millennials and adult learners are searching for alternatives to full-time, on-campus programs, and universities are eager to offer non-traditional routes to a degree.
 
Distance learning programs have existed since the 1980s, but technological innovation, content scalability, and widespread mobile adoption have enabled online degree programs to become a competitive option for aspiring students. Long gone are the days of aggressive marketing tactics and empty promises made by degree mills and unaccredited for-profit universities. Today, a learner can enroll in competitive bachelor's and master's programs at U Penn, Columbia, Johns Hopkins, NYU, and more. In many cases, the virtual programs can pass tuition savings of 25-50% on to enrolled candidates.
 
Take, as an example, the Georgia Institute of Technology. An elite public institution, Georgia Tech ranks #35 among national universities and #8 for best computer science programs. Backed by a $2 million investment from AT&T and a desire to democratize education by extending beyond physical class sizes, Georgia Tech spun up a Master of Computer Science degree delivered asynchronously in a virtual environment. The total price tag? ~$7,000 USD.
 
Georgia Tech is not alone. The University of Illinois Urbana-Champaign suspended its on-campus MBA program to transition to an online iMBA, enrollment in individual online courses is trending upward year-over-year, and 25% of universities are projected to fail in the next 20 years due to heavy losses incurred by pricey facilities and declining enrollment.
 
The acceleration of online learning has its positives, but it is not a silver bullet. A set of next-generation improvements enabled by artificial intelligence (AI), however, stands to transform the virtual experience.
 
The AI Advantage
 
To improve on the current teaching model, in which the teacher is the source of knowledge and the student is the recipient, we need to fundamentally reimagine the role of an educator in the university system. Advances in automated assignment grading and remote proctoring services (e.g. Proctorio) allow instructors to forego repetitive, time-intensive tasks and instead dedicate their saved time to higher-value work. For students who do not thrive in the regular classroom setting, AI-enabled learning management systems (LMS) can deploy surveys to categorize individuals into distinct learning buckets (e.g. visual, auditory, text), then deliver targeted content that fits each preferred learning style. Beyond identifying preferences, these platforms can also break down long-form lectures and reading assignments into smaller, atomic components that are easier to digest.
 
For international students, the language barrier may complicate progress, but cutting-edge research in text translation and machine learning seeks to create deep-learning systems that can translate English lectures into a student's native tongue. Similar technologies in voice recognition and text summarization can transcribe a complete lecture with impressive accuracy and distill paragraphs of text into just the relevant bullet points for review. Machine learning algorithms can likewise be deployed over a course curriculum to flag areas of bias, complexity, and ambiguity for closer review by the instructor.
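As a toy illustration of the summarization idea (not the deep-learning systems described above), extractive summarizers score each sentence by how representative its words are of the whole document and keep the top scorers. A minimal frequency-based sketch in Python, with a made-up lecture snippet as input:

```python
# Toy extractive summarizer: score each sentence by the document-wide
# frequency of its words, then keep the highest-scoring sentences in
# their original order. Real systems use far more sophisticated models.
import re
from collections import Counter

def summarize(text, max_sentences=2):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        # A sentence full of common document words scores high.
        return sum(freq[w] for w in re.findall(r"[a-z']+", sentence.lower()))

    top = set(sorted(sentences, key=score, reverse=True)[:max_sentences])
    return [s for s in sentences if s in top]

lecture = ("Neural networks learn patterns from data. "
           "Today we train neural networks on lecture data. "
           "Please remember to bring a pencil tomorrow.")
print(summarize(lecture, max_sentences=1))
```

The sentence sharing the most vocabulary with the rest of the text survives, while the off-topic reminder is dropped, which is the basic intuition behind distilling a lecture into its relevant points.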
 
The climb is slow, but students, teachers, and administrators will no doubt enjoy the benefits of AI as it evolves and matures over the next decade. Whether this field will be disrupted by a tech company, a university, or a research organization is still up for debate.
 

Siemens Adds Artificial Intelligence to Control Logic

Oct 8, 2019
Siemens Adds Artificial Intelligence to Control Logic
View Full Size
Imagine a robot that could automatically adjust its grip based on the size and shape of an object. Put simply, a robot that could fine-tune how it holds an item so as not to drop it, much as humans do. According to Siemens, it's entirely possible, and it all comes down to artificial intelligence (AI) based on neural networks.
 
A neural network is a technology that mimics the human brain in its ability to recognize complex patterns. With that in mind, Siemens says that by adding AI via neural networks to traditional control programs, which were designed to execute a set task, the capabilities of the system can be extended to adapt to the parameters of the product or process. Bottom line: machines become inherently flexible.
 
A year ago at the SPS/IPC/Drives show in Nuremberg, Germany, Siemens announced a module that integrates AI capabilities into the company's Simatic S7-1500 controller and ET 200MP I/O system. This year, at PACK EXPO Las Vegas, Siemens unveiled the offering in the U.S., setting the foundation for a future portfolio that will enable AI throughout all levels of Siemens' Totally Integrated Automation (TIA) architecture, a combination of hardware and software that links everything together seamlessly. The goal with TIA is to apply AI within applications that span from Siemens' MindSphere, a cloud-based Internet of Things (IoT) operating platform, out to the industrial edge and even to the controller and field devices.
 
With the launch of the S7-1500 TM NPU (neural processing unit) module for the Simatic S7-1500 controller and the ET 200MP I/O system, Siemens has brought AI right to the controller. The S7-1500 TM NPU module is best suited for use at the field level at the machine, and wherever reliable, fast, deterministic decisions are required, as it enables the transfer of human expertise to the machine through training, the company said.
 
“With artificial intelligence we are able to train, recognize, and adjust to allow more flexible machinery,” said Colm Gavin, Factory Automation digitalization specialist at Siemens during a press conference at PACK EXPO. “Because, do we want 10 machines to package 10 different types of products, or a tool that accommodates different packages and different sizes and automatically adjusts to the new format?”
 
The S7-1500 TM NPU module works with a trained neural network stored on an SD card. Users can connect Gigabit Ethernet- and USB 3.1-compatible sensors, such as cameras and microphones, to the module's integrated interfaces. CPU data transmitted via the backplane bus can also be used as input data. The processing results are then evaluated in the CPU program.
 
In packaging, for instance, bottles come down a conveyor belt quickly, and if the system is trained only for pass/fail, the moment something goes out of tolerance, it will fail. But with AI, where the camera is trained with neural networks on images of every possible combination, the system can figure out the rules on its own.
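The contrast in that example can be sketched in a few lines. The fill-level and cap-tilt numbers below are invented for illustration, and the nearest-centroid "model" is a deliberately tiny stand-in for the trained neural networks that run on hardware like the TM NPU; the structural difference, a hand-written tolerance rule versus a decision learned from labeled examples, is what the sketch shows:

```python
# Contrast a fixed pass/fail rule with a classifier that learns its
# decision boundary from labeled examples. The feature values
# (fill level in ml, cap tilt in degrees) are made up for this sketch.
import math

def rule_based_check(fill_ml, target=500.0, tol=5.0):
    """Fixed rule: fail the moment the fill level leaves tolerance."""
    return abs(fill_ml - target) <= tol

def train_centroids(samples):
    """'Train' by averaging the feature vectors seen for each label."""
    return {label: [sum(col) / len(vecs) for col in zip(*vecs)]
            for label, vecs in samples.items()}

def classify(centroids, features):
    """Assign the label of the closest learned centroid."""
    return min(centroids, key=lambda lbl: math.dist(centroids[lbl], features))

# Labeled examples a camera system might produce: [fill_ml, cap_tilt_deg].
model = train_centroids({
    "good":   [[500.0, 0.1], [499.0, 0.2], [501.0, 0.0]],
    "reject": [[470.0, 8.0], [530.0, 6.5], [498.0, 9.0]],
})
```

Note the third "reject" example: its 498 ml fill passes the fixed rule, but its badly tilted cap marks it as a defect the learned model can catch. That kind of pattern, never written down as an explicit rule, is what the article means by the system figuring out the rules on its own.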
 
According to Gavin, the benefits of the S7-1500 TM NPU module are:
 
  • Flexibility - handling unknown objects becomes easy without resource-intensive programming.
  • Quality - expert knowledge for fast and reliable quality checks is transferred directly to the module through the higher-level training of a neural network.
  • Greater efficiency - machines can respond flexibly and automatically to situations that used to require manual intervention, reducing downtime and increasing availability.
  • Cost-effectiveness - the module makes it possible to detect problems in production early and avoid the cost of reworking or even discarding the product.
 
Applications in robotics, quality assurance, and condition monitoring are obvious fits for the TM NPU module, but applications are limited only by the user's imagination. At PACK EXPO, Siemens demonstrated a robot with "flexible grasping" using AI, which looks at a shape and calculates the optimal point at which the gripper can pick it up. Once it understands the best grasping point, the AI tells the robot where to go. "You don't need to program the robot, as AI makes it possible to grasp arbitrarily shaped and positioned objects," Gavin said, adding that the ability to mimic the human hand in manufacturing has the potential to be a very big business.
 

Five Steps Manufacturers Can Take to Combat Cyber Attacks

Oct 8, 2019
Five Steps Manufacturers Can Take to Combat Cyber Attacks
View Full Size
It’s not far-fetched to say that the current cybersecurity landscape is tumultuous, and that is true in every industry from retail to finance. Cyberattacks are on the rise, particularly in the manufacturing sector.
 
One reason manufacturing has come under fire is that cyber threats have grown much more sophisticated in recent years. Attacks have advanced beyond software to hardware, processor vulnerabilities being a perfect example.
 
In fact, a recent security report from SonicWall Capture Labs showed there were over 74,000 "never-before-seen" attacks in 2019. They were so new that most of those threats did not even have a signature at the time of discovery.
 
This alarming information suggests that cyberattacks on manufacturers will grow more frequent, more advanced, and more successful. There is a clear need to protect not just conventional manufacturing operations, but also all networks, systems, and resulting data, especially as the manufacturing industry evolves into a more digital-centric ecosystem.
 
Fortunately, there are cybersecurity tips available to help manufacturers of all sizes protect themselves from cyber threats and prepare themselves for the brave new world of Industry 4.0.
 
Introducing the NIST Cybersecurity Framework
 
The National Institute of Standards and Technology (NIST) has developed a cybersecurity framework that includes directions and best practices for coping with potential cybersecurity threats. Most importantly, it's accessible to all organizations, including small to medium-sized manufacturers.
 
Representatives of the MEP National Network™, such as the Michigan Manufacturing Technology Center, offer flexible, cost-effective approaches to implementing cybersecurity programs that align with the NIST framework, making these protections accessible even to budget-constrained companies.
 
The framework lays out five practical activities, or functions, that can be used to achieve a more secure operation. They include:
 
1. Identify
 
This first function expressly deals with understanding potential cybersecurity risks to an organization, including its systems, people, assets, data, capabilities and networks. The primary question is: What must be done to manage existing risks and mitigate the potential for damage?
 
Actions the framework recommends in this category include:
  • Controlling who has access to your information
  • Conducting background and security checks for all employees
  • Requiring individual user accounts for each employee
  • Creating cybersecurity policies and procedures
 
2. Protect
 
Naturally, learning leads to action, which is the protection aspect of the framework. This is where a manufacturer must develop and implement safeguards for its operations or services. Actions you can take to protect your operation include:
  • Limiting access to your user data and information
  • Installing surge protectors and uninterruptible power supplies
  • Patching your operating systems and software regularly
  • Installing and activating software and hardware firewalls
  • Securing all wireless access points and networks
  • Setting up web and email filters
  • Using encryption for sensitive business information
  • Disposing of old computers and media safely
  • Training your employees
 
3. Detect
 
A suitable monitoring system must be in place to detect either a recent cybersecurity event or one that is ongoing. The timely discovery of these attacks is essential to an effective security strategy. Activities for detecting cyber attacks include:
 
  • Installing and updating anti-virus and other cybersecurity programs
  • Running anti-virus and anti-spyware programs daily
  • Conducting full system scans daily
  • Maintaining and monitoring detection logs
 
4. Respond
 
Upon discovery, every manufacturer must have controls in place to respond to an attack. These include the ability to block attacks as well as to regain access to a system.
 
This functionality is somewhat distinct for manufacturers, as most operations use only limited networks or wireless connectivity. Industrial-grade access controls are crucial to monitor not merely internal processes and systems, but also those of vendors and partners. Dynamic, real-time policy enforcement is essential across the entire network, not just for local operations.
 
A response program should include:
 
  • Developing a plan for information security incidents that determines:
      • Who to call in case of an incident
      • What to do with your data in case of an incident
      • When to alert senior management, emergency personnel, and others
      • The types of activities that constitute an information security incident
  • Knowing your notification obligations
 
5. Recover
 
Much like data or systems recovery, this function deals with the restoration of impaired or damaged services and content. It should include:
 
  • Making full backups of essential business data
  • Making incremental backups of important business information
  • Assessing and improving your procedures and technologies
 
Another aspect of this function is opening communications with clients or customers to disclose the impact of an event. It should also include follow-up measures to prevent future attacks.
 
How to Secure Your Company
 
Ultimately, companies should focus on adhering to NIST's voluntary framework not merely to prevent severe threats, but also to understand how they and their teams can better deal with them. For many organizations, big and small, it is not a question of whether they will experience a cyberattack but when. Being prepared is the best way to mitigate potential damage and any operational impact.
 
The best course of action to secure your company, or to find out just how vulnerable it is, is to work alongside experts in manufacturing cybersecurity and the NIST Cybersecurity Framework. Manufacturers ready to take this critical step in their digital evolution should contact their local Manufacturing Extension Partnership (MEP) Center. These centers are part of the MEP National Network, which includes hundreds of specialists who know how to address the cybersecurity concerns of small and medium-sized manufacturers and who are well-versed in the NIST Cybersecurity Framework.
 
INDUSTRYWEEK

China Is Too Big To Be Isolated

Oct 8, 2019
China Is Too Big To Be Isolated
View Full Size
Soon after it was reported in September that NTT Docomo, Japan's biggest mobile carrier, would not offer phones from Huawei Technologies for use on its 5G network, the Chinese company announced its new products would ship without Google applications. These actions point to the coming of a technologically divided world. The U.S. blacklisted Huawei in May over national security concerns, imposing restrictions on sales of U.S. technologies, including Google's Android operating system, to the Chinese company.
 
With the U.S. Commerce Department having placed Huawei on what it calls the Entity List, the company's new phones likely will not come with popular apps such as Gmail, Chrome, and Google Maps. Given this situation, there is a strong strategic rationale for Docomo's decision. The Australian government has decided to bar Huawei from supplying equipment for its 5G network. In New Zealand, the Government Communications Security Bureau has voiced national security concerns about allowing Huawei to supply key 5G technologies.
 
It seems an alliance is growing among Asia-Pacific countries, mainly U.S. allies, to block Huawei's entry into 5G wireless infrastructure. But a strategy to technologically isolate China is unlikely to work.
 
More and more African countries are inviting Huawei into their telecom markets. Malaysian Prime Minister Mahathir Mohamad and Thai economy minister Pichet Durongkaveroj have said they are open to adopting Huawei equipment. Singapore has remained silent on the issue. Chinese 5G technologies are making steady inroads into most parts of the world, mainly nations joining President Xi Jinping's ''One Belt, One Road'' infrastructure initiative, meant to connect countries across Asia, Africa and Europe.
 
The primary reason is simple: Huawei's 5G offerings are 15% to 30% cheaper than rival products from Nokia and Ericsson, according to the head of Huawei's Australian unit. The world is dividing into two groups: nations that reject competitive technologies to avoid presumed national security risks, and those that prioritize upgrading economic infrastructure at lower cost.
 
Pro- and anti-Huawei spheres are emerging with huge implications for the future of the global economy. With the U.S. and China demonstrating few signs of trying to meet halfway on the Huawei issue, other countries are coming under pressure to take sides.
 
Alarmed by the prospect of facing a U.S. trade embargo on a broad range of parts and licenses, China has started stepping up its efforts to build a self-sufficient supply chain. HiSilicon, Huawei's semiconductor arm, has reportedly begun full-fledged production of its Balong 5G chipsets, a market in which Qualcomm has been the dominant player. Huawei plans to launch as early as next spring its first smartphones running its own Harmony OS, which the company claims outperforms Google's Android.
 
The U.S. strategy to minimize the amount of Chinese equipment in 5G networks is having an unintended consequence: it is driving the world's second-largest economy to build its own technological ecosystem, one that may involve many other countries. A global supply chain that has evolved over many years cannot be easily reorganized. It may be true for now that Huawei cannot manufacture products without U.S. technologies or Japanese parts. But this should not be taken as proof that a strategy of containing China will work in the long run.
 
As of the end of April, about 130 countries had signed onto the Belt and Road. Together they represent 5 billion consumers, or 60% of the global population. There is no ruling out the possibility that this group may in due course grow into an economic bloc four times larger than the Western world of the U.S., Europe and Japan.
 
China is simply too big to be isolated.
 
To avoid splitting the world into two bitterly divided blocs, the U.S. and its allies should try to entice China to embrace the current world economic order, in which markets are ruled by law, not by the party. The true aim of the Trans-Pacific Partnership, the multilateral trade pact that Trump pulled the U.S. out of in one of his first acts as president, was to integrate China into international trade rules. It may be time to start envisioning a TPP for digital trade that includes the U.S. as well.
 

Will There Be An AI Productivity Boom

Oct 7, 2019
Will There Be An AI Productivity Boom
View Full Size
A big pastime of economists in the 1980s and 1990s was attempting to estimate how much corporate and industrial productivity would benefit from the then-novel phenomena of personal computers, workgroup servers, and computer networking. In the early stages it was hard to see, but over time economists did find evidence that information technology contributed to boosting economic productivity.
 
It’s a bit too early to expect data showing a similar boom from artificial intelligence, today’s big IT revolution. The technology is only now becoming industrialized, and many companies have yet to try using things such as machine learning in any significant way.
 
But it's not too soon to speculate. There is no question companies will increasingly use AI technologies of various sorts. AI is well on its way to becoming part of how companies function. Every company has mountains of data to analyze, and that analysis can benefit from even simple machine learning techniques. And companies have processes, from HR to accounting to sales, that can make use of the automation AI can bring.
 
Will all that show up in the numbers around output per employee and similar measures of productivity? It cannot be ruled out, but a couple of big obstacles stand in the way of AI having an effect on productivity similar to that of the PC era.
 
One problem is that AI is dominated by companies that are already among the most productive in the world. As MIT economist David Autor and colleagues have written, wealth is increasingly concentrated in the hands of what they term “superstar firms,” a situation of “winner take most,” where “a small number of firms gain a very large share of the market,” and those firms are the “more productive” ones.
 
Those companies include Google and Facebook, and others that, Autor and colleagues show, are much more efficient in terms of their labor force. “Many of the canonical superstar firms such as Google and Facebook employ relatively few workers compared to their market capitalization” because “their market value is based on intellectual property and a cadre of highly-skilled workers.”
 
Google, Facebook, Apple, Amazon and Microsoft, the biggest tech companies in the world, the superstar firms, are exactly the ones that already dominate artificial intelligence, the companies at the forefront of deep learning and other sorts of cutting-edge AI. In a sense, AI is being used to boost productivity that is already well above normal. At the same time, something unfortunate has befallen all the non-superstar firms. Back in the 1980s and 1990s, PCs and related technology were a broad global trend benefitting any company that bought PCs, servers and networking. Productivity gains were theoretically available to all.
 
With the death of Moore’s Law, the decades-long rule of progress in the semiconductor industry, there is less and less technology improvement that is broadly available in a direct way to every firm. Fundamental research has contracted across the technology industry, and much of the innovation that does happen is increasingly concentrated in the R&D labs of those same superstar firms.
 
With superstar firms ruling AI, and broad tech progress no longer evenly distributed, how will AI contribute to a boom? Perhaps it will happen indirectly, through a process of “trickle-down” productivity, as ordinary firms adopt the AI technologies provided by Google, Microsoft, and Amazon in the cloud. Even if productivity does not immediately improve at every firm, improvements could still materialize within industries, and as a national or global phenomenon.
 
It’s important to remember that productivity gains can take time to materialize. Back in 1987, Nobel Prize-winning economist Robert Solow was the first scholar to note the apparent absence of IT-led productivity growth. “You can see the computer age everywhere but in the productivity statistics,” he famously wrote. It took another decade or so, but gradually the numbers did show progress. An AI boom is possible; certainly, it shouldn’t be ruled out. But market concentration and a broad slowdown in tech innovation will make it harder to achieve than it was for technology revolutions of the past.
 

Qualcomm Could Unveil Its Next-gen Flagship Chip

Oct 7, 2019
Qualcomm Could Unveil Its Next-gen Flagship Chip
View Full Size
There's speculation that the chip designer will unveil the Snapdragon 865 Mobile Platform. Expected to be found inside high-end Android devices next year, the new chipset will be produced by Samsung using its 7nm EUV process. The smaller the process number, the more transistors fit inside the chip, making it more powerful and energy-efficient. And extreme ultraviolet lithography (EUV) is a more precise method of patterning a chip die for transistor placement. Qualcomm's latest top-of-the-line chipset is the Snapdragon 855+, an overclocked version of the Snapdragon 855 Mobile Platform that offers a 15% improvement in graphics performance.
 
Nevertheless, there could be another reason for the announcement. Some new Android handsets are expected to be unveiled on Tuesday, including a pair from Xiaomi (the Mi 9 Pro 5G and Mi MIX Alpha), the Sony Xperia 5, and the Realme X2. Given this, the buzz suggests that one or more of these devices might be the reason for the teaser that Qualcomm posted yesterday for the upcoming event. All of the aforementioned phones will employ a Snapdragon SoC, with the 855+ expected inside the Mi 9 Pro 5G and probably the Mi MIX Alpha. The regular Snapdragon 855 SoC will power the Xperia 5, with the Snapdragon 730G chip driving the X2. There may be a connection between the number "3" used in the teaser and the three smartphone manufacturers discussed here.
 
While Samsung is doing the fab work for the Snapdragon 865, Qualcomm will be returning to Taiwan Semiconductor Manufacturing Company (TSMC) for 2021’s Snapdragon 875 Mobile Platform. The world’s largest independent foundry, TSMC rolls chips off the assembly line for companies that design their own chips but don’t have the facilities to make them. For example, both Apple and Huawei design their own SoCs (the A13 Bionic and Kirin 990, respectively), yet both rely on TSMC to churn out the chips they've designed.
 
As for the Snapdragon 865, traditionally Samsung’s new Galaxy S phones have been the first with a global release to sport the latest Qualcomm Snapdragon chipset, and that more than likely will not change next year. The very first phone to be powered by the Snapdragon 855 Mobile Platform was the Xiaomi Mi 9, but this device was not offered worldwide.
 
2019 has been quite a disruptive year for Qualcomm. It began with the chip designer in the midst of a feud with Apple, with both companies preparing to square off in court multiple times. Qualcomm was also the defendant in an antitrust case brought by the FTC. The non-jury trial in January was presided over by Judge Lucy Koh (of Samsung v. Apple fame). During the proceedings, Apple and other firms testified against Qualcomm’s sales practices, including its “No license, no chips” policy, its computation of royalties based on the retail price of a phone, and its failure to license its standards-essential patents on Fair, Reasonable and Non-Discriminatory (FRAND) terms.
 
Things took a turn for the better in April (as far as Qualcomm is concerned): just as a court battle with Apple was wrapping up, the two companies agreed to a settlement. All legal action between them was terminated, and Apple paid Qualcomm an undisclosed amount believed to be $4.5 billion; in return, Apple received a six-year license (with a two-year option) and a multi-year chip supply agreement.
 
And so Qualcomm sailed along, but only for a month. In May the decision came in, and Judge Koh ruled that Qualcomm had engaged in anticompetitive behavior. Losing this court case could force the chip designer to overhaul its current business practices. And while Judge Koh declined to grant Qualcomm a stay that would allow it to maintain the status quo until all of its appeals have been exhausted, last month the Ninth U.S. Circuit Court of Appeals granted the stay.
 
If Qualcomm fails to get Judge Koh’s ruling overturned on appeal, it faces the long, complex and difficult task of renegotiating all of the current contracts it has with phone manufacturers. The chip designer asked for the stay because it didn't want to go through this process, win on appeal, and then have to come to terms on a whole new set of contracts.
 

Microsize Lens Pushes Photonics Closer to an On-Chip Future

Oct 7, 2019
Optical microcomputing, next-generation compact LiDAR units, and on-chip spectrometers all took a step closer to reality with the latest announcement of a new form of optical lens. The lens isn't fabricated from glass or plastic, however. Rather, this low-loss, on-chip lens is made of thin layers of specialized materials on top of a silicon wafer. These “metasurfaces” have shown much promise in recent times as a kind of new, microscale medium for containing, transmitting, and manipulating light.
 
Photonics at the macro-scale is more than 50 years old and has applications today in fields such as telecommunications, medicine, aviation, and agriculture. And yet, shrinking all the elements of traditional photonics down to microscale — to match the density of signals and processing operations inside a conventional microchip — involves completely new optical methods and materials.
 
A team of researchers at the University of Delaware, including Tingyi Gu, an assistant professor of electrical and computer engineering, recently published a paper in the journal Nature Communications that describes their effort to make a lens from a thin metasurface material on top of a silicon wafer. Gu says that metasurfaces have typically been made from thin metal films with nanosized structures in them. These “plasmonic” metasurfaces offered the promise of, as a Nature Photonics paper from 2017 put it, “Ultrathin, versatile, integrated optical devices and high-speed optical information processing.”
 
The challenge, Gu says, is that these “plasmonic” materials are not perfectly transparent like windowpanes. Traveling just fractions of a micrometer through them can introduce signal loss ranging from a few decibels to tens of decibels. “This makes it less practical for optical communications and signal processing,” she says.
 
Her group uses a different kind of metasurface made from etched dielectric materials atop silicon wafers. Making optical elements out of dielectric metasurfaces, she says, could sidestep the signal-loss problem. Her group’s paper notes that their lens introduces a signal loss of less than one decibel.
 
Even a small improvement (and going from handfuls of decibels down to fractions of a decibel is more than small) will make a big difference, mainly because a real-world photonics chip might one day contain many such components. The more lossy the photonics chip, the more laser power must be pumped through it. More power means more heat and noise, which might ultimately limit how far the chip could be miniaturized. But with her team’s dielectric metasurface lens, “We can make a device much smaller and more compact,” she says.
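To see why shaving per-component loss matters so much, note that decibel losses add across cascaded elements, so the laser power required at a chip's input grows exponentially with component count. A minimal sketch of that arithmetic (our own illustration with hypothetical loss figures, not numbers from the paper):

```python
def required_input_mw(target_output_mw, loss_db_per_component, n_components):
    """Input laser power (mW) needed to deliver the target output power,
    given that each cascaded component contributes a fixed loss in dB."""
    total_loss_db = loss_db_per_component * n_components
    return target_output_mw * 10 ** (total_loss_db / 10)

# Hypothetical chip with 20 cascaded elements and 1 mW needed at the output:
plasmonic = required_input_mw(1.0, 3.0, 20)   # 3 dB/element -> 60 dB total
dielectric = required_input_mw(1.0, 0.5, 20)  # 0.5 dB/element -> 10 dB total
print(plasmonic, dielectric)
```

With 3 dB per element, the chip would need a million times its output power at the input (a full kilowatt for 1 mW out), while sub-decibel elements keep the requirement to tens of milliwatts; this is why loss per component dominates how far such chips can be scaled and miniaturized.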
 
Her group's lens is made from a configuration of gratings etched in the metasurface, following a wavy pattern of vertical lines that looks a bit like the Cisco company logo. Gu’s group was able to achieve some of the familiar properties of lenses, including converging beams with a measurable focal length (8 micrometers) and object and image distances (44 and 10.1 µm). The group also used the device's lensing properties to perform an optical Fourier transform on a signal, a property of classical, macroscopic lenses.
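As a sanity check, the quoted focal length and object/image distances fit the classical thin-lens relation 1/f = 1/d_o + 1/d_i; the quick calculation below is our own arithmetic, not taken from the paper:

```python
# Thin-lens equation applied to the distances quoted above.
d_obj, d_img = 44.0, 10.1              # object and image distances, micrometers
f = 1.0 / (1.0 / d_obj + 1.0 / d_img)  # focal length implied by the equation
print(round(f, 1))                     # close to the reported 8-micrometer value
```

The result, roughly 8.2 µm, agrees with the reported 8 µm focal length, suggesting the microscale lens behaves much like its classical macroscopic counterpart.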
 
Gu says that next steps for their device include exploring new materials and working toward a platform for on-chip signal processing. “We’re trying to see if we can come up with good designs to do tasks as complicated as what traditional electronic circuits can do,” she says. “These devices have the advantage that they can process signals at the speed of light. It doesn’t need logic signals going back and forth between transistors. … It’s going to be fast.”
 

Apple Increases Production of iPhone 11

Oct 7, 2019
Apple has asked suppliers to increase production of its latest iPhone 11 range by nearly 10%, or 8 million units, the Nikkei Asian Review has learned, as a result of better-than-expected demand worldwide for its new cut-price handset.
 
The growth in orders appears to validate Apple CEO Tim Cook's new strategy of appealing to budget-conscious consumers with cheaper models amid the weakening world economy. The order boost of between 7 million and 8 million units is comparable to total annual phone shipments this year by Google, a rising iPhone rival in Apple's home U.S. market.
 
''This autumn is so far much busier than we expected,'' one source with direct knowledge of the situation said. ''Previously, Apple was quite conservative about placing orders,'' which were less than for last year's new iPhone. ''After the increase, prepared production volume for the iPhone 11 series will be higher compared to last year,'' the source said.
 
Shares of Apple component manufacturers rose in Japan right after publication of the Nikkei report, outperforming the wider market. Japan’s Minebea Mitsumi closed up 3%, troubled iPhone screen maker Japan Display rose by around 2%, and Murata Manufacturing and Alps Alpine also gained.
 
Apple announced its three new iPhone models - the iPhone 11, 11 Pro, and 11 Pro Max - in early September, and for the first time in its history reduced the starting price of the model upgrade, despite better cameras, to $699, compared with $749 for last year's iPhone XR. Apple's new budget-conscious strategy comes as the global smartphone market is expected to shrink for the third year running, according to research company IDC. In January, Cook acknowledged that ''price is a factor'' behind Apple's slowing sales, especially in emerging markets.
 
The latest surge in iPhone orders is centered on the cheapest iPhone 11 model and the iPhone 11 Pro, sources said, while Apple has slightly revised down orders for its top-of-the-range model, the iPhone 11 Pro Max, which has a starting price of $1,099. Cook recently told German newspaper Bild that he could not be happier with the iPhone 11 launch and that its sales had enjoyed a ''very strong start.'' Apple's share price has advanced approximately 40% this year and is now close to its October 2018 record high.
 
Nevertheless, suppliers remain cautious and said they were anxious that the higher level of orders might not be sustained. ''Demand is good for now. But we have to be careful not to be too optimistic,'' one executive-level source told the Nikkei. ''I hope that this year's peak season lasts longer than last year.''
 
One factor that may have temporarily stimulated demand is that Apple's iPhone 11 is still manufactured in China, and Washington has postponed a planned 10% tariff on China-made electronic imports. The delay in the tariff hike, from September to December 15, will help demand during the Thanksgiving and Christmas shopping seasons. Donald Trump has hardened his posture on trade talks with China, saying in September he did not want an interim truce.
 
Nevertheless, the uptick in iPhone orders is a welcome change in recent fortunes for California-based Apple.
 
Just last year, Apple asked key iPhone assemblers Foxconn, which trades as Hon Hai Precision Industry, and Pegatron to call off additional production only weeks after the iPhone XR hit the shelves. Then, in January, Apple made a rare move when it cut its quarterly sales forecast, blaming soft iPhone demand in China as its economy slowed. A sustained rise in demand now would therefore counter the 2018 drop in iPhone sales - the company's first since the iconic handset launched in 2007.
 
''Apple's pricing strategy this year so far turns out to have boosted some initial sales and preorders. ... However, given the weakening world economy and uncertainties ahead, we are concerned whether the good demand will last long,'' said Chiu Shih-Fang, a veteran smartphone analyst at Taiwan Institute of Economic Research.
 
''Even if the second half is definitely better than the first half, we need to monitor if the lower average sale price could have an impact on Apple's total revenue.''
 
Yasuo Nakane, head of global tech research at Mizuho Securities, said he had revised up his 2019 iPhone production estimate to 194 million units from 178 million - although that is still lower than the 208.8 million iPhones sold in 2018. All models in the iPhone 11 range have better cameras than last year's, and carry facial recognition and wireless charging features similar to the 2018 models.
 
Unlike rivals Samsung Electronics, Huawei Technologies, Xiaomi and Oppo, Apple did not introduce 5G compatibility, the next-generation wireless communication standard that allows faster data transfer and lower latency. In the first half of 2019, Apple suffered a nearly 25% slump in iPhone shipments compared with the same period of 2018, according to IDC - far worse than its main rivals, Samsung Electronics and Huawei Technologies. The world's top two smartphone makers saw an almost 2% drop and a nearly 26% surge in shipments, respectively, over the same period.
 

How Much Power Will Quantum Computing Need?

Oct 4, 2019
Google’s Quantum AI Lab has installed the latest generation of what D-Wave Systems describes as the world’s first commercial quantum computers. Quantum computing could potentially solve certain problems much faster than today’s classical computers while using relatively little power to perform the calculations. Yet the energy efficiency of quantum computing remains an open question.
 
At the moment, D-Wave’s machines can scale up the number of quantum bits (qubits) they use without significantly increasing their power requirements. That’s because D-Wave’s quantum computing hardware relies on a specialized design consisting of metal niobium loops that act as superconductors when chilled to a frigid 15 millikelvin (about -273 °C). Much of the D-Wave hardware’s power consumption — a little less than 25 kilowatts for the latest machine — goes toward running the refrigeration unit that keeps the quantum processor cool. The quantum processor itself requires a comparative pittance.
 
“The operation of the quantum processor itself requires remarkably little power—only a tiny fraction of a microwatt—which is essentially negligible in comparison to the power needs of the refrigerator and servers,” says Colin Williams, director of business development & strategic partnerships at D-Wave Systems.
 
The new 1000-qubit D-Wave 2X machine installed at Google’s lab has around double the qubits of its predecessor, the D-Wave Two. But the minimal amount of power used by the quantum processor means that “the total system power will still remain more or less constant for many generations to come” even as the quantum processor scales up to thousands of qubits, Williams says. D-Wave can currently get away with this because the same “cryostat” unit that uses so many kilowatts of power would still be sufficient to cool much larger quantum processors than the ones in use today.
 
''It would be similar if you attach a large cooling device to your PC that uses many kilowatts of power — you would barely see an increase in power consumption when going to larger systems since the power is dominated by the large cooling infrastructure,'' says Matthias Troyer, a computational physicist at ETH Zurich.
 
The ability to scale up a D-Wave machine’s computing capabilities without increasing its power consumption may sound appealing. But it actually doesn’t say much about the power efficiency of quantum computing compared with classical computing. Today’s D-Wave machines perform about as well as a high-end PC on certain specialized tasks, but they use far more power because of their extreme cooling requirements. (High-end computing cores require just tens of watts of power.)
 
“While the ‘flat power requirement’ is a good statement to make for marketing, it is unclear at the moment what the true power needs are once the device is optimized and scaled up,” Troyer says. “Right now they need orders of magnitude more power than competing classical technology.”
 
However, this isn't an entirely fair comparison, Troyer says. “On the power side, they are currently losing,” he says. But the D-Wave machine “is not engineered to be power saving. It may pay off again at some point.”
 
Scott Aaronson, a theoretical computer scientist at MIT and a D-Wave critic, seemed bemused by the idea of D-Wave having a power advantage of any sort. Referring to D-Wave’s reliance on a cryogenic cooler, he wrote in an email: “It’s amusing chutzpah to take such a gigantic difficulty and then present it as a feature.” He noted that D-Wave may require an even more power-hungry cooling system to reach the lower temperatures that would increase its quantum processors’ chances of a “speedup” advantage over classical computing in the future.
 
D-Wave’s quantum annealing machines exemplify just one possible computer architecture for quantum computing. These are generally designed to solve a specialized set of “optimization problems” rather than act as universal logic-gate quantum computers. (The latter would be super-fast versions of today’s classical “gate-model” computers.) Google’s Quantum AI Lab has invested in both D-Wave’s machines and in exploring development of universal logic-gate quantum computers.
 
In the long run, Troyer expects power requirements for quantum computing to be roughly “linearly proportional” to the number of qubits and their couplings, as well as proportional to the number of times operators must run and recool the system before it finds a solution.
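Troyer's expectation can be expressed as a simple cost model: a large fixed refrigeration overhead plus terms linear in qubit count, couplings, and anneal/recool cycles. The sketch below is purely our own illustration with placeholder coefficients, not a model published by Troyer or D-Wave:

```python
def total_power_kw(n_qubits, n_couplings, n_runs,
                   w_per_qubit=0.001, w_per_coupling=0.0005, w_per_run=0.01,
                   refrigeration_kw=25.0):
    """Toy estimate of total system power in kilowatts.

    The per-unit coefficients (in watts) are hypothetical placeholders;
    only the ~25 kW refrigeration figure comes from the article above.
    """
    dynamic_w = (n_qubits * w_per_qubit
                 + n_couplings * w_per_coupling
                 + n_runs * w_per_run)
    return refrigeration_kw + dynamic_w / 1000.0

# Even a 10x jump in qubits barely moves the total while cooling dominates:
print(total_power_kw(1_000, 3_000, 100))    # ~25.0 kW
print(total_power_kw(10_000, 30_000, 100))  # still ~25.0 kW
```

Under these (assumed) coefficients the fixed cryostat term swamps everything else, which is consistent with the "flat power requirement" D-Wave advertises; the linear terms only start to matter once qubit counts, couplings, or run counts grow by orders of magnitude.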
 
Quantum computing’s real strengths most likely won’t begin to emerge until engineers build machines with many thousands or possibly millions of qubits. That’s still a ways off even for D-Wave, which has chosen to scale up the number of qubits in its processors quite quickly. Most quantum computing researchers have opted for a slower approach, building quantum computing devices with just a few qubits or tens of qubits, because of major challenges in correcting for qubit errors and maintaining coherence across the system.
 
However, both D-Wave and independent quantum computing labs share the same general goal of building machines that exploit the “spooky” behavior of quantum physics. Quantum computers could potentially perform many more calculations simultaneously than classical machines. If quantum computers can beat classical computers in terms of “time to solution,” they could also prove more power-efficient at the end of the day.
 
“If a quantum device can solve a problem with much better [time to solution] scaling than classical computing, it would also win on power,'' Troyer says.
 
