IBM just solved this quantum computing problem 120 times faster than previously possible – ZDNet

Using a combination of tweaked algorithms, improved control systems and a new quantum service called Qiskit Runtime, IBM researchers have managed to solve a quantum problem 120 times faster than the previous time they gave it a go.

Back in 2017, Big Blue announced that, equipped with a seven-qubit quantum processor, its researchers had successfully simulated the behavior of a small molecule called lithium hydride (LiH). At the time, the operation took 45 days. Now, four years later, the IBM Quantum team has announced that the same problem was solved in only nine hours.

The simulation was run entirely on the cloud, through IBM's Qiskit platform, an open-source library of tools that lets developers around the world create quantum programs and run them on prototype quantum devices that IBM makes available over the cloud.

The observed speed-up was largely made possible by a new quantum service, Qiskit Runtime, which was key to reducing latencies during the simulation.

IBM teased Qiskit Runtime earlier this year as part of the company's software roadmap for quantum computing, and at the time estimated that the new service would lead to a 100-fold speed-up in workloads. With a reported 120-fold speed-up, therefore, it seems that Big Blue has exceeded its own objectives.

Classical computing remains a fundamental part of Qiskit, and of any quantum operation carried out over the cloud. A quantum program can effectively be broken down into two parts: using classical hardware, like a laptop, developers send queries over the cloud to the quantum hardware, in this case IBM's quantum computation center in Poughkeepsie, New York.

"The quantum method isn't just a quantum circuit that you execute," Blake Johnson, quantum platform lead at IBM Quantum, tells ZDNet. "There is an interaction between a classical computing resource that makes queries to the quantum hardware, then interprets those results to make new queries. That conversation is not a one-off thing it's happening over and over again, and you need it to be fast."

With every request that is sent, a few tens of thousands of quantum circuits are executed. To simulate the small LiH molecule, for example, 4.1 billion circuits were executed, which corresponds to millions of queries going back and forth between the classical resource and the quantum one.

When this conversation happens in the cloud, over an internet connection, between a user's laptop and IBM's US-based quantum processors, latency can quickly become a significant hurdle.

Case in point: while solving a problem as complex as molecular simulation in 45 days is a start, it isn't enough to achieve the quantum strides that scientists are getting excited about.

"We currently have a system that isn't architected intrinsically around the fact that real workloads have these quantum-classical loops," says Johnson.

Based on this observation, IBM's quantum team set out to build Qiskit Runtime: a system built to natively accelerate the execution of a quantum program by removing some of the friction associated with the ongoing back-and-forth between the quantum and the classical world.

Qiskit Runtime creates a containerized execution environment located beside the quantum hardware. Rather than sending many queries from their device to the cloud-based quantum computer, developers can therefore send entire programs to the Runtime environment, where the IBM hybrid cloud uploads and executes the work for them.

In other words, the loops that happen between the classical and the quantum environment are contained within Runtime, which itself sits next to the quantum processor. This effectively slashes the latencies that emerge from communicating between a user's computer and the quantum processor.

"The classical part, which generates queries to the quantum hardware, can now be run in a container platform that is co-located with the quantum hardware," explains Johnson. "The program executing there can ask a question to the quantum hardware and get a response back very quickly. It is a very low-cost interaction, so those loops are now suddenly much faster."

Improving the accuracy and scale of quantum calculations is no easy task.

Until now, explains Johnson, much of the research effort has focused on improving the quality of the quantum circuit. In practice, this has meant developing software that helps correct errors and add fault tolerance to the quantum hardware.

Qiskit Runtime, in this sense, marks a change in thinking: instead of working on the quality of quantum hardware, says Johnson, the system increases the overall program's capacity.

It remains true that the 120-fold speed-up would not have been possible without additional tweaks to hardware performance.

Algorithmic improvements, for example, reduced the number of iterations of the model required to reach a final answer by a factor of two to 10, while better processor performance meant that each iteration of the algorithm required fewer circuit runs.

At the same time, upgrades to the system software and control systems reduced the amount of time per circuit execution for each iteration.
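
Those independent gains compound multiplicatively. The factors below are invented purely for illustration (IBM has not published this exact breakdown), but they show how modest individual improvements reach the reported order of magnitude:

```python
# Hypothetical factors, for illustration only.
fewer_iterations = 5    # algorithm needs 1/5 as many model iterations
fewer_circuits   = 4    # each iteration needs 1/4 as many circuit runs
faster_circuits  = 6    # runtime/control improvements cut time per circuit 6x

speedup = fewer_iterations * fewer_circuits * faster_circuits
print(speedup)                  # 120
print(45 * 24 / speedup, "h")   # 45 days -> 9.0 hours
```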

"The quality is a critical ingredient that also makes the whole system run faster," says Johnson. "It is the harmonious improvement of quality and capacity working together that makes the system faster."

Now that the speed-up has been demonstrated in simulating the LiH molecule, Johnson is hoping to see developers use the improved technology to experiment with quantum applications in a variety of different fields beyond chemistry.

In another demonstration, for example, IBM's quantum team used Qiskit Runtime to run a machine-learning program for a classification task. The new system was able to execute the workload and find the optimal model to label a set of data in a timescale that Johnson described as "meaningful".

Qiskit Runtime will initially be released in beta, for a select number of users from IBM's Q Network, and will come with a fixed set-up of programs that are configurable. IBM expects that the system will be available to every user of the company's quantum services in the third quarter of 2021.

Big Blue hopes that the speed-up enabled by Runtime, combined with the 127-qubit quantum processor called the IBM Quantum Eagle that is slated for later this year, will mean that a lot of tasks once thought impractical on quantum computers will now be achievable.

The system certainly sets IBM on track to meet the objectives laid out in the company's quantum software roadmap, which projects that there will be frictionless quantum computing in a number of applications by 2025.

Quantum computing's imminent arrival in Cleveland could be a back-to-the-future moment: Thomas Bier – cleveland.com

CLEVELAND -- The Cleveland Clinic's partnership with IBM to use quantum computing for medical research brings to mind the most unfortunate instance of bad timing in the history of Cleveland: the 1967 merger of Case Institute of Technology with Western Reserve University, just when the computer age was coming to life.

The merger squelched Case's opportunity to be among the leaders in the most revolutionary technology ever (and to benefit Cleveland with computer-related jobs). Might the arrival of quantum computing mean fresh opportunity?

At the time of the merger, Case's Department of Computer Engineering and Science had a good chance to be at the forefront. But capitalizing on that required support from senior administrators of the new Case Western Reserve University, administrators who could not be focused on technology to the degree that Case, on its own, had been. In the new world of CWRU, technology was one of many fields.

A vision for the merged institutions prepared by a prominent commission gave "only a brief mention of computing either as a current or potential strength of the new institution or as a challenge or opportunity to be addressed," according to Richard E. Baznik in "Beyond the Fence: A Social History of Case Western Reserve University." The goose with golden innards wasn't even recognized, let alone encouraged to lay eggs.

Further, the merger created the worst possible institutional environment for computer advocates. Not only did administrators have to contend with issues of who might lose their job because of consolidation and who would have which power (particularly over budget), they also had to manage the challenge that all universities were facing as the post-World War II surge in enrollment and federal funding was ebbing.

Inescapably, the units that formed CWRU were locked in competition for shrinking resources, if not survival. And in that mix, dominated by heavyweights such as the School of Medicine and the main sciences, computing was a flyweight.

All of that was topped off by intense feelings among Case people of being severely violated by the Institute's loss of independence, feelings heightened by the substantial upgrading that had occurred under the longtime leadership of former Case president T. Keith Glennan (president from 1947 to 1966).

The combination of those potent forces upset CWRU's institutional stability, which was not fully reestablished until the presidency of Barbara Snyder 40 years later.

Although in 1971, CWRU's computer engineering program would be the first of its type to be accredited in the nation, momentum sagged and the opportunity to be among the vanguard was lost. Today, the university's programs in computer engineering and science are well-regarded but not top-tier.

But the arrival of quantum computing poses the challenge of identifying new opportunity and exploiting it.

Quantum computing, as IBM puts it, is tomorrow's computing today. Its enormous processing power enables multiple computations to be performed simultaneously with unprecedented speed. And the Clinic's installation will be the first private-sector, on-premises system in the United States.

Clinic CEO and President Dr. Tomislav Mihaljevic said, "These new computing technologies can help revolutionize discovery in the life sciences and help transform medicine, while training the workforce of the future and potentially growing our economy."

In terms of jobs, the economy of Northeast Ohio has been tepid for decades, reflecting, in part, its scant role in computer innovation. While our job growth has been nil, computer hot spots such as Seattle and Austin have been gaining an average of 25,000 jobs annually.

Cleveland cannot become a Seattle or an Austin. Various factors dictate that. But, hopefully, the arrival of quantum computing a short distance down Euclid Avenue from CWRU will trigger creative, promising initiatives. Maybe, as young technologists and researchers become involved in the Clinic-IBM venture, an innovative entrepreneur will emerge and lead the growth of a whole new industry. Maybe the timing couldn't be better.

Quantum computing: bring it on!

Thomas Bier is an associate of the university at Cleveland State University where, until he retired in 2003, he was director of the Housing Policy Research Program in the Maxine Goodman Levin College of Urban Affairs. Bier received his master of science degree in 1963 and his Ph.D. in 1968, both from Case/CWRU. Both degrees are in organizational behavior.

Protecting Powerlines And Pipelines: The Quantum Solution – Forbes

America dodged a major cyber bullet this past weekend, although the end result of the ransomware attack on the Colonial Pipeline has been disruptive enough, producing economic shock across the country and gas lines in the Northeast.

Still, if the still-unidentified hackers had wanted to break into the technology operating the pipeline, instead of looking for easy blackmail money, the attack could have been catastrophic, with effects lasting for months, even years. Instead, operators shut down the pipeline themselves to prevent such an occurrence: a clear admission of how vulnerable our energy grid is, just like our power grid, even after a decade or more of warnings.

Taken together with the weather-related Texas power outage I wrote about in this space more than a month ago, the pipeline attack is a clear and present warning, with trillions of dollars in losses at stake. Unless we get serious about protecting our power and energy infrastructure, attacks like this weekend's will become more disastrous and more disruptive, until we face the worst of all: a future quantum computer attack that breaks the back of the entire United States economy.

[Photo: A Colonial Pipeline Co. sign at the Pelham junction and tank farm in Pelham, Alabama, Sept. 19, 2016. Photographer: Luke Sharrett/Bloomberg]

The government says it is really committed to action this time. But we've been here before. In 2007 we had the sweeping cyberattack on the U.S. government, including the Defense Department, an attack so comprehensive that I and others dubbed it a Cyber Pearl Harbor. More recently we had the hacking raid on OPM in 2015, affecting the records of at least 20 million federal employees. That was followed by the revelations about the SolarWinds hacks last year.

Yet here we are, still vulnerable, still exposed. It's as if, after the bombs were dropped on Pearl Harbor that Sunday morning in December 1941, Americans had read about the crippling of the U.S. fleet, then rolled over and gone back to sleep.

Sleeping through cyber disasters is no longer an option. Fortunately, the emerging technologies of the quantum revolution offer solutions, both long-term and short-term, to our worst infrastructure threats, including a future quantum computer attack itself.

The first solution is software algorithms that are specifically designed to protect against future quantum computer assault. Under the rubric of post-quantum cryptography (PQC), these algorithms are also insurance against conventional cyberattacks. At the National Institute of Standards and Technology (NIST), scientists and engineers are diligently preparing national standards for PQC, which are slated to be finished by 2024 and will then be ready to deploy to protect all public encryption. These algorithms will be eagerly awaited since they will provide protection against classical hackers as well; indeed, companies like Canada's ISARA Corp. have been deploying PQC algorithms already.

The second quantum solution is even closer at hand. It uses the same scientific phenomenon that makes quantum computing possible, the entanglement of subatomic particles, to provide hack-proof keys for communication between end-to-end users.

Some of these quantum-based cryptographic systems use quantum random number generators to produce quantum encryption keys. Another company, Qubitekk, produces entangled photons to generate identical symmetric keys at both ends of the communication link. In either case, any unauthorized intrusion into the communication immediately severs the link, and everyone knows instantly there's been an attack.
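
A toy model in Python may help fix the idea. It is not any vendor's product, and real QKD detects eavesdropping through quantum statistics rather than a simple bit comparison; the helper names and the roughly 25 percent disturbance rate below are assumptions made for the sketch.

```python
import secrets

def entangled_outcomes(n):
    """Toy stand-in for measuring n entangled pairs: both ends obtain the
    same random bit, which is what makes a shared symmetric key possible."""
    bits = [secrets.randbits(1) for _ in range(n)]
    return list(bits), list(bits)          # Alice's copy, Bob's copy

def eavesdrop(bits):
    """An intercept-and-resend attacker disturbs roughly 1 in 4 outcomes."""
    return [b ^ (secrets.randbits(2) == 0) for b in bits]

alice, bob = entangled_outcomes(256)
bob = eavesdrop(bob)                       # remove this line for a clean link

# Both sides publicly compare a sacrificial sample of their bits;
# any mismatch means the link was tampered with and the key is discarded.
sample = range(0, 256, 8)
compromised = any(alice[i] != bob[i] for i in sample)
print("intrusion detected" if compromised else "sample matches; key is usable")
```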

These quantum-based solutions are especially suited for the Supervisory Control and Data Acquisition (SCADA) systems that control and monitor field devices from a central command center. Utilities and infrastructure companies have used SCADA for years to administer power stations and pipelines. These rely heavily on point-to-point communications for their operations, while communication protocols continually transfer data from sensors to SCADA servers, and back to the sensors. Quantum-based cryptography can offer tamper-proof protections for these protocols. Scientists at both Oak Ridge and Los Alamos national laboratories have been working on quantum key distribution (QKD) capabilities to secure the energy sector. A range of American and European companies have successfully deployed similar quantum key networks for their clients.

Taken together, then, quantum solutions can secure systems now and in the future against quantum computer attacks. It simply doesn't make sense to spend billions on classical cyber protections that will be obsolete in three to four years, as hackers inevitably find their way around those safeguards, instead of investing in quantum-based hack-proof protections that will last for decades.

Thanks to our ongoing cyber vulnerabilities, America has become like a bank vault with the door wide open. We're simply inviting attackers, and when a truly determined predator like Russia or China steps in, it could mean ruin for the U.S. economy, not just for a few months or a year, but for good. Quantum technology may not offer all the answers, but it may be the ultimate firewall we've all been waiting for, and one that state and non-state hackers have been hoping we wouldn't discover.

Precision Is Natures Gift to Technology – The Wall Street Journal

Nobel Prize-winning physicist Frank Wilczek explores the secrets of the cosmos.

Precision is a powerful tool, but it can be hard to come by. That theme, with variations, is a leitmotif of science, organic life and modern technology. It is sounding again today, at the frontier of quantum computing.

Consider biology. Complex organisms store their essential operating systems, instructions for how to build cells and keep them going, within long DNA molecules. Those basic programs must be read out and translated into chemical events. Errors in translation can be catastrophic, resulting in defective, dysfunctional proteins or even in cancers. So biology has evolved an elaborate machinery of repair and proofreading to keep error rates low, around one per billion operations. A series of complicated molecular machines examine the progress and correct mistakes, in a process aptly called proofreading. The creation of this machinery is one of evolution's greatest achievements.

Many applications of computers also need precision. (For instance, in bank transactions it's important to get passwords and transfers exactly right!) Modern computer technology came into its own when small, reliable solid-state transistors became available. Here, the basic distinction between 0 and 1 gets encoded in two alternative locations for buckets of electrons. When there are many electrons per bucket, errors in the position of one or a few don't spoil the message.

But in doing computations the computer must move the buckets around. Making those buckets of electrons smaller makes the job of moving them around easier. Indeed, the computer industry's spectacular record of ever-faster speed is largely the story of lowering the number of electrons used to make a bit; nowadays we're approaching ten or fewer. Unfortunately, at this frontier the near error-immunity that stems from having many redundant electrons is less automatic. To maintain nearly error-free, precise operation, new tricks will be necessary.
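
The protective power of redundancy is easy to see in a toy majority-vote model. This is a simplification of the physics, assuming each electron independently lands in the wrong place with 5 percent probability (a number chosen only for illustration):

```python
import random

def read_error_rate(n_electrons, p_stray=0.05, trials=100_000):
    """Fraction of reads where a majority of electrons are misplaced,
    i.e. where the stored bit would be read incorrectly. Toy model only."""
    errors = 0
    for _ in range(trials):
        strays = sum(random.random() < p_stray for _ in range(n_electrons))
        errors += strays > n_electrons // 2
    return errors / trials

for n in (1, 9, 51):
    print(f"{n:2d} electrons per bit -> read error rate ~ {read_error_rate(n):.5f}")
# 1 electron: ~5% of reads fail; 51 electrons: errors essentially vanish.
```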

Quantum Blockchain Technologies could help transform the digital economy as we know it – Proactive Investors UK

Quantum Blockchain Technologies recently changed its name from Clear Leisure, which was more than a cosmetic alteration. "Our investment programme is focused on selecting the most innovative and out-of-the-box start-ups in the blockchain and cryptocurrencies sector, with whom we will work alongside to develop exciting synergies," it said.

Below is an abridged transcript from a recent conversation with chairman Francesco Gardin, which provides a flavour of what is planned. After that is a brief explainer on quantum computing.

Let's focus on quantum [computing] first. There are few things that have changed the course of mankind. Recently we witnessed the digital revolution; in the early 60s, computers were basically invented thanks to the transistor. And we are now very close to a similar revolution. [Quantum computers] could do things which are orders of magnitude superior to our digital computers. So, when you have this unlimited amount of computing power, you have no more boundaries to what you can do.

We are building a new team. If you want to use an analogy, then Formula 1 is a good one. We are trying to build the number-one car. So, we need a workshop, excellent engineers, and excellent drivers. And that is exactly what we're doing. We're setting up a workshop. Of course, it will not be a physical one, but a very well-protected data centre. We are setting up a team of experts: former students from UCL in London and physicists from Milan University. So, we are putting together an excellent team of experts to work on our R&D. We are already working in the direction of using quantum computers and deep learning to explore mega-terabytes of data related to, for example, cryptocurrencies, and designing new ASIC chips. So, I mean, the amount of R&D that we're going to pour into this company is massive.

Our strategy is one where we will deliver intermediate results that are very attractive not only for our own use but might also be useful for other companies too. So, some of our research will be medium- and long-term. Other parts of our research will be short-term that can be exploited with the right partner.

The mechanical and electrical interaction of a traditional computer can be distilled down to an on-off switch, or the ones and zeros that make up the binary code that powers the digisphere. These are called bits. Quantum computers use quantum bits, or qubits, and tap into the unique ability of subatomic particles to exist in more than one state at the same time. Insert exploding head emoji here. Long story short, using superposition (the aforementioned ability to exist in multiple states) and a process called entanglement (really, don't ask; there's a link here), quantum computers can handle exponentially more data than the current supercomputers.
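
A minimal sketch in Python with Qiskit (illustrative only, run on a simulator) shows both ideas at once: a Hadamard gate puts one qubit into superposition, a controlled-NOT entangles it with a second, and measurement then returns only the correlated results 00 and 11, roughly half each.

```python
from qiskit import QuantumCircuit, Aer, execute

bell = QuantumCircuit(2, 2)
bell.h(0)                      # superposition: qubit 0 is 0 and 1 at once
bell.cx(0, 1)                  # entanglement: qubit 1 now mirrors qubit 0
bell.measure([0, 1], [0, 1])

backend = Aer.get_backend("qasm_simulator")
counts = execute(bell, backend, shots=1000).result().get_counts()
print(counts)                  # e.g. {'00': 513, '11': 487}; never '01' or '10'
```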

Quantum computers are exceedingly difficult to engineer, build and programme, an article in Scientific American says.

As a result, they are crippled by errors in the form of noise, faults and loss of quantum coherence, which is crucial to their operation and yet falls apart before any nontrivial program has a chance to run to completion.

Quantum supremacy is the point at which the quantum computer outperforms a traditional supercomputer.

Google in 2019 claimed it had passed the supremacy milestone, one identified as early as the 1980s. "This is a wonderful achievement. The engineering here is just phenomenal," Peter Knight, a physicist at Imperial College London, told New Scientist magazine. "It shows that quantum computing is really hard but not impossible. It is a stepping-stone toward a big dream."

Aehr Test Systems Appoints Technology Industry Veteran Fariba Danesh to its Board of Directors – GlobeNewswire

FREMONT, Calif., May 14, 2021 (GLOBE NEWSWIRE) -- Aehr Test Systems (NASDAQ: AEHR), a worldwide supplier of semiconductor test and reliability qualification equipment, today announced it has appointed Fariba Danesh to its board of directors, effective May 10, 2021.

Ms. Danesh is a technology industry veteran, with 30 years of executive-level technology and operating leadership in multiple enterprise and consumer hardware markets, with special emphasis on semiconductor, photonics, telecommunications, and data storage. She is currently COO at PsiQuantum, a quantum computing startup based in Palo Alto, CA that is using silicon photonics to build the world's first useful quantum computer, applying existing semiconductor and photonics manufacturing processes.

Gayn Erickson, President and CEO of Aehr Test Systems, commented, "Fariba brings incredible knowledge, experience, and contacts in the compound semiconductor and optical semiconductor spaces. She is intimately aware of the challenges and critical requirements for stabilization and burn-in of these optical semiconductors and the unique value that Aehr's wafer-level, singulated die and module test solutions bring to reliability testing. We are excited to have her join our Board."

Ms. Danesh said, "I am very excited to be joining the Aehr Test Systems Board. With the unique capabilities of its test and burn-in solutions, particularly for the silicon carbide and silicon photonic markets, Aehr is well positioned to address the significant market opportunities ahead."

Prior to joining PsiQuantum in January 2021, Ms. Danesh served for nine years as CEO of Glo AB, a venture-funded photonics/compound semiconductor company that designs and develops semiconductor light-emitting diodes at levels of brightness suitable for general illumination applications. Prior to that, she was SVP, General Manager Fiber Optics Products Division of Avago Technologies (now Broadcom) for three years, where she had complete P&L responsibility for a $400 million annual revenue photonics business. Previous to that she served in senior executive positions at several leading technology companies, including EVP of Global Operations for Maxtor, a $3 billion annual revenue data storage company, COO of Finisar Corporation, one of the top three fiber optic communication product companies in the world, and CEO/COO of Genoa Corporation, a III-V semiconductor optical amplifier company.

With the appointment of Ms. Danesh, Aehr Test now has seven board members.

About Aehr Test Systems

Headquartered in Fremont, California, Aehr Test Systems is a worldwide provider of test systems for burning-in and testing logic, optical and memory integrated circuits and has installed over 2,500 systems worldwide. Increased quality and reliability needs of the Automotive and Mobility integrated circuit markets are driving additional test requirements, incremental capacity needs, and new opportunities for Aehr Test products in package, wafer level, and singulated die/module level test. Aehr Test has developed and introduced several innovative products, including the ABTS™ and FOX-P™ families of test and burn-in systems and FOX WaferPak™ Aligner, FOX-XP WaferPak Contactor, FOX DiePak Carrier and FOX DiePak Loader. The ABTS system is used in production and qualification testing of packaged parts for both lower power and higher power logic devices as well as all common types of memory devices. The FOX-XP and FOX-NP systems are full wafer contact and singulated die/module test and burn-in systems used for burn-in and functional test of complex devices, such as leading-edge memories, digital signal processors, microprocessors, microcontrollers, systems-on-a-chip, and integrated optical devices. The FOX-CP system is a new low-cost single-wafer compact test and reliability verification solution for logic, memory and photonic devices and the newest addition to the FOX-P product family. The WaferPak contactor contains a unique full wafer probe card capable of testing wafers up to 300mm that enables IC manufacturers to perform test and burn-in of full wafers on Aehr Test FOX systems. The DiePak Carrier is a reusable, temporary package that enables IC manufacturers to perform cost-effective final test and burn-in of both bare die and modules. For more information, please visit Aehr Test Systems website at http://www.aehr.com.

‘I would never go back’ – How the cloud changed the way one company does business – WRAL.com

By Abbey Slattery, WRAL Digital Solutions

This article was written for our sponsor, Cii Technology.

When a storm hits North Carolina, Jason Cutler no longer worries his company's servers are going to fail.

"To be honest with you, I would never go back, even if that was an option," said Cutler, the information technology manager and a sales manager at Utility Service Agency.

He is talking about the change Utility Service Agency made six years ago, moving from using onsite servers to using the cloud.

"We made the move because onsite servers were getting very expensive to maintain," said Cutler.

Now, Utility Service Agency doesn't spend time maintaining servers as it works in the electric utility industry to make electric power transmission and distribution systems safer, more reliable, and more efficient.

"There are a lot of unknowns when you have a power outage or a storm and how that impacts a server on the premises," said Cutler. "When we moved to the cloud, that alleviated a bunch of headaches."

Using the cloud means Utility Service Agency's information is stored in an offsite data center that has multiple safety precautions.

"The best home for a piece of electronic equipment is a data center," said Mike Taylor, vice president of business development for Cii Technology Solutions. "A data center has physical security, has redundant air conditioning systems, has redundant electrical, it has redundant internet connections."

The necessity of those precautions became clear during an ice storm that caused a power outage several years ago. Because Utility Service Agency was on the cloud, it was able to continue serving customers.

"We were up and running despite a major storm," said Cutler.

The same goes for summertime: when air conditioners put large loads on conductors, equipment is bound to fail, and being on the cloud keeps the company running.

"If we lose power, everyone in my company is down," said Cutler. "And the way we sit right now, the way we're structured, we have a variety of tools in place that will help us be up and running. And this is where the cloud environment really has brought us to a more current state of having technology to support it and allows minimal interruption to our customers."

Not only can Utility Service Agency access its data anytime, but employees also don't have to be in the office to work.

"As long as you have internet and power, you can take your laptop and work virtually anywhere," said Cutler. "I worked out of my truck a couple times, where I needed to take care of some emergencies, and I tethered to my phone. That's another option that was available to us by using this cloud environment."

Cutler also does tasks on the go because he can access accounts and information through the cloud. This was particularly helpful during the COVID-19 pandemic.

"Mobility is really the key here," said Cutler. "That really can make or break the difference of you being successful at taking care of your customer."

Utility Service Agency's cloud system is easy to manage, quick, and allows for more employees to connect to the company.

"We have a nice infrastructure that was put together to support our growing company where it would not impact anybody from connecting at any point in time, 24/7," said Cutler.

Even the migration was painless, as Cii Technology Solutions managed everything.

"I leaned on them to think about different scenarios, how much space we'd need, how much memory we'd need, all the hardware that was required to allow everybody to log in at the same time and not max out the processors or the memory," said Cutler. "Obviously, it needs to run efficiently."

Cii Technology Solutions then created a customized solution.

"If you build a system without knowing what a company's workflow is, it's not going to be efficient for them to use, so that is absolutely paramount," said Taylor.

If your company is considering migrating to the cloud, Cutler has advice for you.

"Do it yesterday," he said. "It has been such a good move for us."

This article was written for our sponsor, Cii Technology.

Foxconn Plans to Make 'Digital Infrastructure Hardware' in Wisconsin: What's That? – PBS Wisconsin

Foxconn struck a new economic development deal with Wisconsin, but what exactly the Taiwan-based manufacturer plans to produce in the state remains a mystery.

In April, the Wisconsin Economic Development Corporation signed off on a majorly scaled-down tax incentive package for Foxconn. The amended contract comes less than three years after the company broke ground on a much-hyped campus in Racine County where it originally promised to build large LCD screens.

Those plans did not come to fruition. At the same time, Foxconn officials have become more circumspect about their plans for Wisconsin. In a statement lauding its revised contract with the state, the company stated the terms were "based on Foxconn's current projections for digital infrastructure hardware products through 2025." It didn't elaborate on what exactly those products might be.

What are "digital infrastructure hardware products," and how could they fit into the global electronics manufacturer's Wisconsin operations?

The latter question remains unanswered for the time being. Foxconn officials did not respond to multiple queries about its intentions, and a spokesperson for the Wisconsin Economic Development Corporation referred questions about any plans to the company.

However, on a broad basis, the term digital infrastructure hardware is used in specific ways in engineering and economic development realms.

Parmesh Ramanathan is a professor of electrical and computer engineering at the University of Wisconsin-Madison whose research centers on improving wireless networks and electric grids through new technologies. He said digital infrastructure hardware can include any number of products that support contemporary digital needs, which increasingly depend on internet-connected devices relaying information back and forth via vast off-site collections of servers called data centers.

Data centers form the backbone of what is commonly known as "the cloud." Cloud computing is a major component of 21st century digital infrastructure, along with wireless networks and "smart" devices connected to the internet, from phones to security cameras to fridges.

"When public policy people talk about investments in digital infrastructure they're usually talking about improving network connectivity or broadband connectivity," Ramanathan said.

These investments require physical hardware, ranging from the sensors installed on cell towers to the servers at the heart of the mammoth cloud-computing data centers. Indeed, previous reporting suggests Foxconn may be looking to assemble servers for cloud computing in Mount Pleasant.

A November 2020 Bloomberg report, citing unnamed "people familiar with the matter," revealed that Foxconn had landed a deal to assemble server components for Google to use in its cloud-computing business. According to the report, assembly of the server components was set to begin at Foxconn's Mount Pleasant campus during the first quarter of 2021. However, neither Foxconn nor Alphabet, the parent company of Google, has publicly confirmed such a deal, and it remains unclear well into the second quarter of 2021 if this type of work is happening in Wisconsin.

In an April 30 interview on Here & Now, Wisconsin Economic Development Corporation Secretary and CEO Missy Hughes did not mention any specific contract between Foxconn and another company, but she did describe witnessing operations in Mount Pleasant consistent with server assembly.

"What they're doing right now, what I've been able to see firsthand, is building high-tech data servers, and they have put in assembly lines to do that," said Hughes. "They are building those for a number of different companies. And they will continue to be able to be flexible and build other types of products for other companies and really use their expertise."

Ramanathan described the types of servers Google uses in its cloud-computing business as "high-end" pieces of digital infrastructure hardware, with the cost for an individual server running tens of thousands of dollars.

Tom Still, president of the Wisconsin Technology Council, said he has been unable to verify whether Foxconn has landed a deal to assemble Google servers in Wisconsin, but that doesn't mean such a deal doesn't exist.

"It's such a competitive business that I can imagine either side in those kinds of relationships would not want to talk openly about it," Still said.

He added that server assembly, whether for Google or another customer, would certainly qualify as digital infrastructure hardware production.

Further, Still described digital infrastructure hardware as a broad and evolving term of art.

"It can touch everything from servers to motherboards to data center infrastructure, outward communications tools, cybersecurity and environmental mitigation," Still said, noting a growing market for products that can reduce the environmental impact of data centers in particular. Server farms use huge amounts of electricity and water.

With cloud computing becoming increasingly central to digital life, Still said the need for new data centers is expected to be strong well into the future. That means the market for high-end servers will likely remain strong too.

"There is a huge appetite for data centers here and elsewhere," Still said, adding that he believes Wisconsin is an attractive place to site them.

"We don't have earthquakes or hurricanes that take those things down," he said. "We've got more reliable power and I think we've got the technology infrastructure and the workforce in Wisconsin to help build them."

Still and Ramanathan said the market for digital infrastructure hardware is likely to expand for years as more people gain access to the web through internet-connected devices. Investors agree, with a North Carolina-based capital management group forecasting that rising demand for "digital connectivity" will feed demand for data centers, cell towers and other digital infrastructure hardware.

Whether or not Wisconsin attracts new data centers, Still described a newer trend among American technology companies toward "onshoring" or "reshoring" parts of their supply chains to the United States. The COVID-19 pandemic has only accelerated the trend, he said. Foxconn's Mount Pleasant facility could be well-positioned to reap the benefits of this strategic shift, Still said, with the assembly of server components being a part of the digital infrastructure supply chain ripest for reshoring.

Foxconn's deep business connections in East Asia, where much of the world's digital infrastructure hardware is currently manufactured, could also help it source components it would need to assemble servers in Wisconsin, starting with computer chips. The vast majority of silicon-based computer chips are manufactured in China. However, both Still and Ramanathan considered it unlikely that the manufacturing of chips from raw materials would happen in Wisconsin, or the U.S. in general, on a large scale any time soon.

"That would be a huge investment," Ramanathan said. "Hundreds of billions probably."

CSOP to Bring CSOP Global Cloud Computing Technology Index ETF (stock ticker: 3194.HK) on the HKEX – Business Wire

HONG KONG--(BUSINESS WIRE)--CSOP Asset Management Limited (CSOP) is proud to announce the listing of CSOP Global Cloud Computing Technology Index ETF (stock ticker: 3194.HK) on the Hong Kong Stock Exchange (the HKEX). 3194.HK will track the performance of the Solactive Global Cloud Computing Technology Index (the index). With listing price at around HKD 15.53 per unit, trading lot of 100 shares and annual management fee of 0.99%, CSOP Global Cloud Computing Technology Index ETF will start to trade on 13 May, 2021. Upon inception, 3194.HK has received around USD 9.37 million initial investment.

Cloud computing is the modern practice of using remote servers and shared resources instead of a local network for storing, managing, processing and delivering data. Such technology enables access to information from anywhere with much less upfront investment. Cloud computing has further helped to lower costs, improving the reliability and security of business IT operations. Since the onset of Covid-19, telecommuting has accelerated the pace of business digitalization, further bolstering the rise in adoption of cloud. Worldwide spending on public cloud services is expected to reach USD 332.3 billion in 2021, an increase of 23.1% from 2020.[1] Cloud computing has also become the critical technology sector amid the competition of the world's tech giants and major economies. Amazon, Microsoft and Google combined accounted for 68.2% of the global cloud computing market in 2019.[2] The U.S. is the most significant public cloud market with a projected spending of USD 124.6 billion in 2019.[3] With the government's support in new infrastructure development, China's public cloud market also keeps evolving with high prospects, booming from RMB 96.3 billion in 2018 to RMB 375.4 billion in 2023, with an estimated CAGR of 31%.[4]
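
That CAGR figure checks out: the growth from RMB 96.3 billion in 2018 to RMB 375.4 billion in 2023 spans five years, and a quick calculation gives roughly 31 percent.

```python
# Sanity check on the quoted CAGR for China's public cloud market.
start, end, years = 96.3, 375.4, 2023 - 2018
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")   # 31.3%
```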

Cloud computing is one of the most in-demand and fastest-growing tech sectors, boosted by global digitalization, especially during the pandemic. Comprehensively investing in the 50 most representative U.S.- and Hong Kong-listed companies with global operations in the field of cloud computing, the Solactive Global Cloud Computing Technology Index returned 54% over the prior 12 months.[5] Weighted by free-float market capitalization of constituents and rebalanced quarterly, the index aims to capture the overall performance of the fast-growing worldwide cloud computing industry in a scientific and timely manner.[6] As of the end of April 2021, the total market cap of the index is USD 8.9 trillion.[7] Innovatively adopting a physical representative sampling strategy to replicate the index, 3194.HK enables Hong Kong investors to grasp the investment opportunities of the cloud computing industry in an easy and transparent way.

The pursuit of smart living has always been one of the development themes of human beings. As the pioneer ETF issuer in Hong Kong, CSOP's ETFs/ETPs not only truly serve the purpose of asset allocation, but also the future prospects of investment. The launch of 3194.HK is one of the suite of CSOP thematic ETFs that offer future-defining investment opportunities.

"CSOP has already launched a series of future thematic ETFs. CSOP Hang Seng Tech Index ETF (3033.HK) and CSOP Yinhua CSI 5G Communications Theme ETF (3193.HK)[8] visualize ways of betterment. Now CSOP Global Cloud Computing Technology Index ETF (3194.HK) further demonstrates a more intelligent concept of lifestyle. More CSOP future-themed ETFs will come to the market, broadening investors' imagination of the future investment of human beings," comments Melody He, Managing Director, Head of Business Development and Product Strategy & Solutions.

About CSOP Asset Management Limited

CSOP Asset Management Limited (CSOP) was founded in 2008 as the first offshore asset manager set up by a regulated asset management company in China. With a dedicated focus on China investing, CSOP manages public and private funds, as well as providing investment advisory services to Asian and global investors. In addition, CSOP is best known as an ETF leader in Asia. As of 31 March 2021, CSOP has more than USD 10 billion in assets under management.

This material has not been reviewed by the Securities and Futures Commission.

Issuer: CSOP Asset Management Limited

Please refer to the offering documents for the index provider disclaimer.

IMPORTANT: Investment involves risks. Investment value may rise or fall. Past performance information presented is not indicative of future performance. Investors should refer to the relevant Prospectus and the Product Key Facts Statement for further details, including product features and risk factors. Investors should not base on this material alone to make investment decisions.

CSOP Global Cloud Computing Technology Index ETF (the Sub-Fund) is a passively managed index tracking ETF authorised under Chapter 8.6 of the Code on Unit Trusts and Mutual Funds. The shares of the Sub-Fund (the Shares) are traded on the Stock Exchange of Hong Kong Limited (the SEHK) like stocks. The Sub-Fund is a physical ETF and invests primarily in US and Hong Kong listed companies that have business operations in the field of cloud computing. The Sub-Fund is denominated in USD.

Please note that the above listed investment risks are not exhaustive and investors should read the Prospectus and the Product Key Facts Statement in detail before making any investment decision.

Disclaimer

CSOP Global Cloud Computing Technology Index ETF is authorized by the Securities and Futures Commission ("SFC") in Hong Kong. Such authorization does not imply any official recommendation by the SFC. This material and the information contained in this material shall not be regarded as an offer or solicitation of business in any jurisdiction to any person to whom it is unlawful to offer or solicit business in such jurisdictions. This material and the information contained in it are for general information only and do not constitute financial, professional, investment or any other kind of advice in any way and shall not be considered as an offer or solicitation to deal in any investment products. If you wish to receive advice on investment, please consult your professional legal, tax and financial advisers.

CSOP Asset Management Limited (CSOP) which prepared this material believes that information in this material is based upon sources that are believed to be accurate, complete, and reliable. However, CSOP does not warrant the accuracy and completeness of the information, and shall not be liable to the recipient or controlling shareholders of the recipient resulting from its use. CSOP is under no obligation to keep the information up-to-date.

This material should not be reproduced or made available to others without the written consent of CSOP.

Please refer to the offering documents for the index provider disclaimer.

This material is prepared by CSOP and has not been reviewed by the Securities and Futures Commission in Hong Kong.

Issuer: CSOP Asset Management Limited

[1] Source: Gartner, April 2021
[2] Source: Gartner, August 2020
[3] Source: IDC Media Center
[4] Source: China Academy of Information and Communications Technology, the White Paper of Cloud Computing, published in July 2020
[5] Source: Bloomberg, as of 30 April, 2021. All information for an index prior to its launch date is back-tested. Back-tested performance reflects hypothetical historical performance. The hypothetical performance figure is for illustrative purpose only. Not indicative of actual future performance, which could differ substantially.
[6] Source: Solactive AG
[7] Source: Solactive AG, Bloomberg
[8] CSOP Yinhua CSI 5G Communications Theme ETF is a feeder fund. Its master fund, Yinhua CSI 5G Communication ETF, is not authorized by the Securities and Futures Commission for direct offering to the public in Hong Kong.

10 Hot Data Center Technologies And Trends To Watch In 2021 – CRN

10 Data Center Market Trends You Should Be Following In 2021

As Dell Technologies CEO Michael Dell puts it, technology prevented a complete societal economic meltdown by enabling the world to work, learn and play from home during the global COVID-19 pandemic. Data centers played one of the most critical roles in stopping the world's economy from potentially collapsing, and they continue to play an ever-increasing role as the new normal takes shape, with remote working and a digital financial economy ahead.

The most important data center trends in 2021 include the world's largest data center infrastructure providers starting to fight back against public cloud providers in a move to win back customer wallet share. On the innovation front, data center server CPU innovation is skyrocketing as competition heats up between Intel and AMD, while Nvidia positions itself to become a unique one-stop-shop data center provider.

Private equity is throwing billions at data center providers or acquiring them completely. All data center signs are pointing toward edge computing as well as more green, sustainable facilities and solutions.

With so much going on in the industry, CRN breaks down the ten biggest data center trends you need to be following in 2021.

Data Center As-A-Service Has Arrived, Targets Public Clouds

Arguably the biggest data center trend in 2021 is the momentum toward the new way customers want to buy and maintain data center solutions, which is in a consumption-based, pay-per-use manner. While the leading public cloud providers have already been successfully offering consumption-based cloud IT for years, the globe's leading data center infrastructure providers like Hewlett Packard Enterprise and Dell Technologies are doubling down on this movement unlike ever before.

Dell Technologies launched Apex last week, the $94 billion company's new as-a-service, consumption-based IT portfolio that includes Apex Data Storage Services and Apex Cloud Services. Dell's top executives say Apex provides customers a better cloud experience around cost, control and simplicity compared to public cloud providers.

"We know that there are so many hidden costs in so many of these other models if you think about the public cloud, and all of these unexpected costs that customers run into -- the ability for our competitive advantage and transparent approach to pricing is something that customers are really responding well to," said Allison Dew, Dell Technologies' global chief marketing officer, during Dell Technologies World this month.

The Takeaway: The world's market leaders in servers, storage and hyperconverged infrastructure are fundamentally changing their sales motion to enable businesses to buy data center products in a new way, while at the same time looking to win back market share from public cloud providers, who gained share in 2020 as many flocked to public clouds during COVID-19.

AWS, Microsoft, Google Spending And Expanding Like Never Before

The three public cloud titans are spending billions each quarter on building and equipping new hyperscale data centers across the world to extend their cloud services' reach. In fact, Amazon, Microsoft and Google now collectively account for more than 50 percent of the world's largest data centers.

The number of large data centers operated by hyperscale providers like AWS, Microsoft and Google increased to nearly 600 by the end of 2020, twice as many as there were in 2015. The COVID-19 pandemic has spurred record-breaking data center spending levels led by AWS, Microsoft and Google, reaching $37 billion in the third quarter of 2020 alone.

In one of the boldest data center plans in history, Microsoft last month unveiled its bullish plan to build 50 to 100 new data centers each year, including a $1 billion investment to build several hyperscale data center regions in Malaysia.

The Takeaway: AWS, Google and Microsoft will continue to spend billions throughout 2021 to increase market reach by expanding their data center footprints into new geographies or regions with high cloud and data services demands.

Data Center Spending Picks Back Up, Will Hit $237 Billion

The global data center systems market will reach $237 billion in 2021, representing an increase of more than 7 percent year over year, according to IT research firm Gartner's most recent IT spending forecast.

In 2020, many enterprises focused on simply keeping the lights on and running operations smoothly due to the COVID-19 pandemic. This led businesses not to focus heavily on data center infrastructure investments and large IT projects. However, the temporary pause button enterprises hit on data center spending will go away by the end of 2021, Gartner said, with data center infrastructure spending expected to grow year over year through 2024.

The Takeaway: With no decrease in demand for data as well as increasing demand for colocation and hybrid cloud opportunities, the data center market is poised for growth over the next several years.

Intel Vs. AMD Driving Data Center CPUs Innovation

Intel, the longtime leader in data center server CPUs, is now facing stiff competition with AMD on a global scale.

AMD recently saw its largest microprocessor share gain yet against Intel in the server market with EPYC processors in the first quarter of 2021, according to the latest x86 CPU market share report from Mercury Research. AMD's server CPU share grew 1.8 points to 8.9 percent, reflecting the chipmaker's report of strong EPYC sales and Intel's decline in Data Center Group revenue.

This heavy competition is driving server CPU performance through the roof, which greatly benefits data center customers around speed, agility and new feature launches. Earlier this year, Intel launched its third-generation Intel Xeon Scalable CPUs, code-named Ice Lake. Likewise, AMD launched its third-generation EPYC Milan processors this year, dubbing them the highest-performance server processors in the industry.

The Takeaway: With server CPU innovation reaching all-time highs due to competition between Intel and AMD, new use cases and services opportunities in the data center are opening up.

Nvidia To Become A Data Center Powerhouse

In April, Nvidia unveiled an Arm-based data center CPU for AI and high-performance computing that it says will provide 10 times faster AI performance than one of AMD's fastest EPYC CPUs, a move that will give Nvidia control over compute, acceleration and networking components in servers. The new data center CPU, named Grace, creates new competition for x86 CPU rivals Intel and AMD when it arrives in early 2023.

Nvidia's data center product road map ahead will consist of GPUs, data processing units (DPUs) and CPUs, with each product line receiving an updated architecture every two years.

Nvidia has increasingly viewed itself as a data center-scale computing company that has led it to pursue optimization of applications at a system level. While the company is seeing high adoption of its GPUs, Nvidia expanded into high-speed networking products last year with its $7 billion acquisition of Mellanox Technologies. Nvidia is also seeking to close on its $40 billion acquisition of Arm, whose CPU designs are being used for Grace. Additionally, Nvidia continues to expand its alliance with data center virtualization and software star VMware.

The Takeaway: Nvidia has the potential to disrupt the entire data center market over the next few years through next-level data center innovation and could force market consolidation.

Increase In Automation And Robotics Deployments

The global pandemic has accelerated the need to make data center operations less reliant on human intervention, aided by the influx of innovation around software automation and artificial intelligence. The data center industry is seeing the benefit of leveraging more intelligent, autonomous systems for simple tasks and distributed environments designed to increase capabilities.

According to AFCOM's recent 2021 State of the Data Center Industry study, more than 40 percent of respondents said they'd be deploying robotics and automation for data center monitoring and maintenance over the next three years. Only 16 percent of respondents said they have already deployed robotics and automation in the data center, while 35 percent expect to deploy these solutions by 2024. Data center automation enables the routine workflows and processes of a data center -- such as scheduling, monitoring, maintenance and application delivery -- to be managed and executed without humans.

The Takeaway: Organizations are investing in automation and robotics in the data center so employees can bring more value elsewhere to the business or for a customer.

Private Equity Pumps More Money Into Data Centers

From private equity firms acquiring data center companies to investing billions to help data center providers scale, there's no doubt that global investment firms will continue to double down on the market in 2021.

Private equity has been fueling data center providers' growth objectives around scale and expanding global reach for years. For example, in 2019, private equity accounted for 80 percent of all data center acquisitions, focused on driving sales growth. Investors understand now more than ever that every IT trend -- from enabling data-driven outcomes to a shift to a digital economy -- requires infrastructure.

The Takeaway: In 2021, global investors perceive data centers as one of the best ways to make money as demand for cloud services and remote working skyrockets.

Enterprises Pick Cloud Infrastructure Services Over Data Center Products

Enterprises spent $130 billion on cloud infrastructure services globally in 2020, blowing by the $90 billion enterprises spent on data center products.

Enterprise spending on cloud infrastructure services reached nearly $130 billion in 2020, representing an increase of 35 percent annually, according to new data from Synergy Research Group. Enterprise spending on owned data center hardware and software fell 6 percent year over year to $89 billion. In 2019, the two market segments of cloud infrastructure services and data center hardware and software were almost equal in size at roughly $95 billion each.
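
Those figures are internally consistent with the roughly $95 billion 2019 baseline quoted above, as a quick check shows:

```python
base_2019 = 95.0                 # both segments, USD billions, per Synergy
print(round(base_2019 * 1.35))   # cloud services 2020: 128, i.e. "nearly $130B"
print(round(base_2019 * 0.94))   # data center hardware/software 2020: 89
```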

While 2020 can be viewed as something of an anomaly due to the COVID-19 pandemic, the trends are absolutely clear, according to John Dinsdale, a chief analyst at Synergy Research Group.

"It took ten years for enterprise spending on cloud services to catch up with enterprise spending on data center hardware and software, then in 2020 the market for cloud services again soared," said Dinsdale. "In the last ten years, CIOs and business managers have increasingly voted with their wallets and decided that cloud services are the way to go. And we expect [that] to continue for many years to come."

The Takeaway: Although everyone agrees hybrid cloud is the future, enterprises will continue to spend on cloud infrastructure services compared to, in most cases, traditional data center solutions in 2021 as COVID-19 forced many to rapidly shift to the public cloud.

Green Data Centers: Sustainability Taking Center Stage

The data center industry is in a unique position to accelerate the adoption of sustainable practices like renewable energy and using lithium-ion batteries to reduce climate change at a global level.

Customers, stakeholders, investors and even world leaders are increasingly demanding accountability on climate impact, creating a compelling business incentive to embrace sustainability. Innovation in energy efficiency, such as with liquid cooling, as well as renewable power is being driven by the industry's largest customers like AWS, Google, Facebook and Microsoft.

In 2021, data center sustainability has become front and center for enterprise customers, which is raising the bar for the entire industry. With the digital economy taking shape, investors are also part of the green energy revolution as they seek to build greener IT portfolios. Data center colocation giants Equinix and Digital Realty have both raised funds using green bonds, with others expected to follow suit.

The Takeaway: Data center operators will invest in buying and creating more sustainable products, from liquid cooling to undersea data centers. It is key to note that there won't be a trade-off in terms of buying the best data center product versus buying a more sustainable solution.

All Eyes On The Edge

With people flocking toward remote working and education, along with an influx of smart technologies inside offices and remote locations, the next data center frontier is at the edge.

Edge-specific data center providers like EdgeConneX are sprouting up, while market-leading infrastructure and power vendors are creating edge-specific products at rapid pace and scale. Companies from Dell and HPE to Vertiv and AWS are investing in new edge capabilities, services and vertical-specific edge reference architectures.

There is significant demand for edge computing in 2021, fueled by various technologies including AI, the Internet of Things (IoT) and the emergence of 5G. Gartner predicts that by 2025, 75 percent of enterprise-generated data will be created and processed outside a centralized data center or cloud. This massive shift puts more emphasis on ensuring a secure, effective and reliable solution inside edge data centers, no matter the size.

The Takeaway: The massive investments from the biggest technology conglomerates in the world show there is a ton of expansion and market opportunities coming to the edge data center market in 2021 and beyond.
