
A plan to sell Perth Glory to the LFE cryptocurrency firm is over, but the deal never added up – ABC News

Updated February 26, 2020 11:59:14

Right from the start, something did not seem right about this.

Was it the shed in rural Wales that was supposedly the office of the London Football Exchange (LFE), or the pantomime LFE "founder" Jim Aylward and his obscure video messages about his group's takeover of Perth Glory?

"We are building a football group that is leading to a tokenised ecosystem, so there is going to be utility for the token," he explained on Twitter after the news broke.

It could have been the push by those involved with LFE urging Glory supporters to buy into this cryptocurrency token, even though the group did not yet own the Perth club.

Whatever it was, even Glory owner Tony Sage seemed to be caught unaware.

At first, he said he was on his way to Europe to finalise a deal to sell 80 per cent of the club to the cryptocurrency group.

He would retain 20 per cent of the Glory and become chairman of the LFE's football group.

That didn't really add up either.

The proposed new owner of the Glory had ambitious goals to buy clubs in France and the English Premier League, but it was unclear how exactly the group would pay for it.

Would it be in cryptocurrency? Would it be cash? Would it be debt?

They were all questions that couldn't be answered probably because there were no answers, as those involved in the proposed sale did not know themselves.

Things turned even more bizarre over the next few days with mixed messages and contradictions adding to the confusion.

The Glory and LFE released a joint statement saying they had reached an agreement for the club to be acquired by LFE.

Sage was quoted in it saying: "The LFE is designed and manned by fans for fans [and it] makes me proud to say I am part of the team and that PGFC is the cornerstone of this wonderful project."

Two days later though, the club released another statement saying no deal had been done, with Sage saying: "Don't believe fake news. Your club has not been sold and I have not even sent anything to Football Federation Australia (FFA) for approval."

He said he was continuing to do due diligence on the organisation of which he was supposed to be chairman, something that raised even more questions among the club's fans.

The supporters knew something was up.

The subsequent investigative work by some of the club's passionate fans and their backgrounding of journalists helped get to the core of the story.

Less than two weeks after news of the proposed deal came out, it was all off.

The FFA stayed quiet for much of the process, but released a statement saying the sale would not go ahead, putting an end to what had become a saga.

Sage's love for the Glory has been well documented.

His passion is admirable and the amount of money he has poured into the loss-making club is eye-watering.

But this process has damaged the Glory and had many within it questioning its future direction.

As an experienced and astute businessman, Sage should have seen the red flags the supporters so clearly identified from the outset.

Topics: a-league, soccer, sport, perth-6000, wa

First posted February 26, 2020 08:09:33


New Intel chip could accelerate the advent of quantum computing – RedShark News

The marathon to achieve the promise of quantum computers has edged a few steps forward as Intel unveils a new chip capable, it believes, of accelerating the process.

Called Horse Ridge, and named after one of the coldest places in Oregon, the system-on-chip can control a total of 128 qubits (quantum bits), which is more than double the number of qubits Intel heralded in its Tangle Lake test chip in early 2018.

While companies like IBM and Microsoft have been leapfrogging each other with systems capable of handling ever greater numbers of qubits, the breakthrough in this case appears to be that a single chip can handle more control tasks, which should lead to more efficient quantum computers. It is therefore a step toward moving quantum computing from the lab into real commercial viability.

Applying quantum computing to practical problems hinges on the ability to scale and control thousands of qubits at the same time with high levels of fidelity. Intel suggests Horse Ridge greatly simplifies the complex electronics currently required to operate a quantum system.

To recap why this is important, let's take it as read that quantum computing has the potential to tackle problems conventional computers can't, by leveraging a phenomenon of quantum physics: qubits can exist in multiple states simultaneously. As a result, they are able to conduct a large number of calculations at the same time.

This can dramatically speed up complex problem-solving from years to a matter of minutes. But in order for these qubits to do their jobs, hundreds of connective wires have to be strung into and out of the cryogenic refrigerator where quantum computing occurs (at temperatures colder than deep space).

The extensive control cabling for each qubit drastically hinders the ability to control the hundreds or thousands of qubits that will be required to demonstrate quantum practicality in the lab, not to mention the millions of qubits that will be required for a commercially viable quantum solution in the real world.

Researchers outlined the capability of Horse Ridge in a paper presented at the 2020 International Solid-State Circuits Conference in San Francisco and co-written by collaborators at Dutch institute QuTech.

The integrated SoC design is described as being implemented using Intel's 22nm FFL (FinFET Low Power) CMOS technology and integrates four radio frequency channels into a single device. Each channel is able to control up to 32 qubits by leveraging frequency multiplexing, a technique that divides the total bandwidth available into a series of non-overlapping frequency bands, each of which is used to carry a separate signal.

With these four channels, Horse Ridge can potentially control up to 128 qubits with a single device, substantially reducing the number of cables and rack instrumentations previously required.
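To make the multiplexing idea concrete, here is a minimal sketch of how one channel's bandwidth can be split into 32 non-overlapping bands, one per qubit. It is an illustration only, not Intel's implementation; the 2 GHz bandwidth figure and the equal band split are assumptions.

```python
# Illustrative sketch (not Intel's implementation): frequency-division
# multiplexing assigns each qubit in a channel its own non-overlapping band.
def allocate_bands(total_bandwidth_hz: float, n_qubits: int, base_hz: float = 0.0):
    """Split a channel's bandwidth into n_qubits equal, non-overlapping bands."""
    band_width = total_bandwidth_hz / n_qubits
    return [(base_hz + i * band_width, base_hz + (i + 1) * band_width)
            for i in range(n_qubits)]

if __name__ == "__main__":
    # Assumed numbers for illustration: one channel, 32 qubits, 2 GHz of bandwidth.
    bands = allocate_bands(total_bandwidth_hz=2e9, n_qubits=32)
    for qubit, (lo, hi) in enumerate(bands[:4]):
        print(f"qubit {qubit}: {lo / 1e6:.1f} MHz - {hi / 1e6:.1f} MHz")
```

With four such channels on one device, 4 x 32 = 128 qubits can each be addressed on its own band, which is the reduction in cabling the article describes.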

The paper goes on to argue that increases in qubit count trigger other issues that challenge the capacity and operation of the quantum system. One such potential impact is a decline in qubit fidelity and performance. In developing Horse Ridge, Intel optimised the multiplexing technology that enables the system to scale and reduce errors from crosstalk among qubits.

While developing control systems isn't, evidently, as hype-worthy as the increase in qubit count has been, it is a necessity, says Jim Clarke, director of quantum hardware at Intel Labs. "Horse Ridge could take quantum practicality to the finish line much faster than is currently possible. By systematically working to scale to the thousands of qubits required for quantum practicality, we're continuing to make steady progress toward making commercially viable quantum computing a reality in our future."

Intel's own research suggests it will most likely take at least thousands of qubits working reliably together before the first practical problems can be solved via quantum computing. Other estimates suggest it will require at least one million qubits.

Intel is exploring silicon spin qubits, which have the potential to operate at temperatures as high as 1 kelvin. This research paves the way for integrating silicon spin qubit devices and the cryogenic controls of Horse Ridge to create a solution that delivers the qubits and controls in one package.

Quantum computer applications are thought to include drug development (high on the world's list of priorities just now), logistics optimisation (that is, finding the most efficient way from any number of possible travel routes) and natural disaster prediction.


Particle accelerator technology could solve one of the most vexing problems in building quantum computers – Fermi National Accelerator Laboratory

Last year, researchers at Fermilab received over $3.5 million for projects that delve into the burgeoning field of quantum information science. Research funded by the grant runs the gamut, from building and modeling devices for possible use in the development of quantum computers to using ultracold atoms to look for dark matter.

For their quantum computer project, Fermilab particle physicist Adam Lyon and computer scientist Jim Kowalkowski are collaborating with researchers at Argonne National Laboratory, where they'll be running simulations on high-performance computers. Their work will help determine whether instruments called superconducting radio-frequency cavities, also used in particle accelerators, can solve one of the biggest problems facing the successful development of a quantum computer: the decoherence of qubits.

"Fermilab has pioneered making superconducting cavities that can accelerate particles to an extremely high degree in a short amount of space," said Lyon, one of the lead scientists on the project. "It turns out this is directly applicable to a qubit."

Researchers in the field have worked on developing successful quantum computing devices for the last several decades; so far, it's been difficult. This is primarily because quantum computers have to maintain very stable conditions to keep qubits in a quantum state called superposition.

Superconducting radio-frequency cavities, such as the one seen here, are used in particle accelerators. They can also solve one of the biggest problems facing the successful development of a quantum computer: the decoherence of qubits. Photo: Reidar Hahn, Fermilab

Superposition

Classical computers use a binary system of 0s and 1s called bits to store and analyze data. Eight bits combined make one byte of data, which can be strung together to encode even more information. (There are about 31.8 million bytes in the average three-minute digital song.) In contrast, quantum computers aren't constrained by a strict binary system. Rather, they operate on a system of qubits, each of which can take on a continuous range of states during computation. Just as an electron orbiting an atomic nucleus doesn't have a discrete location but rather occupies all positions in its orbit at once in an electron cloud, a qubit can be maintained in a superposition of both 0 and 1.

Since there are two possible states for any given qubit, a pair doubles the amount of information that can be manipulated: 2² = 4. Use four qubits, and that amount of information grows to 2⁴ = 16. With this exponential increase, it would take only 300 entangled qubits to encode more information than there is matter in the universe.
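A minimal sketch of that scaling (illustrative only): an n-qubit register is described by 2^n complex amplitudes, so every added qubit doubles the state space a classical simulator must store explicitly.

```python
import numpy as np

# Each added qubit doubles the number of basis states the register can occupy.
for n in (1, 2, 4, 10, 300):
    print(f"{n:>3} qubits -> {2**n} basis states")

# A classical simulator has to hold every amplitude explicitly, which is why
# simulating even a few dozen qubits strains the largest supercomputers.
n = 4
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0        # the register starts in |0000>
print(state.shape)    # (16,)
```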

Qubits can be in a superposition of 0 and 1, while classical bits can be only one or the other. Image: Jerald Pinson

Parallel positions

Qubits don't represent data in the same way as bits. Because qubits in superposition are both 0 and 1 at the same time, they can similarly represent all possible answers to a given problem simultaneously. This is called quantum parallelism, and it's one of the properties that makes quantum computers so much faster than classical systems.

The difference between classical computers and their quantum counterparts could be compared to a situation in which there is a book with some pages randomly printed in blue ink instead of black. The two computers are given the task of determining how many pages were printed in each color.

"A classical computer would go through every page," Lyon said. Each page would be marked, one at a time, as either being printed in black or in blue. A quantum computer, instead of going through the pages sequentially, would go through them all at once.

Once the computation was complete, a classical computer would give you a definite, discrete answer. If the book had three pages printed in blue, that's the answer you'd get.

"But a quantum computer is inherently probabilistic," Kowalkowski said.

This means the data you get back isn't definite. In a book with 100 pages, the data from a quantum computer wouldn't be just three. It also could give you, for example, a 1 percent chance of having three blue pages or a 1 percent chance of 50 blue pages.

An obvious problem arises when trying to interpret this data. A quantum computer can perform incredibly fast calculations using parallel qubits, but it spits out only probabilities, which, of course, isn't very helpful. Unless, that is, the right answer could somehow be given a higher probability.

Interference

Consider two water waves that approach each other. As they meet, they may constructively interfere, producing one wave with a higher crest. Or they may destructively interfere, canceling each other so that there's no longer any wave to speak of. Qubit states can also act as waves, exhibiting the same patterns of interference, a property researchers can exploit to identify the most likely answer to the problem they're given.

"If you can set up interference between the right answers and the wrong answers, you can increase the likelihood that the right answers pop up more than the wrong answers," Lyon said. "You're trying to find a quantum way to make the correct answers constructively interfere and the wrong answers destructively interfere."

When a calculation is run on a quantum computer, the same calculation is run multiple times, and the qubits are allowed to interfere with one another. The result is a distribution curve in which the correct answer is the most frequent response.
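The interference idea can be sketched numerically. The toy example below assumes the book search is modelled as a Grover-style search over four pages with one marked "blue" page: a single oracle-plus-diffusion step makes the amplitudes interfere so that repeated measurements pile up on the correct page. It is an illustration of the principle, not the Fermilab team's code.

```python
import numpy as np

N = 4                      # four "pages"
marked = 2                 # index of the blue page (chosen for the example)

state = np.full(N, 1 / np.sqrt(N))   # uniform superposition over all pages

# Oracle: flip the sign (phase) of the marked page's amplitude.
state[marked] *= -1

# Diffusion: reflect every amplitude about the mean, so the marked index
# interferes constructively and the others destructively.
state = 2 * state.mean() - state

probabilities = state**2
print(probabilities)                 # ~[0, 0, 1, 0]

# "Running the calculation many times": sample measurements from the distribution.
samples = np.random.choice(N, size=1000, p=probabilities / probabilities.sum())
print(np.bincount(samples, minlength=N))   # the counts peak at the marked page
```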

When waves meet, they may constructively interfere, producing one wave with a higher crest. Image: Jerald Pinson

Waves may also destructively interfere, canceling each other so that there's no longer any wave to speak of. Image: Jerald Pinson

Listening for signals above the noise

In the last five years, researchers at universities, government facilities and large companies have made encouraging advancements toward the development of a useful quantum computer. Last year, Google announced that it had performed calculations on its quantum processor, called Sycamore, in a fraction of the time it would have taken the world's largest supercomputer to complete the same task.

Yet the quantum devices that we have today are still prototypes, akin to the first large vacuum tube computers of the 1940s.

"The machines we have now don't scale up much at all," Lyon said.

There are still a few hurdles researchers have to overcome before quantum computers become viable and competitive. One of the largest is finding a way to keep delicate qubit states isolated long enough for them to perform calculations.

If a stray photon (a particle of light) from outside the system were to interact with a qubit, its wave would interfere with the qubit's superposition, essentially turning the calculations into a jumbled mess, a process called decoherence. While the refrigerators do a moderately good job at keeping unwanted interactions to a minimum, they can do so only for a fraction of a second.

"Quantum systems like to be isolated," Lyon said, "and there's just no easy way to do that."

When a quantum computer is operating, it needs to be placed in a large refrigerator, like the one pictured here, to cool the device to less than a degree above absolute zero. This is done to keep energy from the surrounding environment from entering the machine. Photo: Reidar Hahn, Fermilab

Which is where Lyon and Kowalkowski's simulation work comes in. If the qubits can't be kept cold enough to maintain an entangled superposition of states, perhaps the devices themselves can be constructed in a way that makes them less susceptible to noise.

It turns out that superconducting cavities made of niobium, normally used to propel particle beams in accelerators, could be the solution. These cavities need to be constructed very precisely and operate at very low temperatures to efficiently propagate the radio waves that accelerate particle beams. Researchers theorize that by placing quantum processors in these cavities, the qubits will be able to interact undisturbed for seconds rather than the current record of milliseconds, giving them enough time to perform complex calculations.

Qubits come in several different varieties. They can be created by trapping ions within a magnetic field or by using nitrogen atoms surrounded by the carbon lattice formed naturally in crystals. The research at Fermilab and Argonne will be focused on qubits made from photons.

Lyon and his team have taken on the job of simulating how well radio-frequency cavities are expected to perform. By carrying out their simulations on high-performance computers, known as HPCs, at Argonne National Laboratory, they can predict how long photon qubits can interact in this ultralow-noise environment and account for any unexpected interactions.

Researchers around the world have used open-source software for desktop computers to simulate different applications of quantum mechanics, providing developers with blueprints for how to incorporate the results into technology. The scope of these programs, however, is limited by the amount of memory available on personal computers. In order to simulate the exponential scaling of multiple qubits, researchers have to use HPCs.

"Going from one desktop to an HPC, you might be 10,000 times faster," said Matthew Otten, a fellow at Argonne National Laboratory and collaborator on the project.

Once the team has completed their simulations, the results will be used by Fermilab researchers to help improve and test the cavities for acting as computational devices.

"If we set up a simulation framework, we can ask very targeted questions on the best way to store quantum information and the best way to manipulate it," said Eric Holland, the deputy head of quantum technology at Fermilab. "We can use that to guide what we develop for quantum technologies."

This work is supported by the Department of Energy Office of Science.


Top 10 breakthrough technologies of 2020 – TechRepublic

Between tiny AI and unhackable internet, this decade's tech trends will revolutionize the business world.

MIT Technology Review unveiled its top 10 breakthrough technology predictions on Wednesday. The trends--which include hype-inducing tech like quantum computing and unhackable internet--are expected to become realities in the next decade, changing the enterprise and world.


While many of the trends have a more scientific background, most can also apply to business, said David Rotman, editor at MIT Technology Review.

"Even though some of these sound science-y or research-y, all really do have important implications and business impacts. [For example], unhackable internet," Rotman said. "It's early, but we can all see why that would be a big deal.

"Digital money will change how we do commerce; satellite mega constellations will potentially change how we do communications and the price of communications," Rotman added.The methodology behind determining the breakthrough technologies focused on what writers, editors, and journalists have been reporting on in the past year. All of the technologies are still being developed and improved in labs, Rotman said.

The MIT Technology Review outlined the following 10 most exciting technologies being created and deployed in the next 10 years.

One of the most exciting technologies of the bunch, according to Rotman, is quantum supremacy, which indicates that quantum computers are not only becoming a reality, but that their functionality is becoming even more advanced.

Murmurs of quantum computer development have floated around the enterprise. The technology is able to process massive computational problems faster than any supercomputer.

While this form of computing hasn't been widely used yet, it will not only be usable by 2030, but possibly reach quantum supremacy, MIT found.

"Quantum supremacy is the point where a quantum computer can do something that a classical conventional computer cannot do or take hundreds of years for a classical computer to do," Rotman said.

The technology is now getting to the point where people can test it in their businesses and try different applications, and it will become more popular in the coming years, Rotman said.

Quantum computers are especially useful for massive scheduling or logistical problems, which can be particularly useful in large corporations with many moving parts, he added.

"Satellites have become so small and relatively cheap that people are sending up whole clusters of these satellites," Rotman said. "It's going to have an enormous impact on communication and all the things that we rely on satellites for."

These satellites could be able to cover the entire globe with high-speed internet. Applications of satellite mega-constellation use are currently being tested by companies including SpaceX, OneWeb, Amazon, and Telesat, according to the report.

Another interesting, and surprising, technology in the study concerned tiny AI. The surprising nature of this comes with how quickly AI is growing, Rotman said.

Starting in the present day, AI will become even more functional, independently running on phones and wearables. This ability would prevent devices from needing the cloud to use AI-driven features, Rotman said.

"It's not just a first step, but it would be an important step in speeding up the search for new drugs," Rotman said.

Scientists have used AI to find drug-like compounds with specific desirable characteristics. In the next three to five years, new drugs might be commercialized at a lower cost than the roughly $2.5 billion it currently takes to bring a new drug to market, the report found.

Researchers are now able to detect climate change's role in extreme weather conditions. With this discovery, scientists can help people better prepare for severe weather, according to the report.

In less than five years, researchers will find drugs that treat ailments based on the body's natural aging process, the report found. Potentially, diseases including cancer, heart disease and dementia could be treated by slowing age.

Within five years, the internet could be unhackable, the report found.

Researchers are using quantum encryption to try and make an unhackable internet, which is particularly important as data privacy concerns heighten, Rotman said.

Digital money, also known as cryptocurrency, will become more widely used in 2020. However, the rise of this money will also have major impacts on financial privacy, as the need for an intermediary diminishes, according to the report.

Occupying three trends on the list, medicine is proving to potentially be a huge area for innovation. Currently, doctors and researchers are designing novel drugs to treat unique genetic mutations. These specialized drugs could cure some ailments that were previously incurable, the report found.

Differential privacy is a technique currently being used by the US government as it collects data for the 2020 census. The US Census Bureau has trouble keeping the data it collects private, but this technique helps to anonymize that data, an approach other countries may also adopt, according to the report.
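The core mechanism behind differential privacy can be sketched in a few lines: calibrated random noise (Laplace noise here) is added to aggregate statistics so that any single individual's record barely changes the published result. This is a generic illustration, not the Census Bureau's actual mechanism; the population count and epsilon values are made up.

```python
import numpy as np

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a noisy count satisfying epsilon-differential privacy."""
    # Laplace noise scaled to the query's sensitivity (a count changes by at
    # most 1 when one person is added or removed).
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

true_population = 12_345           # hypothetical census-block count
for epsilon in (0.1, 1.0, 10.0):   # smaller epsilon = stronger privacy, more noise
    print(epsilon, round(dp_count(true_population, epsilon)))
```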





Cloud Computing Is Not the Energy Hog That Had Been Feared – The New York Times

The computer engine rooms that power the digital economy have become surprisingly energy efficient.

A new study of data centers globally found that while their computing output jumped sixfold from 2010 to 2018, their energy consumption rose only 6 percent. The scientists' findings suggest that concerns about the rise of mammoth data centers generating a surge in electricity demand and pollution have been greatly overstated.
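A quick back-of-the-envelope reading of those two headline figures (taking "sixfold" and "6 percent" at face value) shows why the efficiency gains are described as enormous: energy used per unit of computing fell by roughly 80 percent over the period.

```python
# Back-of-the-envelope check of the study's headline figures.
compute_growth = 6.0      # computing output: sixfold increase, 2010-2018
energy_growth = 1.06      # energy consumption: up 6 percent over the same period

energy_per_compute = energy_growth / compute_growth
print(f"energy per unit of computing: {energy_per_compute:.2f}x the 2010 level")
print(f"i.e. roughly a {(1 - energy_per_compute) * 100:.0f}% efficiency improvement")
```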

The major force behind the improving efficiency is the shift to cloud computing. In the cloud model, businesses and individuals consume computing over the internet as services, from raw calculation and data storage to search and social networks.

The largest cloud data centers, sometimes the size of football fields, are owned and operated by big tech companies like Google, Microsoft, Amazon and Facebook.

Each of these sprawling digital factories, housing hundreds of thousands of computers, rack upon rack, is an energy-hungry behemoth. Some have been built near the Arctic for natural cooling and others beside huge hydroelectric plants in the Pacific Northwest.

Still, they are the standard setters in terms of the amount of electricity needed for a computing task. "The public thinks these massive data centers are energy bad guys," said Eric Masanet, the lead author of the study. "But those data centers are the most efficient in the world."

The study findings were published on Thursday in an article in the journal Science. It was a collaboration of five scientists at Northwestern University, the Lawrence Berkeley National Laboratory and an independent research firm. The project was funded by the Department of Energy and by a grant from a Northwestern alumnus who is an environmental philanthropist.

The new research is a stark contrast to often-cited predictions that energy consumption in the world's data centers is on a runaway path, perhaps set to triple or more over the next decade. Those worrying projections, the study authors say, are simplistic extrapolations and what-if scenarios that focus mainly on the rising demand for data center computing.

By contrast, the new research is a bottom-up analysis that compiles information on data center processors, storage, software, networking and cooling from a range of sources to estimate actual electricity use. Enormous efficiency improvements, they conclude, have allowed computing output to increase sharply while power consumption has been essentially flat.

"We're hopeful that this research will reset people's intuitions about data centers and energy use," said Jonathan Koomey, a former scientist at the Berkeley lab who is an independent researcher.

Over the years, data center electricity consumption has been a story of economic incentives and technology advances combining to tackle a problem.

From 2000 to 2005, energy use in computer centers doubled. In 2007, the Environmental Protection Agency forecast another doubling of power consumed by data centers from 2005 to 2010.

In 2011, at the request of The New York Times, Mr. Koomey made an assessment of how much data center electricity consumption actually did increase between 2005 and 2010. He estimated the global increase at 56 percent, far less than previously expected. The recession after the 2008 financial crisis played a role, but so did gains in efficiency. The new study, with added data, lowered that 2005 to 2010 estimate further.

But the big improvements have come in recent years. Since 2010, the study authors write in Science, "the data center landscape has changed dramatically."

The tectonic shift has been to the cloud. In 2010, the researchers estimated that 79 percent of data center computing was done in smaller traditional computer centers, largely owned and run by non-tech companies. By 2018, 89 percent of data center computing took place in larger, utility-style cloud data centers.

The big cloud data centers use tailored chips, high-density storage, so-called virtual-machine software, ultrafast networking and customized airflow systems, all to increase computing firepower with the least electricity.

"The big tech companies eke out every bit of efficiency for every dollar they spend," said Mr. Masanet, who left Northwestern last month to join the faculty of the University of California, Santa Barbara.

Google is at the forefront. Its data centers on average generate seven times more computing power than they did just five years ago, using no more electricity, according to Urs Hölzle, a senior vice president who oversees Google's data center technology.

In 2018, data centers consumed about 1 percent of the world's electricity output. That is the energy-consumption equivalent of 17 million American households, a sizable amount of energy use, but one that is barely growing.

The trend of efficiency gains largely offsetting rising demand should hold for three or four years, the researchers conclude. But beyond a few years, they say, the outlook is uncertain.

In the Science article, they recommend steps including more investment in energy-saving research and improved measurement and information sharing by data center operators worldwide.

The next few years, they write, will be a critical transition phase to ensure a low-carbon and energy-efficient future.


Cloud computing: More costly, complicated and frustrating than expected – but still essential – ZDNet

Migrating to the cloud seems to be on every CIO's to-do list these days. But despite the hype, almost 60% of UK businesses think that cloud has over-promised and under-delivered, according to a report commissioned by consulting company Capita.

The research surveyed 200 IT decision-makers in the UK, and found that an overwhelming nine in ten respondents admitted that cloud migration has been delayed in their organisation due to "unforeseen factors".

On average, businesses started planning their migration to the cloud in 2015, and kicked off the process in 2016. According to the report, one reason clearly stood out as the push factor to adopt cloud computing: 61% of businesses started the move primarily to reduce the costs of keeping data on-premises.

But with organisations setting aside only one year to prepare for migration, which the report described as "less than adequate planning time," it is no surprise that most companies have encountered stumbling blocks on their journey to the cloud.

Capita's head of cloud and platform Wasif Afghan told ZDNet: "There has been a sort of hype about cloud in the past few years. Those who have started migrating really focused on cost saving and rushed in without a clear strategy. Now, a high percentage of enterprises have not seen the outcomes they expected."

Four years later, in fact, less than half (45%) of the companies' workloads and applications have successfully migrated, according to Capita. A meager 5% of respondents reported that they had not experienced any challenge in cloud migration; but their fellow IT leaders blamed security issues and the lack of internal skills as the main obstacles they have had to tackle so far.

Half of respondents said that they had to re-architect more workloads than expected to optimise them for the cloud. Afghan noted that many businesses have adopted a "lift and shift" approach, taking everything they were storing on premises and shifting it into the public cloud. "Except in some cases, you need to re-architect the application," said Afghan, "and now it's catching up with organisations."

The challenges "continue to spiral," noted Capita's report, and they are not going away; what's more, they come at a cost. Up to 58% of organisations said that moving to the cloud has been more expensive than initially thought.

The trend is not confined only to the UK: the financial burden of moving to the cloud is a global concern. Research firm Canalys found that organisations splashed out a record $107 billion (£83 billion) for cloud computing infrastructure last year, up 37% from 2018, and that the bill is only set to increase in the next five years. Afghan also pointed to recent research by Gartner, which predicted that through 2020, 80% of organisations will overshoot their cloud infrastructure budgets because of their failure to manage cost optimisation.

Infrastructure, however, is not the only cost of moving to the cloud. IDC analysed the overall spending on cloud services, and predicted that investments will reach $500 billion (£388.4 billion) globally by 2023. Clearly, the escalating costs of switching to the cloud are coming as a shock to some businesses - especially so because they started the move to cut costs.

Afghan said: "From speaking to clients, it is pretty clear that cloud expense is one of their chief concerns. The main thing on their minds right now is how to control that spend." His response to them, he continued, is better planning. "If you decide to move an application in the cloud, make sure you architect it so that you get the best return on investment," he argued. "And then monitor it. The cloud is dynamic - it's not a one-off event."

Capita's research did find that IT leaders still have faith in the cloud, with the majority (86%) of respondents agreeing that the benefits of the cloud will outweigh its downsides. But on the other hand, only a third of organisations said that labour and logistical costs have decreased since migrating; and a minority (16%) said they were "extremely satisfied" with the move.

"Most organisations have not yet seen the full benefits or transformative potential of their cloud investments," noted the report.

As a result, IT leaders are left feeling frustrated and underwhelmed by the promises of cloud technology. But Capita's experts argued that the reason for such disillusionment comes down to the misplacement of expectations. Cloud migration, and its promise of cost-cutting, is a means to an end, reads the report; focusing too much on the process might be "a misaligned goal", one that leads businesses to forget that the actual purpose of the move is to enable innovation.

Mark Cook, executive officer at Capita, said: "One of the most important questions raised by the research is how far today's IT leaders are able to see beyond cloud as a means to an end - while staying focused on their original transformation goals and aspirations."

To illustrate, Capita's report pointed to the top transformational priorities identified by respondents. IT leaders, indeed, largely indicated cloud migration as their top priority - above process automation, big data analytics, and artificial intelligence or machine learning.

In other words, cloud has become the end-goal for many businesses, more so than the applications enabled by cloud and which will drive innovation to create new value. "Could too much focus on 'cloud' be clouding the issue?" asked the report.

Researchers recommended, therefore, that companies instead recover an "innovation mindset" and remember the original goals that prompted their move to the cloud. Combined with a better strategy, including better governance and upskilling the workforce, the report predicts that a fresher vision will let organisations reap the real benefits of cloud computing.

"'Destination digital' can itself become an all-consuming journey," said Mark Cook, executive officer at Capita. "This points to the importance of individually designing and pressure-testing each journey to ensure it will successfully bring the organisation closer to actual business goals."


Cloud Computing Security Risks and How to Protect Cloud Customers from Ransomware – Customer Think

Cloud computing is gradually becoming the preferred choice of businesses to streamline different business processes. As per industry reports, around 68% of businesses use cloud technology, while 19% are planning to integrate cloud computing into their operations. There are many reasons for companies switching from the traditional business approach to cloud computing: companies that invest in cloud, big data, security, and mobility have witnessed revenue growth of up to 52 percent, a compelling figure that shows why implementing cloud computing helps an organization run efficiently and serve its customers better. Beyond better revenue figures, cloud computing services provide numerous benefits, such as:

Accessibility, scalability, collaboration, pay structure, control choices, data security, tool selection, security features, savings on equipment, speed to market, storage options, streamlined work, regular updates, and competitive edge.

But with the widespread use of cloud computing, many security threats have also evolved over the past few years, because the approach of cloud computing has gone through some transformation. Though the cloud environment is more secure than the on-premise environment, there are still security concerns that need to be addressed. So let's look at the security threats to cloud computing and what measures can be taken to keep the cloud environment well protected.

The biggest threat to any cloud environment is a breach of data. A data breach occurs when an unauthorized person or program gets access to the data. It is a serious concern for organizations because a breach puts all or part of the data at risk: the intruder can view, copy and transmit the confidential data for whatever purpose.

Data loss is the opposite of a data breach, because it can occur due to natural factors or human errors. Physical destruction of servers due to natural calamities, or human-targeted attacks, can lead to data loss. This is a great setback for businesses because there is little chance of recovering the data.

DoS, or denial of service, is an advanced form of attack done primarily to flood the system with immense traffic and take advantage of the situation when the system cannot buffer requests or crashes through bugs and vulnerabilities. It is one of the most common ways to shut down cloud services and make them temporarily unavailable to customers.

This security risk, commonly known as cryptojacking, takes advantage of the growing cryptocurrency frenzy. Hackers install crypto-mining scripts on the servers, which increases CPU load and slows down the overall system. The user's computing resources are exploited to process numerous cryptocurrency transactions.

This is the most common form of hijacking in the cloud environment: hackers take advantage of insecure passwords and gain access to the cloud through a staff member's account. The hacker can then manipulate data and interfere with different business processes.

This isn't hijacking of the server itself but relates more to third-party services. Internet of Things (IoT) solutions are responsible for data breaches to a certain extent. IoT devices like home appliances, connected cars and health monitors tend to collect and send huge amounts of data in real time. This real-time data is vulnerable, and hackers can hijack it by attacking insecure APIs.

No system is foolproof: even if external security threats are nullified, internal risks, such as an employee abusing access privileges and initiating a data breach, remain very real. Besides this, unintentional human errors can also leave the cloud environment open to malware and cyber-attacks.

As per phishprotection.com, conducting regular security assessments is the best way to safeguard the cloud infrastructure. Several other measures help as well:

An up-to-date cloud system and third-party tools from reliable service providers are vital to keep data from falling into unauthorized hands.

Cloud security monitoring using artificial intelligence can help identify and counter potential dangers, safeguarding the cloud infrastructure.

Encrypting data before uploading it to the cloud system ensures data privacy (see the code sketch below).

Making employees aware of potential security threats is a great way to eliminate human errors.

Having a data recovery plan helps to minimize the impact of data loss, and regularly backing up data to a centralized server also helps protect it.

Hiring cloud security professionals makes sure that a business stays away from cloud-related threats.

As per proofpoint.com, access management policies should be very strict. CISA has stated that access should be granted only to the most trusted people and to those who genuinely need it; considering biometric authentication and multi-factor methods is a good move.

Securing the cloud infrastructure is not as easy as securing your PC with an antivirus like avast.com or following 10 steps for ransomware protection. The scale at which cloud computing operates, and the importance of the data flowing through different cloud servers, is so huge that even a slight technical glitch can cost a company millions of dollars. But as with every other technology, the risk factors are always there, and the only thing required is attentiveness to deal with all sorts of security threats.
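The client-side encryption step recommended above can be sketched with the widely used Python cryptography package. The data and blob name are placeholders, and the upload stub would be replaced by whichever cloud SDK the business actually uses; this is an illustrative sketch, not a complete solution.

```python
from cryptography.fernet import Fernet

def upload_to_cloud(blob_name: str, data: bytes) -> None:
    # Placeholder: replace with your object-storage SDK's upload call.
    print(f"would upload {len(data)} encrypted bytes as '{blob_name}'")

key = Fernet.generate_key()   # keep this key in a secrets manager, not in code
record = b"customer_id,email\n42,jane@example.com\n"   # hypothetical data
ciphertext = Fernet(key).encrypt(record)               # encrypt before upload
upload_to_cloud("backups/customers.enc", ciphertext)

# Later, after downloading, only the holder of the key can recover the plaintext.
assert Fernet(key).decrypt(ciphertext) == record
```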


Global Healthcare Cloud Computing Market Expected to Grow at a CAGR of 25.1% Over the Forecast Period, 2018-2023 – ResearchAndMarkets.com – Yahoo…

The "Global Healthcare Cloud Computing Market, Forecast to 2023" report has been added to ResearchAndMarkets.com's offering.

Global healthcare cloud market revenue is expected to surge at a compound annual growth rate of 25.1% from 2018 to 2023.

Cloud computing in healthcare is gaining momentum as a variety of factors create a significant need for the value propositions that a successful cloud implementation promises to offer.

Cloud computing involves the use of external suppliers of infrastructure, platforms, and software. As a result, former capital expenses or owned software systems are transitioned to a service offered by a cloud provider or participants within a cloud service provider's ecosystem. This study focuses on cloud services that are used by providers and other healthcare stakeholders seeking to manage clinical and business workflows and reduce certain costs associated with the data-rich global healthcare environment.

This study reviews the significant drivers that are propelling cloud computing in healthcare. For example, the data flow resulting from digital health systems will transform healthcare into a Big Data environment. This trend will include data from telehealth and increasingly from consumer-generated remote monitoring systems. There is a consistent view across the industry that healthcare providers are eager to take advantage of the cloud, but this is offset by the reality that careful planning and diligence must be performed in order to ensure that the configuration of the selected cloud implementation is the correct path forward. Also challenging is the fact that providers are finding it difficult to staff the IT cloud management experts needed to ensure a smooth transition from in-house systems. The transition to usage-based cloud services must be well planned to avoid the potential for higher-than-anticipated costs. For example, the migration of data from internal data centers to cloud services can be complicated, depending on the systems involved. There is also the ever-present need to maintain data security and patient privacy.

Cloud computing offers a compelling financial proposition for healthcare providers. By utilizing the infrastructure of a cloud service provider, healthcare providers should be able to achieve increased scalability and reduce their IT costs. In addition, the use of software-as-a-service will relieve medical staff from time-intensive management of various software-related maintenance and update functions.

The study also reviews the potential for the healthcare cloud to accelerate the deployment of emerging technologies such as artificial intelligence, blockchain, and advanced data analytics.

The evolving healthcare cloud will increase the potential for data system interoperability across the healthcare industry. Perhaps the most exciting aspect of the cloud will be the growth of a new generation of ecosystems that will revolutionize the way that clinical and operational data can be used to support improved patient outcomes and customer relationship management. Although a great deal of media attention is devoted to tracking familiar cloud market leaders such as Amazon AWS, Microsoft Azure, and Google Cloud, this report will identify and summarize the activities of key participants of the emerging healthcare cloud ecosystem. This dynamic ecosystem will be the catalyst of new growth opportunities for specialized service suppliers across the global healthcare industry.

Key Issues Addressed

Key Topics Covered:

1. Executive Summary

2. Analysis of the Growth Environment

3. Visioning Scenarios

4. The Cloud-based Healthcare Ecosystem

5. Growth Pipeline

6. Competitive Landscape

7. Notable Healthcare Cloud Companies

8. Leading Cloud Solution Suppliers

9. Specialized Healthcare Cloud Services

10. System Integrators Participating in the Healthcare Cloud

11. Regional Outlook

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/16h2fp

View source version on businesswire.com: https://www.businesswire.com/news/home/20200225005496/en/

Contacts

ResearchAndMarkets.com
Laura Wood, Senior Press Manager
press@researchandmarkets.com
For E.S.T Office Hours Call 1-917-300-0470
For U.S./CAN Toll Free Call 1-800-526-8630
For GMT Office Hours Call +353-1-416-8900


Architecting a Cloud-to-Edge Computing Strategy – IoT World Today

As data-hungry devices and apps clamor for speed and bandwidth, public clouds alone bring too much latency. Enter cloud-to-edge architectures.

After a decade dominated by the growing reliance on centralized third-party public cloud services, the deployment of edge computing systems and Internet of Things (IoT) devices represents a partial pendulum swing back to decentralized data management. Large cloud service providers, however, are eager to extend services from the cloud to the edge. As a result, organizations face decisions on how tightly to tie evolving architecture to a single service provider.

Cloud computing services such as Amazon Web Services (AWS) and Microsoft Azure allowed large corporations to ease dependence on large, costly private data centers. They also enabled emerging companies to rapidly scale up to compete with larger competitors even if they lacked the time or financial resources to build their own computing infrastructure.

But as organizations seek to capitalize on a growing torrent of data from edge devices and servers, they can't afford to be constrained by the latency and bandwidth limitations inherent in transmitting data up to a cloud service. In critical situations, operational control requires real-time monitoring and analytics capabilities.

"Now people realize that data should be close to where it's absorbed from or consumed," said David Linthicum, chief cloud strategy officer at Deloitte, the consulting and advisory services firm. "If you're flying an airplane, you don't want to send data down to a cloud server to see if the engines are on fire."

Still, cloud resources are essential for cost-effectively analyzing large data volumes over time, which can feed models for implementation on devices operating at the edge, whether in a plane, controlling autonomous vehicles, re-calibrating factory machines, or deployed throughout a so-called smart city.

"The problem everybody is trying to solve is how to balance data between edge-based devices and cloud systems," Linthicum said.

New York-based Oden Technologies developed an industrial automation and analytics platform that used a cloud-to-edge architecture to provide manufacturers with an AI-powered production recommendation system to optimize production and hit peak factory performance. An edge gateway server connects to the company's Oden Cloud to enable complex data processing and real-time machine learning in edge applications.

"Our Golden Run platform uses a machine learning model that is periodically updated. Whenever there is enough new data, we run it to see if there might be a new optimization to implement," said Oden co-founder and CEO Willem Sunblad. "We might do that a week at a time, so it doesn't have to run at the edge, but predictive quality should run in real time, and for that you want latency to be low and don't want the dependency of the internet."
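The split Sunblad describes, low-latency prediction running locally while the model is refreshed from the cloud only occasionally, can be sketched roughly as below. The class, the weekly refresh interval and the stand-in model are illustrative assumptions, not Oden's actual architecture.

```python
import time

REFRESH_SECONDS = 7 * 24 * 3600      # assumed: pull a new model about once a week

def fetch_model_from_cloud():
    """Placeholder for downloading the latest trained model from the cloud."""
    return lambda features: sum(features) / len(features)   # stand-in "model"

class EdgePredictor:
    def __init__(self):
        self.model = fetch_model_from_cloud()
        self.last_refresh = time.time()

    def maybe_refresh(self):
        # Only reach out to the cloud occasionally; predictions stay local.
        if time.time() - self.last_refresh > REFRESH_SECONDS:
            self.model = fetch_model_from_cloud()
            self.last_refresh = time.time()

    def predict(self, sensor_features):
        self.maybe_refresh()
        return self.model(sensor_features)   # runs locally: no internet round trip

predictor = EdgePredictor()
print(predictor.predict([0.8, 0.9, 0.85]))   # real-time quality prediction at the edge
```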

How to Navigate Cloud-to-Edge Models?

The Linux Foundation-supported State of the Edge 2020 report noted that infrastructure to support edge computing is nascent and enterprises may have to implement their own until the technology matures. "Until there are public edge clouds and ubiquitous high-speed networks (e.g., 5G connectivity), custom edge installations may be the only way to reliably implement an edge application," the report asserted. "However, as the demand for edge applications grow[s], the cloud will drift to the edge. Edge computing will become part of the standard internet topology."


Cloud the next destination of geospatial – Geospatial World


Think of a time when you have used a map to find a café for your family dinner. Or perhaps you used it to look for a newly opened fashion store in your neighborhood. Working with location technology day in and day out, I am fascinated by the journey of maps, from paper to a tap on the phone. I have never been more energized by the growth of these location technologies and their impact on our daily lives, the way we work and interact. The beauty of maps, however, is that they can be used in a variety of forms, beyond just navigating roads and finding landmark buildings. Location mapping and technologies, as companies are now finding out, are helping businesses small and large across the world to combine geography with business intelligence, enabling them to make faster and more efficient business decisions.

One of the most important examples of how businesses are benefitting from spatial technologies and platforms can be found in the transport and logistics sector. While battling challenges around high infrastructure maintenance costs, energy consumption, environmental issues and safety regulations, companies continue to navigate a complex network of multiple carriers, service providers and physical locations, along with unexpected inefficiencies in the supply chain and high costs.

By deploying location data and analytics, businesses are now devising innovative solutions and, in some cases, creating new business opportunities. For example, in 2016 the automobile giant Mahindra and Mahindra launched an agricultural equipment rental service. Using its app, called Trinngo, farmers can now rent tractors and equipment on a pay-per-use basis. These orders are then passed on to the nearest franchisee using location-based mapping.
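The routing step described above, passing each order to the nearest franchisee, boils down to a nearest-neighbour lookup on coordinates. A minimal haversine-based sketch follows; the franchisee locations are made up for illustration, and this is not the app's actual dispatch logic.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))     # Earth radius ~6371 km

franchisees = {                            # hypothetical franchisee locations
    "Nashik": (19.9975, 73.7898),
    "Pune": (18.5204, 73.8567),
    "Aurangabad": (19.8762, 75.3433),
}

def nearest_franchisee(farmer_lat, farmer_lon):
    return min(franchisees,
               key=lambda name: haversine_km(farmer_lat, farmer_lon, *franchisees[name]))

print(nearest_franchisee(19.90, 74.50))    # the order goes to the closest depot
```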

Alternatively, retail companies are using data from location check-ins and customer-detecting in-store Wi-Fi antennas, which can detect nearby smartphones and triangulate their location, to create geo-behavioral consumer profiles that help them devise personalized and targeted marketing communications. For instance, OLX India used hyper-local targeting to reach second-hand mobile and automobile markets using pin-codes. Competitive-minded businesses have been leveraging location analytics, in the context of big data and micro-level data models, to identify new markets and make other decisions, such as where to open a new store.

Despite the tremendous advances made by the location mapping and technology industry, there are significant challenges to its widespread adoption, and its potential thus remains untapped. This can be traced back to reasons such as lack of awareness around geospatial and predictive analytics, data privacy concerns, capital-intensive network and data storage infrastructure, and lack of proper data collection methodologies.

In addition to the scale and cost challenges, new and emerging technologies pose another challenge to businesses today. Being an early adopter of these technologies has lots of advantages but often necessitates careful planning, forcing companies to consider how they operate and conduct business. Fortunately, the emergence of cloud computing provides a potential solution for all these challenges. Cloud computing on its own has become an integral part of many companies' operations today, helping them with greater convenience, lower costs and enhanced efficiencies. Having grown as a disruptive technology itself, today cloud computing serves as an equalizer for the accessibility and affordability of big data and other innovative solutions, including geospatial technologies.

Let us look at the benefits of integrating these two technologies together and how location as a service could be a key ingredient for any business strategy:

As in most cases around emerging technologies, regulatory and security concerns around data privacy in the cloud remain. Businesses are worried about data leaks, vulnerability attacks and lack of control over off-premise infrastructure. However, the benefits of converging cloud computing with the spatial sciences are likely to outweigh these concerns, offering a compelling, ready-to-use, affordable and scalable alternative to companies, especially small and medium enterprises, which unlike their larger counterparts lack the money and human resource muscle. To put things in perspective, users can increasingly leverage a cost-effective, flexible, on-demand computing platform to integrate observation systems, clustering, analytical visualizations, statistical trend analysis, parameter-extracting algorithms, simulation models, map-based storytelling, etc., all essential components of geospatial technologies.

At the most basic level, convergence of these two technologies means the ability to store and process higher volumes of data at an unprecedented scale. However, location as a service can deliver actionable, real-time location insights that can be leveraged across a spectrum of web and mobile applications. Cloud-based location solutions enable anyone with internet connectivity and a web browser, on any device such as a desktop, a laptop, a tablet or a smartphone, to access maps, including high-definition imagery, topography and street-based maps, and functionalities like geocoding in the cloud on the move, thus reducing on-premise spend on developing and maintaining an IT infrastructure.

Last, but not least, it can also make data sharing and collaboration between multiple parties easier, faster and more secure. Businesses can take advantage of crowdsourcing as individuals become both consumers of geospatial services and data sensors.

Our location intelligence platform, the HERE Platform, allows businesses to enrich their data through an advanced development workspace that offers standardized developer tools, analytics, scaled-out processing and data management. Customers and partners can also monetize and license their maps with third-party data to create new, innovative offerings in a neutral, secure and open marketplace. The platform also makes it easier to analyze and archive data, process events, visualize information and create dynamic datasets on the factors that affect their businesses, whether it is determining business locations for retail stores, least-cost-path navigation for fleet management or creating new road insurance incentives by profiling driver habits. With an equal emphasis on security, the HERE Platform complies with all applicable privacy and security regulations across geographies.

As we continue our journey in a digital world, connected with new and autonomous technologies, businesses will have to focus on building capabilities that help them stay ahead of the competition and remain profitable. Thus, at the convergence of cloud computing and spatial intelligence lies the next big opportunity: making location intelligence accessible and turning it into a strategic business advantage.

Note: This is a guest blog by Damandeep Kochhar, Global VP and MD India, HERE Technologies
