
CEO Andy Jassy reportedly said AWS is two years ahead of Microsoft – Business Insider

Amazon Web Services CEO Andy Jassy reportedly told employees at an all-hands meeting on Thursday that the Seattle-based company's cloud business was two years ahead of its competitor Microsoft's.

"If you do any thorough, apples-to-apples, objective comparison of AWS versus Microsoft, you don't come out deciding that they're comparable platforms," he said, according to a report from the Federal Times, which said it obtained a video of the meeting. "Most of our customers will tell us that we're about 24 months ahead of Microsoft in functionality and maturity."

Jassy's comments came as the executive was detailing the company's plans to challenge Microsoft's win of the Joint Enterprise Defense Infrastructure project, a contentious $10 billion cloud-computing contract with the Department of Defense.

AWS has started to protest that decision over the so-called JEDI contract in the US Court of Federal Claims, citing "unmistakable bias."

"AWS is uniquely experienced and qualified to provide the critical technology the US military needs, and remains committed to supporting the DOD's modernization efforts," an AWS representative said in a prepared statement cited by the Federal Times. "We also believe it's critical for our country that the government and its elected leaders administer procurements objectively and in a manner that is free from political influence. Numerous aspects of the JEDI evaluation process contained clear deficiencies, errors, and unmistakable bias- and it's important that these matters be examined and rectified."

Microsoft was selected October 25 for the JEDI deal, which will help move the Department of Defense's sensitive data to the cloud. It's worth as much as $10 billion over 10 years.

The contentious bidding process included involvement from tech titans such as Oracle and politicians up to and including President Donald Trump, who has publicly ridiculed Amazon CEO Jeff Bezos over his ownership of The Washington Post, whose coverage Trump takes issue with.

Jassy told employees the process involved political interference and therefore was unfair.

"When you have a sitting president who's willing to publicly show his disdain for a company and the leader of a company, it's very difficult for government agencies including the DOD to make an objective decision without fear of reprisal," Jassy said, according to the report.

Experts say Amazon may have a case if it could prove political interference unfairly affected the outcome of the bidding process particularly given that a coming book claims Trump ordered then-Defense Secretary James Mattis to "screw Amazon" out of the JEDI contract.

But it's no sure thing. Amazon would have to prove not only that political pressure was applied to the process but also that the pressure affected the outcome. Experts previously told Business Insider that Microsoft most likely won the JEDI deal on its own merits as a cloud heavyweight.


US Healthcare Cloud Computing Market Size & Share 2019 Predictions and Analysis Report by 2027: Facts & Factors (FnF) – The World Industry…

Leading market research firm Facts & Factors (FnF) has added a report titled "U.S. Healthcare Cloud Computing Market By Component (Hardware, Services, and Software), By Deployment Mode (Hybrid, Public, Community, and Private), By Pricing Model (Spot/Subscription and Pay-As-You-Go), By Application (Non-Clinical Information System and Clinical Information System), By Service Model (SaaS, PaaS, and IaaS), and By End-User (Healthcare Providers and Healthcare Payers): Industry Perspective, Comprehensive Analysis, and Forecast, 2018–2027" to its research database. The Healthcare Cloud Computing Market report opens with an executive summary covering the core trends evolving in the market, then analyzes industry drivers, restraints, and opportunities observed over the forecast period. It presents the market's volume, value, and growth rate from both historical and forward-looking perspectives, along with forecasts, important industry trends, market size and share estimates, and profiles of the leading industry players.

The Healthcare Cloud Computing Market report provides forecasts in terms of CAGR, and Y-O-Y growth. This helps to understand the overall market and to recognize the growth opportunities in the global Healthcare Cloud Computing Market. The report also includes a detailed profile and information of all the major market players currently active in the global Healthcare Cloud Computing Market. The companies covered in the report can be evaluated on the basis of their latest developments, financial and business overview, product portfolio, key trends in the market, long-term and short-term business strategies by the companies in order to stay competitive in the market.

Request an Exclusive Free Sample Report of Healthcare Cloud Computing Market: fnfresearch.com/us-healthcare-cloud-computing-market-by-component-hardware

(Free sample report contains research report overview, TOC, list of tables and figures, an overview of major market players and key regions included)

The global Healthcare Cloud Computing Market report includes the definition of the market, major applications, segments, uses, classification, product specifications, manufacturing processes, cost structures, and other applications. It also analyzed the regional market conditions, including the product price, profit analysis, capacity, volume, production, supply, demand, and market growth rate and forecast, etc.

The Healthcare Cloud Computing Market report provides market dynamics including the latest trends, opportunities, market drivers, and challenges. The report covers an overall view of the global Healthcare Cloud Computing Market and also includes market segments and market size. A detailed description of the challenges and drivers offers a clear picture of how the market is expected to perform throughout the forecast period 2018-2027.


Major Company Profiles Covered in This Report:

IBM, Microsoft, Quality Systems, CareCloud Corporation, ClearDATA Networks, INFINITT Healthcare, VMware, Carestream Health, CloudMine, SAS Institute, Progress Software Corporation, Salesforce.com, Napier Healthcare Solutions, Siemens Healthcare, Cerner Corporation, General Electric, Athenahealth, and Oracle Corporation



The report covers the Healthcare Cloud Computing Market demand, growth opportunities, challenges, key segments, standardization, deployment models, opportunities, value chain, company profiles, and strategies.

What does the Healthcare Cloud Computing Market research report offer?

For urgent enquiries, mail us at sales@fnfresearch.com

About Us:

Facts & Factors is a leading market research company that offers customized research reports and consulting services. Facts & Factors specializes in management consulting, industry chain research, and advanced research, assisting clients by providing a planned revenue model for their business. Our reports and services are used by prestigious academic institutions, start-ups, and companies globally to understand the international and regional business background. Our wide-ranging database offers statistics and detailed analysis of different industries worldwide, helping clients achieve sustainable progress, develop strategies, and make informed business decisions.

Contact Us:

Facts & Factors

Global Headquarters

Level 8, International Finance Center, Tower 2,

8 Century Avenue, Shanghai,

Postal 200120, China

Tel: +8621 80360450

E-Mail: sales@fnfresearch.com

Web: http://www.fnfresearch.com

As one of the lead news writers at The World Industry News, Hiren's specialization lies in the science, technology, health and business domains. His passion for the latest developments in cloud technology, connected devices, nanotechnology, and virtual reality, among others, shines through in his most recent industry coverage. Hiren's take on the impact of digital technologies across the technology, health and business domains gives his writing a fresh and modern outlook.


The Real Fight for the Future of 5G – Foreign Affairs Magazine

In late October, Germany and China began commercial-scale rollouts of 5G, the wireless technology infrastructure that is transforming the way the world computes. Machines and people will still talk to each other over the borderless network we call the Internet. But with 5G, a new networking infrastructure is emerging, dependent on the Internet but distinct from it, and subject to much more government and private control.

With 5G it is possible to do enormous amounts of computing at very high speeds and without having to connect the input device (a cell phone, say, or a self-driving car) to a wire of any kind. But those high speeds are possible only if the rest of the system (signal towers, base stations, distributed servers, and the megascale centers that house the data and do a great deal of computing themselves) is physically near enough to these input devices. Having your phone, car, or pacemaker in constant contact with vast computational power in the so-called cloud sounds amazingly untethered and extraterritorial. Yet in its physicality and focus on location, the emerging system is more grounded than the Internet ever was.
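The proximity requirement above can be illustrated with a back-of-envelope propagation-delay calculation. The sketch below assumes the common rule of thumb that light in optical fiber covers roughly 200 km per millisecond (about two-thirds of the speed of light in a vacuum); real networks add processing and queuing delays on top of this physical floor.

```python
# Back-of-envelope propagation delay: why edge servers must sit
# physically close to 5G devices. Assumes ~200 km/ms signal speed
# in optical fiber (a rough rule of thumb, not a measured figure).
FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds, ignoring
    processing and queuing delays (which only add to this floor)."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (10, 100, 1000):
    print(f"{km:>5} km away -> {round_trip_ms(km):.2f} ms round trip")
```

If an application targets millisecond-scale response times, the serving infrastructure cannot be much more than about 100 km away, which is why 5G pushes compute toward towers and base stations.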

Whether control over 5G will be exercised principally by states or companies remains to be seen. But the implications for surveillance, security, and national prosperity are enormous, and yet policymakers and business executives have hardly begun to address them.

The Internet has proved remarkably resistant to state governance. Its use can certainly be shaped by expensive government initiatives such as China's Great Firewall or the European Union's General Data Protection Regulation (GDPR). But multilateral attempts to control the Internet itself have so far failed, mainly because the deliberately impenetrable global Internet community (including Internet service providers and sui generis governance institutions such as the Internet Engineering Task Force and the Internet Society) is dedicated, in the best geek spirit, to avoiding state capture. That posture may change, but for now the community's obdurateness and the jealousies of states, which have kept


Mentoring boosts managed IT and cloud computing business sky high – Lancashire Business View


A Rossendale-based managed IT support business has quadrupled its turnover and added four new staff since receiving strategic mentoring through Boost, Lancashire's business growth hub.

J700 Group has seen turnover quickly rise to a significant six-figure sum, and the business aims to hit £1m within the next 18 months.

Jonathan Cundliffe worked in senior IT positions for more than 25 years before establishing J700 Group in 2015. His wife Deon joined in 2018 to support him as commercial director. The company provides IT, cloud, website design, SEO and communications solutions to businesses across the North West.

After Jonathan approached Boost last year for help to grow the business, he and Deon joined Boost's Growth Mentoring programme, delivered by Orvia. Paul Bury was appointed as their mentor to help focus the company's sales and marketing strategies.

Jonathan said: "The support received from Paul was absolutely brilliant. He went above and beyond at each meeting, and I can say with confidence that we would not be doing as well without his help."

Paul's main objective was to get J700 Group to refocus on its core propositions, define the business aims and achieve more clarity about its approach to service. The mentoring encouraged Jonathan and Deon to adopt a clear unique selling proposition (USP) to set themselves apart from competitors.

Since receiving Boost support, new business includes website projects, several significant software projects, IT support and consultancy services.

At the start of the Boost mentoring, the company employed two people. It now has six people contributing to the business, with plans to grow to a team of 10 over the next three years. J700 Group has also relocated to larger premises to support the rapid growth. The new space is nearly ten times bigger and includes conference and training facilities.

Jonathan said: "I must admit, I was quite sceptical at the beginning. I was worried I'd be letting a stranger into the business and giving up valuable time in my already busy day, thinking that the mentoring wouldn't add any value to the business.

"My opinion changed the instant I met Paul. Not only does he have a wealth of skills, knowledge and experience in business, he is also a really nice guy."

Jonathan is now seeking a sales and business development manager and plans to grow the IT support team, including offering an IT apprenticeship.

Paul Bury said: "J700 Group has excellent growth potential, and what was really pleasing to see was the ideas and insight developed through the mentoring being put into practice. Through positive action, we saw really positive results. Well done to both Jonathan and Deon. J700 has a very exciting future."

Since receiving the mentoring support through Boost, Jonathan has decided to give something back and has joined the mentoring programme to share his specialist skills and help other Lancashire businesses to grow.

Boost is Lancashire's Business Growth Hub and is led by the Lancashire LEP (Local Enterprise Partnership) and Lancashire County Council and supported by funding from the European Regional Development Fund (ERDF).

Boost has received £3.8m of funding from the England European Regional Development Fund (ERDF) as part of the European Structural and Investment Funds Growth Programme 2014-2020. The Ministry of Housing, Communities and Local Government is the Managing Authority for ERDF. Established by the European Union, ERDF funds help local areas stimulate their economic development by investing in projects which support innovation and businesses, create jobs and regenerate local communities. For more information visit https://www.gov.uk/european-growth-funding.


Multi-cloud, more problems: the increasing attack surfaces of multi-cloud adoption – TEISS

In the not too distant past, cloud computing was a concept that IT professionals would spend hours explaining to their board of directors. Now, the ubiquity of the cloud requires no further explanation.

Cloud evolution is revolutionising business operations by employing a network of remote servers hosted on the internet to store and process data instead of relying on on-premise infrastructure.

Utilising a cloud-based IT infrastructure can significantly increase the cost efficiency of an enterprise. It has also been known to facilitate greater agility by allowing systems to operate without the constraints of on-premise infrastructure.

Migrating to the cloud is simple, and there are several rental options on the market that can be scaled according to specific business needs. Indeed, the global public cloud market, or Infrastructure as a Service (IaaS), grew 31.3% during 2018 to US$32.4 billion, according to Gartner.

The cause of this growth may be traced to the rising number of corporations embracing multiple cloud services. Indeed, more than 73% of organisations are using two or more public cloud providers.

Perhaps this is because each vendor specialises in different aspects of cloud computing, from managing and maintaining IT systems to enabling more flexible workflows, while offering automatic updates.

The major cloud service providers are Amazon Web Services, Microsoft Azure and Google Cloud. These services are often preferred because of their intrinsic security infrastructures. While the benefits of utilising the cloud are numerous, there are some significant shortcomings.

The decreased visibility that comes with relying on multiple third-party applications can result in a larger attack surface.

Indeed, the more an application comes into contact with the Internet, the more risk it accumulates. This risk is augmented by the fact that different providers require different services and tools to address unique problems across multiple environments.

Often the tools on cloud-hosting services fall short, especially for console and deployment security affecting enterprise customers.

This has been a contributing factor for several recent cloud breaches. The trap that ensnares many security professionals is assuming that they are completely safe and secure with no need for internal security testing. However, this is of course a myth.

The enterprise that rents virtual space on IaaS also needs to ensure their own safety, especially in environments using multiple cloud providers.

The increasing sprawl of the cloud means there is a dire need for a new approach to cloud security: an approach that helps enterprises address vulnerabilities through one service that can assess threats across a range of platforms from a single pane of glass.

Without a system in place, blind reliance on the cloud could have detrimental consequences as breaches are increasing globally. Security professionals are expected to possess a comprehensive understanding of risk covering all aspects of the cloud.

However, defining the risk posture of your enterprise depends on what type of data is stored and processed in the cloud. Some data is more heavily regulated than others: business intelligence, intellectual property, customers' personal information, internal records, and financial information.

This data is not just valuable to organisations, but also to cybercriminals. If your corporation is putting sensitive data into a public cloud, then you are creating a promising temptation for potential hackers.

For the security professional, implementing and monitoring cloud security means dealing with a mixture of new clouds, new ways of creating and deploying apps, legacy IT, off-premise architecture, shadow clouds and potential cloud sprawl.

Addressing these challenges can be difficult with a security workforce that often lacks operational skills for every scenario.

This creates new problems that go beyond the traditional protection of physical on-premise infrastructure.

The cloud era ushers in a new way of conducting business, with greater efficiency and new complications.

While the cloud increases the speed of conducting business, it also makes it more difficult to monitor cloud assets and detect vulnerabilities. This is complicated by inconsistencies between cloud providers and the capabilities of their security tools, which typically do not interoperate with other cloud services.

In order to practically address cloud security, responsibilities must be shared by both providers and enterprises, with each focusing on the technologies that are within their remit of control.

In IaaS, the cloud provider secures the back-end data centres, networking, servers, and virtualisation, while the enterprise is responsible for protecting cloud workloads such as operating systems, databases, security and applications.

This shared responsibility model puts the onus on the enterprise to protect its own workloads running in public clouds.

Enterprises intent on improving their cloud security posture should prioritise the issues not covered by their service providers. The protection provided by IaaS vendors generally covers only the infrastructure they rent to the enterprise.

While these safety measures comprise an essential element of cloud security, they do not necessarily help customers who have cloud security needs in other areas.

First and foremost, security professionals should secure the control plane, which consists of the enterprise's connections into third-party public clouds.

Security for the public control plane is all about ensuring identity access and networking management for critical applications are correctly configured, thereby reducing the risk of permission malfunction and increasing security across the board.

Secondly, enterprises should focus on securing their data plane, which includes performing security assessment for cloud workload instances (applications) for any vulnerabilities. This means that enterprises should protect any information on the cloud as thoroughly as they would if it were onsite.

Security professionals should deploy a continuous process of automatically assessing cloud environments against security best practices and security violations to recommend steps for remediation.
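The continuous assessment process described above can be sketched as a simple rules engine that evaluates a resource inventory against best-practice checks and emits remediation steps. Everything here is illustrative: the resource fields, rule names, and inventory are invented for this sketch; a real tool would pull the inventory from each cloud provider's own API.

```python
# Hypothetical continuous posture assessment: evaluate a cloud resource
# inventory against best-practice rules and report violations with
# remediation hints. All field names and rules are illustrative.

RULES = [
    ("storage-public-access",
     lambda r: r.get("type") == "storage" and r.get("public"),
     "Disable public access on the storage bucket"),
    ("admin-no-mfa",
     lambda r: r.get("type") == "identity" and r.get("admin") and not r.get("mfa"),
     "Require MFA for administrator accounts"),
]

def assess(inventory):
    """Return a list of findings, one per violated rule per resource."""
    findings = []
    for resource in inventory:
        for rule_id, violates, fix in RULES:
            if violates(resource):
                findings.append(
                    {"resource": resource["name"], "rule": rule_id, "fix": fix})
    return findings

# Example inventory spanning two (imaginary) cloud providers.
inventory = [
    {"name": "backups", "type": "storage", "public": True},
    {"name": "root-user", "type": "identity", "admin": True, "mfa": False},
    {"name": "app-logs", "type": "storage", "public": False},
]
for f in assess(inventory):
    print(f"{f['resource']}: {f['rule']} -> {f['fix']}")
```

Running such checks on a schedule, with one shared rule set across providers, is one way to approximate the "single pane of glass" the article calls for.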

For the modern enterprise, the rush to multi-cloud is bringing huge operational benefits to organisations, and new classes of service for customers and business partners. Yet with these benefits comes a rapidly rising degree of risk due to inherent security vulnerabilities with cloud services.

For IaaS scenarios, responsibility for cloud security posture management and cloud workload protection rests squarely within the enterprise. Cloud providers will not do this for you. In order to ensure perpetual protection, professionals should not become complacent.

It is essential to take security into your own hands, especially when your data is in the hands of others.


D-Wave sticks with its approach to quantum computing – TechCrunch

Earlier this month, at the WebSummit conference in Lisbon, D-Wave and Volkswagen teamed up to manage a fleet of buses using a new system that, among other things, used D-Wave's quantum technology to help generate the most efficient routes. While D-Wave's 2000Q only played a small part in this process, it's nevertheless a sign that quantum computing is slowly getting ready for production use and that D-Wave's approach, somewhat controversial in its early days, is paying off.

Unlike other players in the quantum computing market, D-Wave always bet on quantum annealing as its core technology. This technology lends itself perfectly to optimization problems like the kind of routing problem the company tackled with VW, as well as sampling problems, which, in the context of quantum computing, are useful for improving machine learning models, for example. Depending on their complexity, some of these problems are nearly impossible to solve with classical computers (at least in a reasonable time).

Grossly simplified, with quantum annealing, you are building a system that almost naturally optimizes itself for the lowest energy state, which then represents the solution to your problem.
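The "settle into the lowest-energy state" idea can be illustrated with classical simulated annealing, a loose analogy only: a quantum annealer explores states physically, whereas this sketch random-walks over bit strings. The energy function and its couplings are invented for illustration.

```python
# Classical simulated annealing on a tiny illustrative problem, as an
# analogy for how annealing casts optimization as energy minimization.
import math
import random

random.seed(1)

# Energy of a 4-bit state: mismatched neighbouring bits and a set first
# bit are penalized. The global minimum (energy 0) is all zeros.
def energy(x):
    return (x[0] ^ x[1]) + (x[1] ^ x[2]) + (x[2] ^ x[3]) + x[0]

state = [random.randint(0, 1) for _ in range(4)]
best = state[:]
temp = 2.0
while temp > 0.01:
    candidate = state[:]
    candidate[random.randrange(4)] ^= 1  # flip one random bit
    delta = energy(candidate) - energy(state)
    # Always accept downhill moves; accept uphill moves with a
    # probability that shrinks as the temperature drops.
    if delta <= 0 or random.random() < math.exp(-delta / temp):
        state = candidate
        if energy(state) < energy(best):
            best = state[:]
    temp *= 0.99  # gradually cool the system

print(best, energy(best))
```

The cooling schedule lets the system escape local minima early on, then lock into a low-energy configuration, which is the role the annealing process plays in D-Wave's hardware.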

Microsoft, IBM, Rigetti and others are mostly focused on building gate-model quantum computers, and they are starting to see results (with the exception of Microsoft, which doesn't have a working computer just yet and is hence betting on partnerships for the time being). But this is also a far more complex problem. And while you can't really compare these technologies qubit to qubit, it's telling that D-Wave's latest machine, the Advantage, will feature 5,000 qubits while the state of the art among the gate-model proponents is just over 50. Scaling these machines up is hard, though, especially given that the industry is still trying to figure out how to manage the noise issues.

D-Wave remains the only major player that's betting on annealing, but the company's CEO Vern Brownell remains optimistic that this is the right approach. "We feel more strongly about our decision to do quantum annealing now that there are a few companies that actually have quantum computers that people can access," he said in an interview earlier this month.

"We have customers, Volkswagen included, that have run problems against those other computers and seen what they can actually do, and it's vastly different. Our capability is many orders of magnitude faster for most problems than what you can do with other quantum computers. And that is because of the choice of quantum annealing. And that is because quantum annealing is more robust to errors." Error correction, he argues, remains the fundamental problem and will hamper the performance of these systems for the foreseeable future. "And in order to move into the enterprise or any kind of practical application, that error correction needs to be wrestled with," he noted.


Information overload: The promise and risk of quantum computing – Bulletin of the Atomic Scientists

Google recently announced a breakthrough in quantum computing. Sundar Pichai, Google's CEO, reportedly compared the achievement to the Wright brothers' first flight. Credit: Composite by Matt Field. (Creative Commons photos by Maurizio Pesce and Steve Jurvetson.)

The English philosopher Sir Francis Bacon is often credited with saying knowledge is power. Although Bacon's aphorism is still in circulation, the 16th century thinker clearly didn't predict the advent of the modern-day search engine. Now knowledge is so readily available that information overload, rather, is a problem. Perhaps a more meaningful maxim would be something along the lines of: The ability to sort and process large amounts of knowledge is power. And that ability will be dramatically increased, for good and ill, as researchers make progress in the field of quantum computing.

The international security community in particular has been grappling with the implications of access to vast troves of information. Twentieth century practitioners prioritized scientific efforts that improved technologies such as the surveillance drone to collect data. In the 21st century, however, the security community is dealing with the ramifications of those efforts: the need to process the huge amounts of data that drones, satellites, and other technologies can acquire. But in the age of big data and information technologies, practitioners face a challenging new paradigm: Government isn't necessarily at the forefront of development in data processing technology; private industry is. Policy makers must confront the uncomfortable reality that the future of national security now relies on the government's ability to oversee, regulate, and adopt the research and emerging technologies developed by private companies.

Case in point: Google recently claimed to have achieved so-called quantum supremacy, marking an important development in a perennially just-over-the-horizon technology that could dramatically improve the speed at which computers can complete complex tasks. It's also a technology that, if used by adversarial countries, could disrupt important aspects of US national security such as data protection.

Google's claim of quantum supremacy. Google's announcement was another milestone in the international competition to harness data processing technologies like artificial intelligence. Although the significance of the company's accomplishment has been challenged by industry competitors, the announcement at least confirms the steady progress and commitment of private industry leaders to the development of technologies that could have major implications for national security. Quantum supremacy refers to a benchmark indicating that a quantum system can perform a given function faster than a classic computer. Google developed a quantum processor with 53 operational qubits (the principal unit of information in a quantum computer) that successfully completed a computationally intensive task in only 200 seconds. Google scientists estimated it would take the most powerful classical supercomputer over 10,000 years to complete the same task.
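The scale of the claimed gap is easy to put in perspective with a quick calculation from the two figures in the article (200 seconds versus an estimated 10,000 years):

```python
# Rough scale of the claimed advantage: 200 seconds on the quantum
# processor versus an estimated 10,000 years classically.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
classical_seconds = 10_000 * SECONDS_PER_YEAR
quantum_seconds = 200
speedup = classical_seconds / quantum_seconds
print(f"speedup factor: about {speedup:.1e}")
```

That works out to a speedup on the order of a billion, though, as the article notes, the estimate for the classical runtime has itself been disputed by competitors.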

At this point, quantum computers are still mostly being built and tested to execute specific and carefully chosen tasks that could, in theory, be executed classically. The technological limitations of quantum computing are such that accomplishing a given task requires a custom-built and custom-programmed quantum computing system. This means that each iterative achievement will likely be specific to the highly particular task each quantum computer was built for, and potentially to the specific method of operation applied. This technical reality, taken in the context of the standing definition of quantum supremacy, implies there may be several announcements by groups claiming that they've achieved this or that quantum supremacy.

The necessary progress to definitively surpass classic computers and achieve universal quantum supremacy, rather than piecemeal quantum supremacy, will require both hardware and software improvements. Even after significant innovation, it is highly unlikely that quantum computers will ever replace classical versions for most day-to-day operations. Rather, clever implementation will see the two working most effectively as complements to each other.

Impact on national security. The most suitable applications for quantum computers are problems with large, multi-dimensional parameter spaces that require the manipulation and optimization of significant numbers of independent variables. Consider the ways important national security information is protected and accessed; in technical terms, the ways it is encrypted and decrypted. Because of the high processing power of quantum computers, modern encryption methods that would take classical computers an impractically long time to break would be rendered useless. The unique physical properties of proposed quantum computers would allow them to seek all possible solutions to an encryption algorithm simultaneously, giving an answer that reflects the probability of each outcome. This ability puts national-security-sensitive data at risk. In fact, hackers are already banking on this possibility, storing intercepted data until some point in the future when it can be decrypted with quantum computers.
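The "all possible solutions simultaneously" description is a popular simplification; a more concrete way to see the threat to symmetric encryption is Grover's algorithm, which searches N key candidates in roughly sqrt(N) quantum steps, effectively halving a key's bit strength. (Public-key schemes such as RSA face the even stronger, polynomial-time threat of Shor's algorithm.) A small sketch of that scaling:

```python
# Grover's algorithm finds a key among N candidates in on the order of
# sqrt(N) quantum steps, so a k-bit symmetric key offers roughly k/2
# bits of security against a quantum adversary.
def effective_bits_after_grover(key_bits: int) -> int:
    """Approximate post-quantum security level of a symmetric key."""
    return key_bits // 2

for bits in (128, 256):
    print(f"{bits}-bit key: 2^{bits} classical guesses, "
          f"~2^{effective_bits_after_grover(bits)} Grover iterations")
```

This is why post-quantum guidance generally favours longer symmetric keys (e.g. 256-bit) while entirely new algorithms are developed for public-key encryption.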

Beyond data processing and manipulation, through increased simulation and computation capabilities, quantum computers could help advance a number of scientific fields, including materials sciences. From improving drone battery-life to solving military logistics issues, researchers are predicting any number of national security applications for quantum computing.

More broadly, quantum computers could augment other emerging information technologies in the security field, like artificial intelligence and machine learning. Likewise, developments in these fields would also lead to an acceleration of quantum computer research. The interconnectedness of these emerging information technologies, and the fact that an improvement in any one of the technologies in the competition for big data primacy would accelerate the others, means it will be critical that governments monitor all information technology research and identify likely coevolution trajectories in order to secure data and infrastructure necessary for national security.

Google's achievement signals another inflection point: national governments are falling behind private industry as the leading developers of military-relevant technology. Unlike the historical technologies that revolutionized national security, quantum computing research is being driven by robust private industries in both China and the United States, the two countries most frequently engaged in what's sometimes called the quantum computer arms race.

Although this shift in innovation influence may not necessarily be a bad thing (it could prevent broadly relevant technologies from being siloed within the military community), it does require that military and national security practitioners adjust their technology development strategies, including addressing the economic and skill barriers to adapting civilian technologies for military applications.

It's also worth noting that the focus on civilian development of quantum computers might lead to asymmetric capabilities favoring offensive (operational and manipulative) over defensive (protective) technologies. For instance, private industry is mostly focused on building quantum computers and much less on developing quantum-safe encryption methods. The result could be an imbalance between offensive and defensive capabilities that would be catastrophic for national security if the scales tip too far. Governments must proactively identify and prioritize innovation in areas that are underfunded by the private sector but necessary to maintain national security infrastructure.

The US government's approach. In September 2018, the White House issued a national strategy on quantum information science that included near- and long-term development goals. The high-level overview identified a number of specific priorities set by the federal government, including bolstering the national economic, research, and education infrastructure required for quantum information technology development. It also called for collaborating with private industry and with other countries. Congress then passed the National Quantum Initiative Act to allocate funding for a national strategy that fosters public-private-academic partnerships. To address a key vulnerability of the post-quantum-supremacy world, the US National Institute of Standards and Technology is driving the development of quantum-safe encryption methods, an area that has not received comparable resources from private industry. The agency's timeline suggests that the earliest draft of these encryption algorithms and standards will be completed by 2022.

National security priorities. The controversy over Google's claim to quantum supremacy indicates that the terminology used to discuss quantum computing is weak and vague. Given the hardware and software limits that prevent universal supremacy, the importance of any given instance of quantum supremacy depends on the function the quantum computer can perform. Striking a balance between proactive and reactive responses to new developments will require government project leaders to accept that (at least for now) universal quantum computers capable of performing a wide variety of functions are likely decades away. National security thinkers should instead identify which specific types of supremacy will have meaningful impacts. A machine that can perform decryption faster than a traditional computer, for instance, would be highly disruptive.

Google's announcement, and quantum computing writ large, must be considered as part of the broader big-data competition. The development of different emerging information technologies, and their respective impacts on national security, must not be considered in isolation. A significant development in any one of the rising technologies will likely have a domino effect and trigger innovation progress in other areas. The promise of quantum computing is the vast new knowledge it will unlock.

That's also the risk.

Read more:
Information overload: The promise and risk of quantum computing - Bulletin of the Atomic Scientists

Read More..

Dell Technologies on democratising 5G and the future of quantum computing – ZDNet

Michael Dell said he would like to think his company has been a force for good in terms of democratising access to technology and making it more available to everyone.

Speaking with media this week during the Dell Technologies Summit in Austin, the CEO, alongside John Roese, president and CTO of products and operations, said it's important that tech isn't reserved for the elite.

"The lever that you can pull -- it has always worked -- is broad availability to the technology and so something like 5G, our aspiration, we're doing a lot of work right now. Michael specifically, we're trying to basically bend the curve on the economics of 5G by aggressively moving towards virtualisation and simplification," Roese added.

"The net result of that is If we can drive the economic model so that we can flatten that, make it less of a premium product for only the elite, but make it available to everybody -- that's obviously good for us and good for the industry."

According to Roese, it also opens up opportunities for people to change the education cycle.

"Imagine, you know, underdeveloped environments, or even populations that are literally being able to do holographic or AR-based experiences at a cost-effective level -- it changes the curve," he said.

"I was on the board when they got One Laptop per Child in the 2000s, and the whole fact of making children literate, who couldn't even read and write with a piece of technology was because we drove the cost of compute way down, we made it generally available."

Roese said Dell Technologies' goal is not to create technology for five people, in a unit volume of three, rather it's to make it available everywhere.

"And the way that we do it is standardisation, basically making it easy to consume, driving the cost out of it and making it accessible," he said "That lever spawns the innovation cycle that can actually change things like poverty, change literacy rates, and we have good evidence that when that happens, that's exactly what occurs.

"And this next cycle, trust me, we have no other goal, than broad adoption of these technologies."

When asked during a media session what Dell Technologies was doing in the quantum computing space, Dell said "it could go either way".

"We believe the physics are sound, and something will happen in the quantum world that will be a disruption," Roese said, clarifying the company's position. "There are three conditions that have to be true before any kind of adoption."

The first, he said, is an industry-wide agreement on a quantum computing architecture, which is yet to happen with sufficient scale; the second is that quantum computing has to be made to work in the real world.

"We have huge activity going on in the industry around trapped ions, trapped charged particles, trapped photons, that work has not been done -- it is too esoteric to do," he continued.

The third is the development of a software framework and how quantum will be experienced.

"The good news is all three of those are happening, we're working with most of those companies -- I just did a bit of a tour a couple of weeks ago with most of the quantum startups in the world and they are basically on a journey that over the next, let's say five years, we will start to see incremental breakthroughs. They will be very, very narrow -- kind of like the equivalent of like vacuum tube era of technology is what's happening now," Roese said.

See also: Australia's ambitious plan to win the quantum race

Roese pointed to Google's recent announcement, and said the only thing the search giant's "breakthrough" did was create a random number generator, which is something that's never been achieved in a classical computer.

"Now it's not usable for anything yet, but those kind of breakthroughs will happen, but it will happen over a long cycle," he said.

"We are observing, we are engaged, we think this will manifest as an accelerator in the cloud that you'll do certain mathematical functions -- it will not replace your generalised compute infrastructure, probably ever, but it will be interesting over time.

"We're watching it closely, we're involved in it, but if you're worried about changing your entire IT architecture and your strategy and your investment portfolio because of quantum -- don't do that. we will let you know -- my commitment to Michael is I'll give him two to three years notice before he has to decide to do R&D in this space and that's not happening."

Asha Barbaschow travelled to Dell Technologies Summit as a guest of Dell Technologies.

Read this article:
Dell Technologies on democratising 5G and the future of quantum computing - ZDNet

Read More..

How Serious Is the Threat of Quantum Computing to Crypto? – Finance Magnates

The science of quantum physics is being used to build quantum computers: powerful machines that have the ability to solve incredibly complex mathematical equations much more quickly than even the most advanced computers available today.

As such, any data that's encrypted using mathematical equations (including banking data, intelligence data protected by the government, and encrypted messages on cell phones) is vulnerable to being exposed by quantum computing. Most notably in this case, encrypted cryptocurrency data, such as private wallet keys, is also vulnerable to quantum computing technology.


In other words, quantum computing could potentially be used to uncover every private key on a blockchain network, thus rendering that network's users vulnerable to hacking and theft.

Therefore, the point in time at which quantum computers can solve problems that ordinary computers cannot, also known as quantum supremacy, is considered a serious threat to the security of blockchain networks.

How far away are we from this quantum supremacy?

"Last week," joked Kadan Stadelmann, CTO of Komodo, a multichain architecture project, to Finance Magnates.

"In any case, jokes apart, from a technical standpoint we have to consider the quantum supremacy era already here now. The industry leaders in this area have already publicly presented functional two- to three-figure qubit chips, which means with unlimited resources and space this could be scaled up quite fast.

"Google, for example, just presented how their 54-qubit chip performed, in just 200 seconds, a computation that would take the world's most powerful supercomputer 10,000 years. This doesn't even cover the non-public segment of this industry."

What are qubits? While a classical computer uses bits, each of which is either a 0 or a 1, qubits can be 0, 1, or both at once. Qubits are what make quantum computers so powerful: if a normal computer is operating with four bits, it can arrange those bits in any of 16 combinations, one after the other, in order to solve an equation.

With four qubits, however, a quantum computer can represent all 16 combinations at one time. According to Decrypt, just 20 qubits can store more than a million values in parallel, which lets a quantum computer work through a problem by performing calculations in parallel rather than one at a time.
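The article's figures follow directly from the exponential growth of the state space: n qubits span 2^n basis states at once, while n classical bits hold just one of 2^n values at a time. A minimal check of the arithmetic:

```python
# n bits or qubits have 2**n possible configurations; a quantum register can
# hold an amplitude for every one of them simultaneously, a classical register
# only one configuration at a time.

def state_space(n: int) -> int:
    """Number of distinct configurations of n bits (or basis states of n qubits)."""
    return 2 ** n

assert state_space(4) == 16          # four bits/qubits: 16 combinations
assert state_space(20) > 1_000_000   # "just 20 qubits" exceed a million values
```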

However, Vlad Miller, CEO of the Ethereum Express company, explained that blockchain network ledgers are not susceptible to hacking by quantum computers.

"Today, records of all cryptocurrency transactions are stored on blockchain. Since the copies of the data are distributed among all users, they are almost impossible to change," he said. "No data block can be removed or modified without affecting all other blocks, which would require the consent of most network users. In this sense, blockchain is resistant to quantum computers, and the growth of computing power will not affect the security of the system."

Indeed, the threat posed by quantum computers is more likely to concern the vulnerability of personal cryptocurrency accounts or wallets. These powerful computers can hack user codes that are used to authorize transactions.

"[...] Until recently, this was considered mathematically impossible," Miller continued. "An ordinary binary computer is not able to crack a cryptocurrency key, but for quantum machines this is not difficult because of their incredible computing power."

Charles Phan, CTO of cryptocurrency derivatives exchange Interdax, also pointed out to Finance Magnates that the SHA-256 function used in mining is another area where quantum computers could influence bitcoin and crypto. However, the risk is much lower here.

An adversary with a quantum computer could also direct it toward mining bitcoin: if they achieve 51% control of the hash power, they can permit double-spending and do what they like. They could also damage the network while controlling a significant proportion below 51%.
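Phan's point about SHA-256 mining can be made concrete with a toy proof-of-work loop. This is only an illustrative sketch: real bitcoin mining double-hashes an 80-byte block header against a 256-bit target, but the principle is the same: whoever can evaluate the hash fastest wins the race.

```python
# Toy proof-of-work in the style of bitcoin mining: find a nonce such that
# SHA-256(header + nonce) starts with a given number of zero hex digits.
# A quantum miner that evaluates the hash faster (e.g., via Grover-style
# search) would win this race disproportionately often.

import hashlib

def mine(header: bytes, difficulty: int) -> int:
    """Return the first nonce whose hash has `difficulty` leading zero hex digits."""
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(header + str(nonce).encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce
        nonce += 1

# Small difficulty: quick to find, and trivially cheap for anyone to verify.
nonce = mine(b"block-header", difficulty=4)
```

Verification is a single hash, which is why honest nodes can check work that took the miner thousands of attempts to produce.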

How immediate is the threat of quantum computing to blockchain networks, and what can these networks do to protect themselves?

KVB PRIME Gains Key UK Influence by Sponsoring Major Finance ConferenceGo to article >>

"Eventually, quantum computers will force changes in cryptocurrency systems, but today's quantum computers exist only in the lab and are a long way from having practical impacts on cryptocurrencies," said Edward Felten, co-founder and chief scientist of Offchain Labs, a second-layer scaling solution for Ethereum, to Finance Magnates. Felten is also a former Deputy US Chief Technology Officer at the White House.

"Over time, cryptocurrencies will need to evolve to use different encryption and hashing methods that are quantum-resistant. Crypto communities should be talking about how to do this, but it's not yet an emergency."

At what point will it become an emergency? Vlad Miller said that although the United States National Security Agency predicted in 2016 that decades would pass before quantum computers posed a serious threat to encrypted information, the pace of the technology's development over the past year and a half makes the NSA's conclusions seem overly optimistic. In fact, many experts believe the threat will arrive within the next 15, or even 10, years.

Therefore, a number of blockchain projects are already preparing for this quantum supremacy.

Kadan Stadelmann explained that while "[quantum computers] certainly pose a long-term threat to most cryptocurrencies in their current form, the more legitimate projects are already deploying countermeasures in the form of quantum-proof cryptography."

"Even already existing cryptocurrency networks will be able to migrate to quantum-resistant algorithms and digital signature schemes," he said.

"Therefore, by the time quantum computers are available to the wider public, expect most remaining cryptocurrencies to have already made the leap into the quantum-resistance movement (so to speak)."

However, this may be easier said than done. "Metaphorically [speaking], pre-existing chains can already flip a switch and immediately deploy a quantum-resistance mechanism," Stadelmann said to Finance Magnates. "However, networks that have historically found it difficult to reach consensus on important software upgrades (for example, the Bitcoin network) could potentially run into trouble."

"If a quantum computer breaks the cryptography used by Bitcoin, anyone who has re-used a bitcoin address is vulnerable," Charles Phan explained. "However, the information a quantum computer needs isn't available until the first transaction from a public key is seen, so individuals who use a different public key each time would be safe."
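Phan's observation rests on how addresses are derived. A bitcoin address is, roughly, a hash of the public key (in reality RIPEMD-160 of SHA-256 plus a checksum encoding; the sketch below stands in a truncated SHA-256), so the key that a Shor-style attack would target stays hidden until the first spend reveals it:

```python
# Simplified, illustrative address derivation. The hash hides the public key,
# and quantum algorithms like Shor's attack the key, not the hash, so an
# unspent, never-reused address keeps a layer of protection.

import hashlib

def address_from_pubkey(pubkey: bytes) -> str:
    """Hash-based address: the public key is not recoverable from it."""
    return hashlib.sha256(pubkey).hexdigest()[:40]  # 160 bits, like bitcoin's

pubkey = b"\x02" + b"\x11" * 32  # placeholder for a compressed public key
addr = address_from_pubkey(pubkey)
# Funds sent to `addr` expose only this hash; the public key itself appears
# on-chain only when those funds are spent, which is when key-recovery
# attacks become possible.
```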

Still, quantum processors threaten only the modern generation of cryptocurrencies like bitcoin, Vlad Miller explained to Finance Magnates.

To protect them, users will have to switch to new authentication methods for authorizing transactions in blockchain-based networks.

Indeed, the solution to this problem will be new cryptographic methods resistant to quantum computing. Today many crypto companies are developing a wide range of such solutions. Some are based on long-established mathematical constructions, such as Lamport signatures, Merkle structures, and secret sharing.
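Lamport's scheme, one of the hash-based constructions mentioned above, is simple enough to sketch in full. Its security rests only on the hash function, which quantum search weakens but does not break, unlike the elliptic-curve signatures that Shor's algorithm defeats outright. A minimal, illustrative (not production-grade) one-time signature over a 256-bit message hash:

```python
# Minimal Lamport one-time signature. Each key pair must sign at most one
# message; schemes like Merkle trees extend this to many signatures.

import hashlib
import secrets

def H(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(msg: bytes, sk):
    digest = int.from_bytes(H(msg), "big")
    # Reveal one secret per bit of the message hash.
    return [sk[i][(digest >> i) & 1] for i in range(256)]

def verify(msg: bytes, sig, pk) -> bool:
    digest = int.from_bytes(H(msg), "big")
    return all(H(sig[i]) == pk[i][(digest >> i) & 1] for i in range(256))

sk, pk = keygen()
sig = sign(b"tx: pay 1 BTC", sk)
assert verify(b"tx: pay 1 BTC", sig, pk)
assert not verify(b"tx: pay 2 BTC", sig, pk)
```

The trade-off is size: keys and signatures run to kilobytes, which is one reason migrating existing chains to hash-based schemes is not a trivial switch.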

Therefore, it is quite likely that blockchain networks will be secured against the threat quantum computing poses. "Platforms will change their locks once quantum computers start to move out of the lab," said Matthew Hine, business strategist at Radix, to Finance Magnates.

"But everyone should be very hesitant to publicly publish encrypted information with the expectation that it will be secret forever."

View post:
How Serious Is the Threat of Quantum Computing to Crypto? - Finance Magnates

Read More..

Top 7 most common uses of cloud computing – Cloud …

February 6, 2014 | Written by: Maamar Ferkoun

Categorized: Cognitive


Cloud computing has been credited with increasing competitiveness through cost reduction, greater flexibility, elasticity and optimal resource utilization. Here are a few situations where cloud computing is used to enhance the ability to achieve business goals.

1. Infrastructure as a service (IaaS) and platform as a service (PaaS)

When it comes to IaaS, using an existing infrastructure on a pay-per-use scheme is an obvious choice for companies that want to avoid the cost of acquiring, managing and maintaining their own IT infrastructure. Organizations turn to PaaS for the same reasons, while also seeking to speed up development on a ready-to-use platform for deploying applications.

2. Private cloud and hybrid cloud

Among the many incentives for using cloud, there are two situations in which organizations look to a cloud (specifically a public cloud) to assess applications they intend to deploy into their environment. While test and development use may be limited in time, a hybrid cloud approach allows application workloads to be tested in a ready-made environment, without an initial investment that would have been wasted had the workload testing failed.

Another use of hybrid cloud is the ability to expand during periods of peak usage, which is often preferable to hosting a large infrastructure that might seldom be used. An organization gains the additional capacity and availability of an environment when needed, on a pay-as-you-go basis.

3. Test and development

Probably the best scenario for the use of a cloud is a test and development environment. Traditionally, this entails securing a budget and setting up your environment through physical assets, significant manpower and time. Then comes the installation and configuration of your platform. All this can extend the time it takes for a project to be completed and stretch your milestones.

With cloud computing, there are now readily available environments tailored for your needs at your fingertips. This often combines, but is not limited to, automated provisioning of physical and virtualized resources.

4. Big data analytics

One of the benefits of leveraging cloud computing is the ability to tap into vast quantities of both structured and unstructured data in order to extract business value.

Retailers and suppliers now extract information derived from consumers' buying patterns to target their advertising and marketing campaigns at particular segments of the population. Social networking platforms provide the basis for analytics on behavioral patterns that organizations use to derive meaningful information.

5. File storage

Cloud offers the possibility of storing, accessing and retrieving your files from any web-enabled interface. The web services interfaces are usually simple, and at any time and place you have high availability, speed, scalability and security for your environment. In this scenario, organizations pay only for the amount of cloud storage they actually consume, without the worry of overseeing the daily maintenance of the storage infrastructure.

There is also the possibility of storing the data on or off premises, depending on regulatory compliance requirements. Data is stored in virtualized pools of storage hosted by a third party according to the customer's specification requirements.

6. Disaster recovery

This is yet another benefit of using cloud: a cost-effective disaster recovery (DR) solution that provides faster recovery from a mesh of different physical locations, at a much lower cost than a traditional DR site with fixed assets and rigid procedures.

7. Backup

Backing up data has always been a complex and time-consuming operation: maintaining a set of tapes or drives, manually collecting them, and dispatching them to a backup facility, with all the problems that might occur between the originating and backup sites. This approach is not immune to issues such as running out of backup media, and loading the backup devices for a restore operation takes time and is prone to malfunctions and human error.

Cloud-based backup, while not a panacea, is certainly a far cry from what backup used to be. You can now automatically dispatch data to any location across the wire with the assurance that neither security, availability nor capacity is an issue.

While the list of cloud computing uses above is not exhaustive, it certainly gives an incentive to use the cloud, compared with more traditional alternatives, to increase IT infrastructure flexibility as well as to leverage big data analytics and mobile computing.

Read the original post:
Top 7 most common uses of cloud computing - Cloud ...

Read More..