
Visualizing the 5 Pillars of Cloud Architecture – The New Stack – thenewstack.io

Dan Lawyer

Dan Lawyer, chief product officer at Lucid Software, is passionate about creating value by solving problems in delightful ways. Prior to Lucid, he led product and design organizations at Adobe, Ancestry and Vivint.

Getting the most value out of an organization's cloud infrastructure can be a daunting task. But the key considerations can be winnowed down to an easy-to-remember acronym: CROPS, which stands for cost optimization, reliability, operational excellence, performance efficiency and security. (Sometimes, the five cloud pillars are called CORPS, which is the same thing, just in a different order.)

These five pillars are proven guidelines through which companies can design, evaluate and implement cloud architecture in a way that can most effectively scale, ensuring compliance with the relevant standards and saving money over time. And one of the best ways to implement CROPS principles is with real-time cloud visualization. A complete, real-time understanding of your cloud environment will ensure your resources are best utilized and your CROPS get the attention they need.

There are many different aspects of cloud computing that can get costly, such as infrastructure, downtime and staffing. The way to get the most value out of your cloud infrastructure and minimize costs is to eliminate unused components and refine suboptimal processes. This begins by knowing what you're paying for; in other words, knowing exactly what's in your cloud infrastructure.

Companies that want to know the content of their cloud infrastructure pour resources into an effort to visualize that infrastructure. This is where cloud visualization can play a critical role. The right cloud visualization solution will work seamlessly with your Amazon Web Services, Google Cloud Storage or Azure cloud environments to build an inventory of your cloud components. Then through automation, a resulting diagram can allow you to see all the relationships between resources in your current cloud environment, making it easy to identify where costs can be cut.
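To make the inventory step concrete, here is a minimal sketch of the kind of discovery such a tool automates, written against the AWS SDK for Python (boto3). The region, the resource types collected and the output format are assumptions for illustration, not a description of how any particular visualization product works.

```python
# Minimal sketch: building a simple cloud resource inventory with boto3.
# Region, fields and output format are illustrative assumptions.
import boto3

def build_inventory(region="us-east-1"):
    """Collect a basic inventory of EC2 instances and their attached EBS volumes."""
    ec2 = boto3.client("ec2", region_name=region)
    inventory = []
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            inventory.append({
                "id": instance["InstanceId"],
                "type": instance["InstanceType"],
                "state": instance["State"]["Name"],
                "volumes": [m["Ebs"]["VolumeId"]
                            for m in instance.get("BlockDeviceMappings", [])
                            if "Ebs" in m],
            })
    return inventory

if __name__ == "__main__":
    for item in build_inventory():
        print(item)
```

A visualization layer would then turn records like these into the nodes and edges of a diagram.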

When engaging in a cost optimization exercise, such as analyzing opportunities for cost reduction, it's also important to remember that time is money. For example, if something is misconfigured, gets hacked or malfunctions, that could cause costly downtime. Cloud visualization will allow you to compare the intended state of your cloud with its current state through filters. This way, you can more quickly identify the malfunctioning areas and address them without significant downtime.
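For a flavor of that intended-versus-current comparison, here is a minimal drift-check sketch in Python; the resources and settings are hypothetical stand-ins for whatever configuration export your tooling provides.

```python
# Minimal sketch of comparing an intended cloud state with the observed state,
# the kind of drift a visualization filter surfaces. The dictionaries below are
# illustrative assumptions standing in for exported configuration.
intended = {"web-sg": {"port_443_open": True, "port_22_open": False},
            "db-01":  {"encrypted": True, "multi_az": True}}
observed = {"web-sg": {"port_443_open": True, "port_22_open": True},
            "db-01":  {"encrypted": True, "multi_az": False}}

def drift(intended, observed):
    """Yield (resource, setting, expected, actual) for every mismatch."""
    for resource, settings in intended.items():
        for key, expected in settings.items():
            actual = observed.get(resource, {}).get(key)
            if actual != expected:
                yield resource, key, expected, actual

for resource, key, expected, actual in drift(intended, observed):
    print(f"{resource}.{key}: expected {expected}, found {actual}")
```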

If you understand your cloud infrastructure, you can more confidently ensure your customers can rely on your organization. With the ability to constantly meet your workload demands and quickly recover from any failures, your customers can count on you to consistently meet their service needs with little interruption to their experience.

A great way to increase reliability in your cloud infrastructure is to set key performance indicators (KPIs) that allow you to both monitor your cloud and alert the proper team members when something within the architecture fails. Using a cloud visualization platform to filter your cloud diagrams and create different visuals of current, optimal and potential cloud infrastructure allows you to compare what is currently happening in the cloud to what should be happening.

When you can quickly identify and fix problems, you'll be able to maintain uptime and establish ongoing reliability.

Striving for operational excellence means creating an environment for your cloud to always function at its best, and this includes continuous improvement. If you neglect to upgrade products or processes to help your cloud environment function at higher levels, you put a ceiling on the levels to which your business can ascend.

It's essential to do constant research to see where and how you can improve your cloud infrastructure and environment. However, improvement doesn't have to be a massive overhaul. Keep improvements small and continuous to balance the need for upgrades while minimizing downtime. One way to help identify these opportunities for improvement is through a cloud visualization platform that allows real-time discussion about improving your cloud environment.

Cloud visualization can enable different presentations of your environment to understand different scenarios. For example, you may need planning architecture designs for communication with engineers, architects and coders. On the other hand, you may need easy-to-understand, simplified diagrams for nontechnical stakeholders from whom you need buy-in. A high-quality cloud visualization solution should be able to automatically generate these different views.

Many factors can impact cloud performance, such as the location of cloud components, latency, load, instance size and monitoring. If any of these factors become a problem, it's essential to have procedures in place that keep the impact on performance to a minimum. For example, if you have cloud components in different locations, a malfunction in one region shouldn't lead to severe downtime and service disruption throughout your whole cloud environment.

The ability to analyze horizontal and vertical scaling structures is invaluable. Ask yourself: How much do we have in our cloud infrastructure? Where is each component and are they working best where they currently reside? When companies can access a comprehensive, dynamic view of their entire cloud environment, they will better understand where each dependency is. You can visualize your auto-scaling groups and compute instance sizes, availability zones and relationships between resources. Then you will be better able to decide how you need to adjust your cloud infrastructure to improve performance.

Each cloud architecture must be able to protect the confidentiality and integrity of your information, systems and assets. A robust and proactive approach to security will also ensure that your organization maintains compliance with all government regulations and cloud security standards, such as the General Data Protection Regulation, System and Organization Controls 2, the Payment Card Industry Data Security Standard and the Health Insurance Portability and Accountability Act. Companies may face disruptive technical debt if they don't realize the need to meet these standards until after deployment.

This is another place where cloud visualization can play a critically valuable role. With real-time visuals, you can stay on top of your cloud security, cloud compliance and internal best practices by visualizing and overlaying your metadata in the context of your diagram. Such metadata may include instance names, security groups, IP addresses and more.

For example, you can develop categories to visualize your cloud based on sensitivity levels and mechanisms such as encryption, tokenization and access control. Additionally, you can use cloud visualization to document where data is stored and how it's transmitted.

Lastly, you can set up conditional formatting in your cloud visualization solution that allows easy identification of security issues, such as unencrypted databases, instead of spending an extended amount of time searching for these issues in your cloud. Conditional formatting can also be valuable for the other pillars, such as determining which resources are underperforming or which cloud components are wasting spend.
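As a rough sketch of what such a rule might look like outside any specific product, the following snippet uses boto3 to flag unencrypted RDS instances and EBS volumes; the services checked and the idea of highlighting the findings on a diagram are assumptions for illustration.

```python
# Minimal sketch of a "conditional formatting" style security check: flag
# unencrypted RDS databases and EBS volumes so they can be highlighted.
# Illustrative assumption only, not how any visualization product works.
import boto3

def find_unencrypted(region="us-east-1"):
    findings = []
    rds = boto3.client("rds", region_name=region)
    for db in rds.describe_db_instances()["DBInstances"]:
        if not db.get("StorageEncrypted", False):
            findings.append(("rds", db["DBInstanceIdentifier"]))
    ec2 = boto3.client("ec2", region_name=region)
    for vol in ec2.describe_volumes()["Volumes"]:
        if not vol.get("Encrypted", False):
            findings.append(("ebs", vol["VolumeId"]))
    return findings  # e.g. render these resources in red on the diagram

if __name__ == "__main__":
    for kind, resource_id in find_unencrypted():
        print(f"UNENCRYPTED {kind}: {resource_id}")
```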

Each of the five pillars of cloud architecture plays a vital role in optimizing your cloud environment. Following these principles can help avoid wasting time and money. The ability to dynamically visualize your complex cloud infrastructure and enable real-time collaboration within your cloud environment will help you gain clarity and communicate what is needed to adhere to each of these pillars across your organization. Each stakeholder should be left with little question about the current and future state of your organization's cloud environment, making it easier to build and maintain, and to pursue future organizational growth.


See the original post here:
Visualizing the 5 Pillars of Cloud Architecture – The New Stack - thenewstack.io


#ThinkBeforeYouClick: Wasabi’s IT Hero Nate Returns to Warn About the Dangers of Ransomware – PR Newswire

The cornerstone of the campaign is Nate's latest music video, the "Ballad of Ransomware," in which he educates the public about common ransomware pitfalls like "clicking on something stupid," not updating passwords, falling for African prince email scams, or anything online that "seems too good to be true." Nate's signature style is used to break through the monotony of typical cybersecurity training that still has not helped slow down the number of ransomware attacks faced today. According to The Long Road Ahead to Ransomware Preparedness, a new Enterprise Strategy Group (ESG) survey of IT and cybersecurity professionals, 79% of respondent organizations reported having experienced a ransomware attack within the last year.

"Cybercriminals extort millions of dollars every year through ransomware with no signs of slowing down. Any person with a computer is at risk for an attack, and that leaves businesses vulnerable every day. We need to find a more relevant way to communicate the danger of cybercriminals with employees at all types of organizations," said Julie Barry, Vice President of Global Brand and Communications, Wasabi Technologies. "Through Nate's voice and his fresh approach, we want IT teams to know that Wasabi has their backs. We don't just offer immutable cloud storage, but also educational resources that support their teams in the mission to combat ransomware in their organizations."

"Data is without a doubt a company's most valuable asset, and protecting that data from ransomware is a top priority for me and my team," said Kyle Burnette, Director of IT Infrastructure & Security, BrightStar Care. "What I love so much about Wasabi's #ThinkBeforeYouClick campaign is that it calls out the everyday cyber tricks in a funny, relatable way that holds people's attention and reminds them to think twice about their online interactions."

It's clear that ransomware is not a matter of if, but when. Protect yourself with proven security best practices, including regular ransomware awareness training and a robust backup and recovery strategy with immutable cloud storage. For more resources, visit wasabi.com/thinkbeforeyouclick and wasabi.com/ransomware.

About Wasabi Technologies

Wasabi provides simple, predictable and affordable hot cloud storage for businesses all over the world. It enables organizations to store and instantly access an unlimited amount of data at 1/5th the price of the competition with no complex tiers or unpredictable egress fees. Trusted by tens of thousands of customers worldwide, Wasabi has been recognized as one of technology's fastest-growing and most visionary companies. Created by Carbonite co-founders and cloud storage pioneers David Friend and Jeff Flowers, Wasabi has secured nearly $275 million in funding to date and is a privately held company based in Boston. Wasabi is a Proud Partner of the Boston Red Sox, and the Official Cloud Storage Partner of Liverpool Football Club and the Boston Bruins.

Follow and connect with Wasabi on Twitter, Facebook, Instagram, and our blog.

Wasabi Technologies PR contact:

Kaley Carpenter, Inkhouse for Wasabi, [emailprotected]

SOURCE Wasabi Technologies

View original post here:
#ThinkBeforeYouClick: Wasabi's IT Hero Nate Returns to Warn About the Dangers of Ransomware - PR Newswire


5 tips on how to select the most secure backup solution – Wire19

The need for a backup plan is becoming more important than ever. Keeping your systems up and running without interruption is no easy task. Systems are constantly changing, and data grows daily, with new insights coming out every day, so it makes sense that you would need a backup plan for these changes! With ransomware on the rise, it's not enough to simply have a traditional backup solution in place. Managers must employ new-generation strategies that protect their business's most valuable assets or else risk suffering costly downtime!

The following rules will help managers choose the most suitable cyber backup solution for their organization.

Investing in a modern backup solution can save your business's bottom line and reputation. The most secure data protection solutions give businesses an all-inclusive approach to securing their information, combining cybersecurity and backup capabilities that work efficiently alongside other defenses like cloud storage. This will enable organizations to stay ahead of today's threats by having powerful tools at each stage.

Source: Acronis

Read next: Do the different types of cybersecurity terms confuse you? Know how each of them differs.

View post:
5 tips on how to select the most secure backup solution - Wire19


The limits and risks of backup as ransomware protection – ComputerWeekly.com

Ransomware has pushed backup and recovery firmly back onto the corporate agenda. Without a sound backup and recovery strategy, firms have little chance of surviving a ransomware attack, even if they pay the ransom.

IBM, for example, named ransomware as the leading cyber security threat in 2021, accounting for 23% of all cyber attacks.

This has forced CIOs to revisit their backup and recovery strategies, says Barnaby Mote, managing director at online backup provider Databarracks. "The paradox is that ransomware has brought backup and recovery back into focus," he says. "If you go back five years, it was a hygiene issue, and not on the CIO or CEO agenda. Now it is again."

High-profile attacks against organisations including shipping company Maersk and US oil network Colonial Pipeline have focused attention on the risks posed by this type of cyber attack and prompted organisations to invest in cyber defences.

But ransomware is becoming smarter, with double- and triple-extortion attacks, and techniques that allow the malware to remain undetected for longer. This puts pressure on that other essential defence against ransomware: good data backups.

"The other factor that has changed dramatically is that when you get a ransomware infection, it doesn't always trigger immediately," says Tony Lock, analyst at Freeform Dynamics. "You might find that the ransomware has been in your system a long time before you noticed it, but it's only now they've triggered it and everything's encrypted."

As a result, organisations have to go back further in time to find clean backups, stretching recovery point objectives (RPOs) to the point where the business is put at risk, or its leaders might even feel they must pay the ransom. "How far do you need to go," says Lock, "so that when you're doing a recovery from your copies, you make sure you're not bringing the infection back with you?"

As Lock suggests, when organisations deal with a ransomware attack, one of the greatest risks is reinfecting systems from a compromised backup. Some of the industry's tried-and-tested backup and recovery and business continuity tools offer little protection against ransomware.

Snapshots record the live state of a system to another location, whether that is on-premise or in the cloud. So, if ransomware hits the production system, there is every chance it will be replicated onto the copy.

Conventional data backup systems face the same risk, copying compromised files to the backup library. And malware authors are adapting ransomware so it actively targets backups, prevents data recovery, or immediately targets any attempt to use recovered files by encrypting them.

Some ransomware (Locky and Crypto, for example) now bypasses production systems altogether and goes straight for backups, knowing that this puts the victim at a real disadvantage. This has forced organisations to look again at their backup strategies.

One option is to use so-called immutable backups. These are backups that, once written, cannot be changed. Backup and recovery suppliers are building immutable backups into their technology, often targeting it specifically as a way to counter ransomware.

The most common method for creating immutable backups is through snapshots. In some respects, a snapshot is always immutable. However, suppliers are taking additional measures to prevent these backups being targeted by ransomware.

Typically, this is by ensuring the backup can only be written to, mounted or erased by the software that created it. Some suppliers go further, such as requiring two people to use a PIN to authorise overwriting a backup.
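One widely available way to get write-once behaviour is Amazon S3 Object Lock, sketched below with boto3. The bucket name, object key and 30-day retention period are assumptions, the bucket must have been created with Object Lock enabled, and individual backup suppliers implement immutability in their own ways.

```python
# Minimal sketch of an immutable backup copy using S3 Object Lock in
# compliance mode. Bucket, key and retention period are illustrative
# assumptions; the bucket must already have Object Lock enabled.
from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client("s3")
retain_until = datetime.now(timezone.utc) + timedelta(days=30)

s3.put_object(
    Bucket="example-backup-bucket",             # hypothetical bucket
    Key="backups/2022-05-12/db-dump.tar.gz",
    Body=open("db-dump.tar.gz", "rb"),
    ObjectLockMode="COMPLIANCE",                # cannot be shortened or removed
    ObjectLockRetainUntilDate=retain_until,     # write-once until this date
)
```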

The issue with snapshots is the volume of data they create, and the fact that those snapshots are often written to tier one storage, for reasons of rapidity and to lessen disruption. This makes snapshots expensive, especially if organisations need to keep days, or even weeks, of backups as a protection against ransomware.

"The issue with snapshot recovery is it will create a lot of additional data," says Databarracks' Mote. "It will work, but has a large impact on the storage you need, and there is the cost of putting it on primary storage."

Another way to protect against ransomware is to air gap storage, especially backups. In some ways this is the safest option, especially if the backups are stored off-site, on write-once (WORM) media such as optical storage, or even tape.

"Personally, I like air gaps," says Freeform Dynamics' Lock. "I'd like the backup to be on something that is totally air-gapped: take a copy on tape and put it somewhere. Preferably with logical and physical air gaps."

The disadvantage of air gaps, especially physical air gaps with off-site storage, is the time it takes to recover data. Recovery time might be too long to ensure business continuity. And if IT teams have to go back through several generations of backups to find ransomware-free copies, the cost of recovering lost data can be high, maybe even higher than the cost of the ransom.

"Time to restore, at scale, is now key," says Patrick Smith, field CTO, Europe, Middle East and Africa (EMEA) at Pure Storage. "This may mean specific solutions for the business-critical applications that need to be online first."

Suppliers are trying to work round this through virtual air-gapped technology, which allows backups to be stored on faster local (or cloud) storage. But for businesses with the most critical data, it is likely that only fully immutable and air-gapped backups will suffice, even if it is as a second or third line of defence.

However, CIOs are also looking to augment their backup tools with security measures aimed specifically at ransomware.

Perhaps the greatest risk to an organisation with a solid backup policy is unwittingly re-infecting systems from ransomware hidden in backups.

Firms need to put measures in place to scan backups before they restore to a recovery environment, but again this takes time. And malware authors are adept at hiding their trails.

Anomaly detection is one route suppliers are exploring to check whether backups are safe. According to Freeform Dynamics' Lock, machine learning tools are best placed to pick up changes in data that could be malware. This type of technology is increasingly important as attackers turn to double- and triple-extortion attacks.

"You need to make data protection, observability and checking for anomalies a continuous process," he says.
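As one concrete example of such a signal (an assumption for illustration, not a description of any vendor's detection engine), a sharp rise in byte entropy between backup generations often indicates files have been mass-encrypted, because encrypted data looks close to random:

```python
# Minimal sketch of one anomaly signal for backup scanning: a jump in byte
# entropy between backup generations. Threshold and sample size are
# illustrative assumptions.
import math
from collections import Counter

def entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (maximum is 8.0)."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_encrypted(path: str, threshold: float = 7.5, sample_bytes: int = 1 << 20) -> bool:
    """Sample the first megabyte of a file and flag near-random content."""
    with open(path, "rb") as f:
        return entropy(f.read(sample_bytes)) > threshold

# Usage idea: compare the fraction of high-entropy files in today's backup
# with yesterday's; a large increase is worth alerting on before any restore.
```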

Link:
The limits and risks of backup as ransomware protection - ComputerWeekly.com


Global cloud spending to hit $495bn in 2022, Gartner says – The National

Global spending on public cloud services is expected to jump 20.4 per cent annually to $495 billion this year, as businesses expedite the pace of their digital transformation in the post-Covid era, US researcher Gartner has said.

Total spending is nearly $84bn more than the amount spent in 2021 and is expected to surge nearly 21.3 per cent yearly to almost $600bn next year.

"Cloud is the powerhouse that drives today's digital organisations," said Sid Nag, research vice president at Gartner.

"CIOs [chief information officers] are beyond the era of irrational exuberance of procuring cloud services and are being thoughtful in their choice of public cloud providers to drive specific, desired business and technology outcomes in their digital transformation journey."

For businesses, moving to a cloud system hosted by a specialised company such as Oracle, Amazon Web Services or SAP is more economical than creating their own infrastructure of servers, hardware and security networks, industry experts said. It also brings down the overall cost of ownership.

In overall cloud spending, infrastructure-as-a-service software is forecast to experience the highest end-user spending growth this year at 30.6 per cent. It will be followed by desktop-as-a-service at 26.6 per cent and platform-as-a-service at 26.1 per cent, Gartner predicted.

In the cloud industry, businesses pay only for the specific services or resources they use over a period of time.

The new reality of hybrid work is prompting organisations to move away from powering their workforce with traditional client computing solutions, such as desktops and other physical in-office tools and opt for the latest cloud solutions, the Connecticut-based market researcher said.

In the Middle East and North Africa, end-user spending on public cloud is forecast to reach $5.8bn this year, growing 18.8 per cent year-on-year.

Several global players are establishing data centres in the region as the cloud market picks up.

In 2020, IBM unveiled two data centres in the UAE, making its first foray into the Middle East and Africa cloud storage market. In 2019, Amazon Web Services opened three data centres in Bahrain.

Germany's SAP has centres in Dubai, Riyadh and Dammam, which house servers for local cloud computing clients.

Alibaba Cloud, a comparatively smaller player and the cloud computing arm of the Chinese e-commerce company, opened its first regional data centre in Dubai in 2016.

"Public cloud services have become so integral that providers are now forced to address social and political challenges, such as sustainability and data sovereignty," Mr Nag said.

"IT leaders who view the cloud as an enabler rather than an end state will be most successful in their digital transformational journeys; the organisations combining cloud with other emerging technologies will fare even better," he added.

Updated: May 10, 2022, 4:52 AM

Link:
Global cloud spending to hit $495bn in 2022, Gartner says - The National


Western Digital: The flash roadmap – Blocks and Files

Western Digital execs revealed the company's disk and flash/SSD roadmaps at a May 10 Investor Day event. We covered the disk part of this in a previous article. Here we look at flash.

EVP Robert Soderbery, head of the flash business unit, talked about growing the capacity of a flash die by increasing the layer count and shrinking the lateral dimensions of a cell. The latter means that more cells can fit in a layer and thus fewer layers are needed to reach a set capacity level. A slide showed a 10 percent increase in lateral density through a 40 percent reduction in cell size.

President of Technology Siva Sivaram used a slide showing that Western Digital's 162-layer NAND, at a 1Tbit x4 die capacity and 100TB wafer capacity, had a 68mm² die size compared with Kioxia's and Seagate's 69.6 and 69.3mm², and they are building 176-layer NAND.

Sivaram said Western Digital's charge trap NAND cell had a 40MB/s program performance vs competitors' 60MB/sec. He presaged 200+ layer 3D NAND coming, calling it BiCS+. We have previously understood this to be 212 layers, called BiCS 7. A Sivaram slide showed what looked like string-stacking, called multi-bonding, and penta-level cell (PLC, 5 bits/cell) technologies coming.

BiCS+ will have 55 percent more bits/wafer than BiCS 6 (162-layer), a 60 percent better transfer speed and 15 percent more program bandwidth.

Sivaram's 3D NAND roadmap showed a route to 500+ layers in 2032.

Western Digital makes three main classes of SSD: consumer, client and cloud (enterprise), along with automotive and IoT drives, using its own NAND, controllers and firmware. It has a 37 percent share of the consumer SSD market, 20 percent of the client market, but only 8 percent of the cloud market. Soderbery wants to get that cloud market share higher, to 16 percent, and says the cloud SSD market is separating into three segments: compute (for cache and direct access), storage (capacity-optimized), and boot and journaling (endurance-optimized).

BiCS 4 (96-layer) was good for storage (TLC, 3 bits/cell) and boot segments, and BiCS 5 (112-layer) was good for storage (TLC and QLC) and boot. BiCS 6 (162-layer) will be good for compute, storage (TLC, QLC), and boot.

Soderbery sees a significant consumer SSD opportunity as flash replaces 2.5-inch disk. In 2022, 62 percent of consumer drives were disk and 38 percent were SSDs. In 2026, that is forecast to have changed to 30 percent disk and 70 percent SSD. He thinks there is a 100EB opportunity in this disk-to-SSD transition, with consumer SSDs having a greater than 45 percent CAGR from 2022 to 2025.

Overall, WD has a 14-16 percent SSD market share target. Wells Fargo analyst Aaron Rakers noted that WD is forecasting flash capacity shipped in the cloud to grow at ~37 percent year-on-year from 2022 to 2027, and told subscribers: "This is where WD's qualification/ramp of their NVMe SSDs is a key focus for the company."

That is probably Soderbery's key goal: get cloud/enterprise NVMe SSD sales up while not forgoing growth in the consumer and client markets. WD is betting that its cell density and layer count advantages will translate to better price/performance and so enable it to win share, grow its business and, maybe, fend off activist investor Elliott Management.

Read more from the original source:
Western Digital: The flash roadmap – Blocks and Files


Frustrated with your company's cloud obsession? What goes around, comes around – ComputerWeekly.com

Many of you reading this will be familiar with the experience of having to bite your lip as you watch your organisation pursue a "cloud first" agenda in an almost religious manner. You know that the obsession with moving everything into the public cloud as quickly as possible is blinkered and misguided, but you go along with it because everyone seems so committed and you don't want to look like a naysayer.

This kind of experience came up during a recent briefing with Jeff Denworth, CMO and Co-Founder of VAST Data. If you're not familiar with VAST, it's a company focused on the area of extremely high-volume storage management. Its solutions were originally designed to address the needs of customers who want cloud-like scalability, flexibility and ease of use, but delivered via an on-prem infrastructure. As Denworth says: "Our customers are generally dealing with upwards of 5PB of data, and you need to think differently when working at this level."

We'll get into that "need to think differently" thing in future discussions, but if you're aching to learn more right now, check out the VAST website, where you can geek out on the company's disaggregated and shared everything architecture.

Back to the current discussion, we asked Denworth to describe his ideal customer, to which he responded: "Apart from having a need to store and manage data at scale, it's mostly about timing. Our proposition resonates particularly well with customers that have been aggressively pursuing a cloud migration strategy, have gained enough experience to figure out that it's not the Nirvana they thought it would be, but haven't yet got around to letting their storage specialists go."

As an aside we chatted about cloud evangelists who seem to build their careers by going from company to company, encouraging them to shift everything to the cloud while downsizing internal IT, then moving on to their next job just before all of the problems with an obsessive cloud approach become obvious.

How much this happens in exactly this manner is debatable, but we've been tracking the way in which challenges accumulate as the number of cloud services proliferates for over a decade. Put this together with the way in which activity is becoming even more distributed, and many IT teams are seeing the cost, risk and other issues escalate even further.

Does this mean that public cloud services are inherently bad news? Of course not; it's just that cloud adoption should be regarded as a potential means to an end, rather than an end in its own right. Our advice is always to focus on service delivery objectives, and when you do this you generally end up with some kind of hybrid/multi-cloud approach, a topic we'll be publishing some new research on soon (watch this space).

In the meantime, there's now enough experience out there to provide the insights necessary to challenge the cloud crusaders and inject a little more rationality into the discussion. And players like VAST and others are consistently demonstrating that it's nowadays possible to build on-premise systems that can operate reliably, securely and cost-effectively at extreme scale.

See the article here:
Frustrated with your company's cloud obsession? What goes around, comes around - ComputerWeekly.com


D-Wave Deploys Advantage Quantum Computer Accessible in Leap Cloud Service – HPCwire

PALO ALTO, Calif. & BURNABY, B.C., May 12, 2022 D-Wave Systems Inc. (the Company), a leader in quantum computing systems, software, and services, and the only company building both quantum annealing and gate-based quantum computers, today announced the availability of the first Advantage quantum computer, accessible via the Leap quantum cloud service, physically located in the United States. The cloud-based service is part of the USC-Lockheed Martin Quantum Computing Center (QCC) hosted at USCs Information Sciences Institute (ISI), a unit of the University of Southern Californias prestigious Viterbi School of Engineering. Among the highlights:

Through QCC, USC has been a pioneering academic institution in the hosting and operating of a commercial quantum system and is a world leader in research and development of advanced information processing, computer and communications technologies. USC has been working with D-Wave since 2010 and has housed several generations of earlier D-Wave systems with the first one installed at the QCC with Lockheed Martin.

"Making quantum computing ubiquitous and available is one of our core areas of focus and is central to the commercialization of quantum computing," said Alan Baratz, CEO of D-Wave. "This is an important moment for our U.S.-based customers who want their Leap cloud access to the newest Advantage system and quantum hybrid solver service to be in-region. The timing is especially important. Eleven years ago, together with Lockheed Martin, we installed our first quantum system at USC. Fast forward to today, delivering the most performant commercial quantum computer in the world yet again allows users to harness the power of annealing quantum computing for real-world optimization problems, all accessible real-time through our Leap quantum cloud service and in AWS's Amazon Braket."

"Quantum computing is a constantly evolving field and it's important that our customers have access to the latest quantum hardware," said Richard Moulds, General Manager of Amazon Braket at AWS. "By adding support for a third quantum system from D-Wave to Amazon Braket, all customers now have on-demand access to even more hardware options. Furthermore, U.S.-based customers have the added benefit of using a device located in California, making it possible for them to conduct research using D-Wave hardware in-region."

"Quantum information science (QIS) is a top priority research area for the nation and has long been a focus of USC Viterbi," said Yannis C. Yortsos, Dean of the USC Viterbi School of Engineering. "In collaboration with Lockheed Martin, we established at ISI in 2011 the first academic home for a quantum computing system, namely D-Wave One. For more than a decade, research and education in QIS at USC Viterbi has been thriving and constantly growing."

"For more than 12 years, Lockheed Martin has been proud to support advanced practical quantum computing, putting the technology in the hands of people who can make the most of it," said Greg Tallant, Lockheed Martin Fellow. "Lockheed Martin is a leader in quantum computing applications development, and the Advantage system at QCC furthers our 21st Century Security vision."

"The D-Wave annealing quantum computer provides a four-fold increase in the number of qubits from our previous system, as well as increased coherence and other performance metrics," said Daniel Lidar, holder of the Viterbi Professorship of Engineering at USC, and the scientific and technical director of QCC. "We have great hopes for the new system as we explore coherent quantum annealing to achieve quantum speedups in quantum simulation, best-in-class optimization and machine learning. Some of our first projects will be to investigate speedup over classical optimization methods for hard optimization problems as well as pursuing additional government-funded research for identification and classification of quantum phase transitions."

To date, D-Wave's customers have developed hundreds of early quantum applications in fields as diverse as financial modeling, flight planning, quantum chemistry simulation, automotive engineering, health care, logistics, and more.

Today's announcement marks the opening of the first Advantage quantum system physically located in the United States at the QCC. D-Wave's quantum computers, which have been available to North American users via the Leap quantum cloud service out of British Columbia since 2018, are particularly suitable for solving difficult optimization problems. Optimization use cases are ubiquitous in industry and are interesting because of their computational complexity, and recent research demonstrates that annealing quantum computers will be best suited for optimization use cases both today and into the future.

The upgraded system at USC will be available for enterprises, researchers and government. It will enable businesses to benefit from the commercial use-cases that can be run on the quantum hybrid solver service and enable researchers to continue studying how quantum effects may speed up the solution of complex optimization, machine learning and sampling problems. Moreover, the government now has the most advanced system in the US for tackling key public sector initiatives, including electrical grid resilience, emergency response, and infrastructure optimization projects.
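For readers unfamiliar with how such optimization problems actually reach the machine, here is a minimal sketch using D-Wave's open-source Ocean SDK to submit a tiny QUBO through the Leap service; the problem, the solver selection and the read count are illustrative assumptions, and valid Leap credentials are required for it to run.

```python
# Minimal sketch: submitting a small optimization problem (a QUBO) to a
# D-Wave QPU via the Ocean SDK. Problem and parameters are illustrative.
from dwave.system import DWaveSampler, EmbeddingComposite

# Tiny QUBO: minimize x0 + x1 - 2*x0*x1 (ground states are 00 and 11)
qubo = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): -2.0}

sampler = EmbeddingComposite(DWaveSampler())   # routes to an available QPU
result = sampler.sample_qubo(qubo, num_reads=100)
print(result.first.sample, result.first.energy)
```

Larger, constrained problems would typically go to the hybrid solver service mentioned above rather than directly to the QPU.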

About D-Wave Systems Inc.

D-Wave is a leader in the development and delivery of quantum computing systems, software and services and is the world's first commercial supplier of quantum computers and the only company building both annealing quantum computers and gate-model quantum computers. Our mission is to unlock the power of quantum computing for business and society, today. We do this by delivering customer value with practical quantum applications for problems as diverse as logistics, artificial intelligence, materials sciences, drug discovery, scheduling, cybersecurity, fault detection, and financial modeling. D-Wave's systems are being used by some of the world's most advanced organizations, including NEC Corporation, Volkswagen, DENSO, Lockheed Martin, University of Southern California, Forschungszentrum Jülich and Los Alamos National Laboratory. With headquarters and the Quantum Engineering Center of Excellence based near Vancouver, Canada, D-Wave's US operations are based in Palo Alto, Calif. D-Wave has a blue-chip investor base that includes PSP Investments, Goldman Sachs, BDC Capital, NEC Corp., Aegis Group Partners, and In-Q-Tel.

D-Wave announced in February it has entered into a definitive transaction agreement with DPCM Capital, Inc. (DPCM Capital) (NYSE:XPOA), a publicly traded special purpose acquisition company. Upon closing of the transaction, shares of D-Wave Quantum Inc., a newly formed parent company of D-Wave and DPCM Capital, are expected to trade on the NYSE under the symbol QBTS.

Source: D-Wave

Follow this link:
D-Wave Deploys Advantage Quantum Computer Accessible in Leap Cloud Service - HPCwire


America is Losing the Quantum Race with China | Opinion – Newsweek

You may not have realized, but China has been outpacing America in the race to reach the next frontier of critical national security technology: quantum computing. In October, Chinese scientists unveiled the world's fastest programmable quantum computer, a million times more powerful than Google's most advanced supercomputer. Their technology can accomplish in one millisecond what would take a typical computer some 30 trillion years.

America is finally taking notice. Last Wednesday, President Joe Biden signed two documents, an executive order and a National Security Memorandum, to boost America's quantum capabilities on offense and defense. That means developing our own quantum computing technology, and protecting our key IT infrastructure from quantum attacks by adversaries. If we're not prepared for the eventuality of a quantum cyberattack that could render useless every password and computing device, from the iPhones in our pockets to GPS in aircraft to the supercomputers that process stock market transactions, the national security consequences will be enormous.

Quantum computing, a form of high-speed calculation at the subatomic level conducted at extraordinarily cold temperatures, will bring computers to speeds barely imaginable today. Atoms, photons and electrons that operate beyond the classical laws of physics and in the realm of "quantum" can be harnessed for extraordinary computing power. Complex problems that once took years to solve could take seconds.

And that means everything we know about cybersecurity, every lock secured by current encryption methods, could get blown wide open.

Think of encryption like a math problem. Using modern 256-bit encryption, you have 78 digits' worth of possible combinations to sort through to get the right code and break the digital lock: 115,792,089,237,316,195,423,570,985,008,687,907,853,269,984,665,640,564,039,457,584,007,913,129,639,936 possible combinations, to be exact. Today's hardware and software, using bits, would take millions of years to sort through that many combinations. Quantum bits, or qubits, can be used in parallel to exponentially accelerate a computer's ability to solve algorithms once thought impossible.
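That figure is simply the size of a 256-bit keyspace, which a couple of lines of Python can confirm:

```python
# 2**256 possible keys: the 78-digit number quoted above.
keyspace = 2 ** 256
print(keyspace)            # 115792089237316195423570985008687907853269984665640564039457584007913129639936
print(len(str(keyspace)))  # 78 digits, roughly 1.16e77
```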

It's possible that China's recent quantum advancement claims are exaggerated, but the advent of quantum computing is not a question of "if," but "when." Ransomware attacks routinely make global headlines. Russia is using cyberattacks as weapons of war against Ukraine. But a U.S. adversary unleashing quantum computing into our digital environment would unleash nothing less than a cybersecurity apocalypse, where corporate, government and military secrets are put at risk by technology that could break 256-bit encryption in a matter of hours.

President Biden's recent moves will better coordinate our government's efforts to prevent this nightmare scenario, by bringing federal agencies and critical infrastructure companies together to address quantum threats. It also brings the National Quantum Initiative Advisory Committee under White House control.

Centralizing the government response is important, but a comprehensive approach will have to look beyond Washington as well. We must engage and incentivize America's higher education system to train more quantum engineers. Beyond our own borders, it will be important to work with like-minded nations like Britain, India, Japan and South Korea to share breakthroughs in quantum technology.

America must also harness the full might and ingenuity of its private sector to remain competitive and avoid a repeat of the cyberattacks that have held some of the country's biggest sectors hostage. Cybersecurity experts and chief information officers can start by encrypting data at rest, tokenizing data and micro-segmenting user and system access controls that allow for fewer "backdoor" entry points. And then there's the old-fashioned method of putting more locks on the door, or in this case, encrypting data several times using different algorithms, making the code-breaking process longer and tougher for even quantum computers.
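As an illustration of that layered-encryption idea (a sketch only, not a post-quantum guarantee), the snippet below wraps data in two different authenticated ciphers using the Python cryptography package; key handling and nonce storage are deliberately simplified assumptions.

```python
# Minimal sketch of layered ("cascade") encryption with two different AEAD
# algorithms. Key management, nonce storage and metadata handling are
# simplified assumptions for illustration.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

def double_encrypt(plaintext: bytes):
    k1, k2 = AESGCM.generate_key(bit_length=256), ChaCha20Poly1305.generate_key()
    n1, n2 = os.urandom(12), os.urandom(12)
    inner = AESGCM(k1).encrypt(n1, plaintext, None)        # layer 1: AES-256-GCM
    outer = ChaCha20Poly1305(k2).encrypt(n2, inner, None)  # layer 2: ChaCha20-Poly1305
    return outer, {"k1": k1, "k2": k2, "n1": n1, "n2": n2}

def double_decrypt(ciphertext: bytes, keys: dict) -> bytes:
    inner = ChaCha20Poly1305(keys["k2"]).decrypt(keys["n2"], ciphertext, None)
    return AESGCM(keys["k1"]).decrypt(keys["n1"], inner, None)

ciphertext, keys = double_encrypt(b"classified payload")
assert double_decrypt(ciphertext, keys) == b"classified payload"
```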

Quantum technology will revolutionize our future as much as the internet and atomic weapons did. It holds enormous promise for pharmaceutical development and discovery, climate modeling and artificial intelligence. But we are also careening toward a perilous future: the worst-case cybersecurity scenarios can and will play out if America remains ill-prepared. Our vulnerabilities in securing infrastructure, personal, business and classified data are very real, especially so in an era of great-power competition with adversaries like Russia and China, which boast advanced and aggressive cyber capabilities.

From Pearl Harbor to Sputnik to 9/11, the United States has found itself surprised and outmatched before, and yet it found ways to marshal the unwieldy gears of government, the will of the public and the ingenuity of the private sector to meet those challenges, even if belatedly. The coming dawn of the quantum age is no different. We must capitalize on the momentum kicked off by Biden's executive actions. Our choice is a simple one: to await the devastation of the first cyberattack fueled by quantum decryption, or to build the defenses to stop it.

Theresa Payton is the first and only woman to hold the position of White House Chief Information officer. She served under President George W. Bush from 2006-2008, overseeing IT operations for the President and his staff. She is currently Founder & CEO of Fortalice Solutions, a cybersecurity and intelligence firm that's listed in the Global Cybersecurity Top 500, and author of Manipulated: Inside the Cyberwar to Hijack Elections and Distort the Truth (Rowman & Littlefield).

The views expressed in this article are the writer's own.

Continue reading here:
America is Losing the Quantum Race with China | Opinion - Newsweek


IonQ Launches Native Gate Access, Extends Open-Source Capabilities for Researchers and Developers – StreetInsider.com


COLLEGE PARK, Md.--(BUSINESS WIRE)--IonQ (NYSE: IONQ), a leader in quantum computing, today announced support for specifying quantum circuits in a hardware-native gate format across its systems. Researchers, academic institutions and developers looking for new ways to test, learn and discover real-world solutions can now more precisely and expressively define their algorithms that run on IonQ quantum hardware.

IonQ provides customers with access to its cloud quantum computing platform, the IonQ Quantum Cloud, which allows users to run quantum programs on IonQ's hardware remotely. Customers have the flexibility to define quantum algorithms in whatever format best suits their needs, and the platform's proprietary compilation, optimization and post-processing stack is designed to ensure consistent, high-quality results. However, advanced researchers and developers often need more fine-grained control over each individual gate run on hardware when exploring novel algorithms, solutions and fundamental techniques.

In order to serve this group of innovators more effectively, IonQ is further democratizing access to its industry-leading hardware by providing users with the ability to submit quantum programs using its hardware-native gate format. Developers can now specify precisely what is happening to every qubit throughout their entire algorithm, improving overall usefulness through new error mitigation or post-processing techniques. The feature is now available via IonQ's direct API, Google Cloud Marketplace integration, and a variety of open-source tools such as Qiskit, Cirq, PennyLane and others.
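For a sense of what programming in native gates looks like, here is a minimal sketch using the qiskit-ionq provider's documented native-gate mode; the API token, backend name, gate angles and shot count are assumptions for illustration, and the exact interface may differ between provider versions.

```python
# Minimal sketch: building a two-qubit circuit from IonQ's native gates
# (GPI, GPI2, MS) and submitting it through the qiskit-ionq provider.
# Token, angles and shot count are placeholders, not working values.
from qiskit import QuantumCircuit
from qiskit_ionq import IonQProvider, GPIGate, GPI2Gate, MSGate

provider = IonQProvider("MY_API_TOKEN")  # hypothetical token
backend = provider.get_backend("ionq_simulator", gateset="native")

qc = QuantumCircuit(2, 2)
qc.append(GPI2Gate(0.0), [0])        # single-qubit pi/2 rotation, phase in turns
qc.append(MSGate(0.0, 0.0), [0, 1])  # two-qubit Molmer-Sorensen entangling gate
qc.append(GPIGate(0.5), [1])         # single-qubit pi rotation, phase 0.5 turns
qc.measure([0, 1], [0, 1])

job = backend.run(qc, shots=1000)
print(job.result().get_counts())
```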

"Researchers, academics, developers, and other tinkerers like to be as close to the metal as possible when designing quantum experiments that can surpass today's benchmarks; they want to be able to play at every layer of the stack to extract as much performance and novel insight as possible from these systems," said Nathan Shammah, from Unitary Fund, the nonprofit organization developing Mitiq, the first open-source software for quantum error mitigation. "IonQ providing a native gate interface across several open-source tools further opens access and paves the way for the open source community to allow for further control and to improve performance in quantum computing software."

"By providing the open source community with greater access to IonQ's quantum hardware through native gates, we are doubling down on our commitment to provide researchers with the tools needed to experiment with quantum computers in the way they best see fit," said Jungsang Kim, Co-Founder and CTO at IonQ. "We believe that quantum's true potential will only be realized by those willing to push the boundaries of what's possible, and IonQ's industry-leading hardware is designed to provide the ideal platform to build on top of and seek out solutions for the world's most complex problems."

Today's news is the latest in a series of announcements by IonQ designed to push accessibility of quantum systems forward. In March, IonQ unveiled an industry-standard #AQ performance benchmark set to evaluate the quality of results output from a quantum computer. Additionally, IonQ announced in February the development of the N-qubit Toffoli gate alongside Duke University, introducing a new way to operate on many connected qubits at once by leveraging multi-qubit communication. More recently, IonQ announced the extension of its commercial partnership with Hyundai Motors to use quantum machine learning to improve the computation process for tasks like road sign image classification and simulation in a real-world test environment.

About IonQ

IonQ, Inc. is a leader in quantum computing, with a proven track record of innovation and deployment. IonQ's latest generation quantum computer, IonQ Aria, is the world's most powerful quantum computer, and IonQ has defined what it believes is the best path forward to scale. IonQ is the only company with its quantum systems available through the cloud on Amazon Braket, Microsoft Azure, and Google Cloud, as well as through direct API access. IonQ was founded in 2015 by Christopher Monroe and Jungsang Kim based on 25 years of pioneering research. To learn more, visit http://www.ionq.com.

IonQ Forward-Looking Statements

This press release contains certain forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. Some of the forward-looking statements can be identified by the use of forward-looking words. Statements that are not historical in nature, including the words "anticipate," "expect," "suggests," "plan," "believe," "intend," "estimates," "targets," "projects," "should," "could," "would," "may," "will," "forecast" and other similar expressions are intended to identify forward-looking statements. These statements include those related to the anticipated benefits of native gate access; IonQ's ability to further develop and advance its quantum computers and achieve scale; IonQ's market opportunity and anticipated growth; and the commercial benefits to customers of using quantum computing solutions. Forward-looking statements are predictions, projections and other statements about future events that are based on current expectations and assumptions and, as a result, are subject to risks and uncertainties. Many factors could cause actual future events to differ materially from the forward-looking statements in this press release, including but not limited to: market adoption of quantum computing solutions and IonQ's products, services and solutions; the ability of IonQ to protect its intellectual property; changes in the competitive industries in which IonQ operates; changes in laws and regulations affecting IonQ's business; IonQ's ability to implement its business plans, forecasts and other expectations, and identify and realize additional partnerships and opportunities; and the risk of downturns in the market and the technology industry including, but not limited to, as a result of the COVID-19 pandemic. The foregoing list of factors is not exhaustive. You should carefully consider the foregoing factors and the other risks and uncertainties described in the "Risk Factors" section of IonQ's Quarterly Report on Form 10-Q for the fiscal quarter ended March 31, 2022 and other documents filed by IonQ from time to time with the Securities and Exchange Commission. These filings identify and address other important risks and uncertainties that could cause actual events and results to differ materially from those contained in the forward-looking statements. Forward-looking statements speak only as of the date they are made. Readers are cautioned not to put undue reliance on forward-looking statements, and IonQ assumes no obligation and does not intend to update or revise these forward-looking statements, whether as a result of new information, future events, or otherwise. IonQ does not give any assurance that it will achieve its expectations.


View source version on businesswire.com: https://www.businesswire.com/news/home/20220512005806/en/

IonQ Media contact: Dillon Olagaray, Mission North, [emailprotected]

IonQ Investor Contact: [emailprotected]

Source: IonQ

Original post:
IonQ Launches Native Gate Access, Extends Open-Source Capabilities for Researchers and Developers - StreetInsider.com
