
Making the Right Cloud Security Investments – Security Boulevard

With more remote workers, there is a greater need for cloud computing services. With more cloud computing, there is a greater need for cloud security. An Exabeam study found that companies are moving their security tools to the cloud, but that raises the question: Are they the right tools for cloud security? Or are companies under-investing in their cloud security systems?

"Many organizations waste billions of dollars on cybersecurity each year. This is due to a combined lack of strategic planning from leadership and an ongoing shortage of security talent," said Matthew Rogers, CISO at Syntax, in an email interview. "However, investing in security products without knowledge of how to utilize them provides very little value and results in wasted budgets."

You can't secure what you don't know about. Your cloud environment will have different security challenges than your on-premises network. "Because of the move to remote work, the attack surface has expanded significantly," Rogers pointed out, and an increased reliance on mobile and IoT devices has also increased the number of entrance points for cybercriminals.

"Moving a high-risk internal asset that previously had only been exposed to a few hundred devices to the cloud now exposes it to billions of devices, greatly magnifying the company's security risk," said Rogers.

Beyond the larger attack surface, Vishal Jain, co-founder and CTO at Valtix, said there are three areas of urgent concern:

While many organizations are actively looking to consolidate their security tools, they still need to pick solutions that operate with cloud awareness.

"The trend on the operational side is towards service-based tools like cloud security posture management (CSPM) for compliance, and network security-as-a-service (SaaS) for runtime protections," explained Jain in an email interview. He said these security services are winning out over legacy firewalls since they match cloud-native design patterns, with API-based integrations into modern services like Datadog for monitoring, Twilio for messaging/alerts and Slack for DevOps integration. They also provide relevant cloud-specific information to SOC and incident response (IR) teams.
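To make the idea of API-based integration concrete, here is a minimal sketch in Python of forwarding a security finding to a Slack channel through an incoming webhook. The webhook URL and the example finding are placeholders, and this is a generic illustration rather than any particular vendor's integration.

```python
import requests

# Placeholder URL -- a real incoming-webhook URL is generated in Slack's app settings.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXXXXXX"

def post_alert(finding: str) -> None:
    """Forward a security finding to a Slack channel via an incoming webhook."""
    resp = requests.post(SLACK_WEBHOOK_URL, json={"text": finding}, timeout=10)
    resp.raise_for_status()  # surface HTTP errors instead of failing silently

# Hypothetical CSPM finding, for illustration only.
post_alert("CSPM alert: storage bucket with public read access detected in production account")
```

The same pattern extends to Datadog or a SOAR platform: the policy-enforcement tool emits a structured finding, and a small amount of glue code delivers it to whichever service the SOC team already watches.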

Also, he added, security orchestration and automated response (SOAR) tools are getting better with plugin integrations, but these can't be effective if the traditional policy enforcement tools are not providing relevant contextual data. Yet, there are still a lot of people who think that the best way to approach security problems is to throw money at it, buying the most expensive or comprehensive security solutions without ever looking to see if it is the right security tool for them and their cloud operation.

"IT leaders investing in cloud security systems need a plan for execution in place to see any return from the investment," said Rogers. Organizations must train their employees on remaining secure, especially while working remotely, as this lack of understanding of the technology only further wastes the company's investment in security.

Because there are so many complex tools and such a broad lack of understanding, organizations often fail to implement their cloud security plans successfully. Rogers advised organizations to take these steps to ensure optimal cloud security while still efficiently allocating their budget:

Companies are in a massive cloud-driven shift that's changing everything from app development to deployment and operations. IT and security teams must have solutions that meet their needs but that are also the right investment to protect their assets in the cloud.


Cloud Computing in Healthcare Market Size 2021 By Global Business Trends, Share, Future Demand, Progress Insight, Modest Analysis, Statistics,…

The Global Cloud Computing in Healthcare Market research report contains an in-depth analysis of this market and outlines its key players. All the leading companies engaged in the Cloud Computing in Healthcare market are examined. The report offers a comprehensive perspective of the market, which can help in making the right choices for its development, and provides essential information such as the CAGR value and a SWOT analysis for the forecast period.

The worldwide Cloud Computing in Healthcare market research report delivers a comprehensive analysis of the newest market drivers. The competitive structure is explained, covering development activities related to products, advancements and technologies, and a SWOT analysis is also explored in this report. This information will help businesses and clients penetrate or expand in the market. Additionally, the report analyzes the worldwide market by production, benefits, usage, sales, import & export, market share, and growth rate over the projection period 2020-2025.

Request for Free Sample Report @ https://www.adroitmarketresearch.com/contacts/request-sample/1091?utm_source=Bh

The study outlines the constraints, limitations, opportunities and challenges facing the Cloud Computing in Healthcare market, as well as its historical data and current and future momentum. The report also offers extracts regarding statistics, market valuation and revenue estimates, which further strengthen its status in the competitive spectrum, and covers the growth trends embraced by leading manufacturers in the business.

Key players in the global Cloud Computing in Healthcare market covered:

McKesson Corporation, Allscripts, NextGen Healthcare, Epic Systems Corporation, Healthcare Management System, eClinicalWorks, CPSI, Computer Sciences Corporation, and many more.

Browse the complete report Along with TOC @ https://www.adroitmarketresearch.com/industry-reports/cloud-computing-in-healthcare-market?utm_source=Bh

The objective of the report is to present a comprehensive assessment of the market; it contains thoughtful insights, facts, historical data, industry-validated market data, and projections with a suitable set of assumptions and methodology. The report also helps in understanding Cloud Computing in Healthcare market dynamics and structure by identifying and analyzing the market segments, and projects the global market size. Further, the report focuses on a competitive analysis of key players by product, price, financial position, product portfolio, growth strategies, and regional presence.

Cloud Computing in Healthcare Market Segmentation

Type Analysis of Cloud Computing in Healthcare Market:

by End Use (Hospitals, Diagnostics and Imaging Centres, Ambulatory Centres, and Others)

On a regional basis, the market is categorized into five regions: North America, Latin America, Middle East & Africa, Asia Pacific, and Europe. The report also demonstrates the impact of Porter's Five Forces on the global Cloud Computing in Healthcare market and covers important Cloud Computing in Healthcare market data in the form of tables.

The report also provides PEST, Porter's Five Forces and SWOT analyses to help shareholders prioritize their efforts and investment in emerging segments of the Cloud Computing in Healthcare market in the near future. The report helps readers understand the development factors, industry plans, approaches, and advancement procedures actualized by key market players, and has been prepared with the client's perspective in mind.

Other important objectives in the Cloud Computing in Healthcare market report:

The research document also divides the Cloud Computing in Healthcare market based on application scope. Key players in the Cloud Computing in Healthcare market are profiled, and insights about the competitive dynamics, along with an analytical review of the industry supply, are provided. The report uncovers the production patterns and remuneration of each company across their territories. It further includes Porter's Five Forces and SWOT analyses to evaluate the feasibility of a new project. In-depth company profiles, along with remuneration, pricing models, gross margins, and all other financial aspects, are given as well. Substantial information concerning the production pattern, growth rate, and market share of each product type over the analysis period is underlined. The market share of each application, together with its growth rate, is listed. Revenue share and sales volume estimates of each product type are validated in the report.

Key Highlights

Reasons to Buy

To understand the most significant driving and restraining forces in the Cloud Computing in Healthcare market and their impact on the global market. To gain insightful analyses of the Cloud Computing in Healthcare market for 2020-2025 and a comprehensive understanding of the global market and its commercial landscape. To understand the future outlook and prospects for the Cloud Computing in Healthcare market. To learn about the market policies being adopted by prominent organizations. To assess the production processes, major issues, and solutions to mitigate development risk.

What to Expect from the Report:

The report is designed to mirror real-time market developments, featuring concrete references to the multiple drivers, opportunities and challenging implications that are widely prevalent across the market space and interfere with normal growth. The report is a meticulous outlook on the current scenario, expanding further to explore future probabilities compiled after elaborate research practices and surveys. Emphasis on lingering barriers in the market space has been placed specifically to understand stagnancy patterns in the Cloud Computing in Healthcare market.

Besides requisite information highlighting industry vendors and regional developments, the report further emphasizes developments in the product and application segments. Segment potential has been thoroughly assessed to derive logical deductions favoring high-revenue business strategies.

The main questions answered in the report are:

What are the key drivers for the global Cloud Computing in Healthcare market? What are the main findings of the five forces analysis of the global Cloud Computing in Healthcare market? What are the challenges for market growth? What are the major market trends influencing the growth of the global Cloud Computing in Healthcare market? Who are the key vendors in the global Cloud Computing in Healthcare market? What market opportunities and threats are vendors facing in the global Cloud Computing in Healthcare market?

If you have any questions on this report, please reach out to us @ https://www.adroitmarketresearch.com/contacts/enquiry-before-buying/1091?utm_source=Bh

About Us :

Adroit Market Research is an India-based business analytics and consulting company incorporated in 2018. Our target audience is a wide range of corporations, manufacturing companies, product/technology development institutions and industry associations that require understanding of a market's size, key trends, participants and future outlook of an industry. We intend to become our clients' knowledge partner and provide them with valuable market insights to help create opportunities that increase their revenues. We follow a code: Explore, Learn and Transform. At our core, we are curious people who love to identify and understand industry patterns, create an insightful study around our findings and churn out money-making roadmaps.

Contact Us :

Ryan Johnson
Account Manager Global
3131 McKinney Ave Ste 600, Dallas, TX 75204, U.S.A.
Phone No.: USA: +1 210-667-2421 / +91 9665341414


Emerging: Edge Cloud Architecture Continues to Shake Out – IoT World Today

Edge cloud architecture is going to bring about new capabilities. But as data-intensive functionality comes together at the edge, technologies need to develop, then converge, first.

As video streaming, Alexa-type digital assistants and self-driving cars continue to permeate daily life, edge computing architecture has become foundational to enable these tasks.

These data-intensive processes are fueled by a proliferation of Internet of Things (IoT) devices.

According to Statista, there will be 30.9 billion IoT devices by 2025. These devices are becoming increasingly intelligent as well, with more analytics and decision-making capabilities at the device level.

"There are more and more devices that need intelligent capabilities, especially to process AI at the edge," said Aditya Kaul, research director at Omdia.

These devices may be sensors at oil rigs, connected heart monitors or components in a self-driving car.

But edge devices and the tasks they enable, such as autonomous driving and real-time video surveillance, create massive amounts of data, requiring compute power, bandwidth and memory close to the users and devices that need these resources.

Consider a self-driving car that can traverse a street, sensing the speed of and distance from the cars surrounding it. These kinds of processes take many sensors gathering myriad data points that must be processed in the moment; the data cannot travel back to a centralized cloud, as the round trip to the cloud and back creates too much delay for processes that need fraction-of-a-second reaction times. Autonomous vehicles can't wait the extra fractions of a second for data to travel back and forth to the cloud.
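A back-of-the-envelope calculation shows why the round trip matters; the speed and latency figures below are illustrative assumptions, not numbers from the article.

```latex
v = 30\ \tfrac{\text{m}}{\text{s}} \ (\approx 108\ \text{km/h}), \qquad
t_{\text{round trip}} \approx 100\ \text{ms}
\quad\Longrightarrow\quad
d = v\,t \approx 3\ \text{m}
```

In other words, a car at highway speed travels roughly three metres before a cloud-hosted decision could even arrive, which is why the processing has to happen on or near the vehicle.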

For the past decade, many data-intensive computing processes have required a centralized architecture, such as cloud computing, to provide the firepower to complete these tasks.

Today, most IT market watchers say that the next era of computing lies at the distributed edge, bringing compute, storage and network bandwidth closer to the devices that need those resources.

"Keeping AI processing on the edge device circumvents privacy concerns while avoiding the bandwidth, latency, and cost concerns of cloud computing," wrote Omdia in the report "Artificial Intelligence for Edge Devices."

These cost and performance gains have made edge computing architecture the next big trend for AI.

"The edge is becoming to the 2020s what the cloud was to the 2010s: the strategic focus of competition," wrote Omdia in the report "Edge Computing, 5G, and AI: A New Competitive Space."

Indeed, interest in edge computing has reached critical mass and beyond. According to a recent IBM survey, 91% of respondents expect to deploy edge computing.

IoT Gets Boost From Edge Cloud

Many industry watchers focus on the new intelligent hardware that edge architecture enables: semiconductor chips from the likes of NVIDIA and Intel that bring more intelligence and analytics to the edge, enabling such processes as facial recognition, digital assistants like Alexa, autonomous driving and more.

But this new competitive terrain does not just reside at the device level, Kaul said, but rather is enabled at the edge. An edge cloud provides the servers, software and networking to enable more data-driven decision-making in real time.

"It's very much a hot topic," Kaul said. "Most of the AI edge discussion has been about the device. With edge cloud, we're saying that intelligence will move from the device to a server or appliance close to the device."


Kaul said that there is no single configuration or vendor yet that defines an edge cloud. Today, there are several architectures that support edge processes.

"There is a lot of discussion about how far away that server should be from the device, what should the latency be of that round-trip time, what is the configuration of those servers? Is it a leaner, more power-efficient server device?" Kaul said. "There are a lot of questions, and there are no good answers today."

Edge cloud infrastructure can take many forms. An edge cloud may take the form of a series of mini-data centers for a telecom provider, with numerous compact servers residing close by. But an edge cloud could also consist of a rack of servers that sits in a closet at a Walmart.

There are a range of possible configurations for edge cloud architecture, Kaul said, and there is no clear choice in terms of vendors or configuration.

Edge Cloud and 5G

New connectivity options, such as 5G wireless connectivity, offer greater capabilities at the edge and for edge clouds. 5G promises up to 100 times greater upload and download speeds, as well as network performance enhancements that enable the kinds of data-intensive processes the edge requires.

Experts say that in the future 5G connectivity will be a clear enabler of edge cloud processes.

"The edge data center is clearly tied to 5G infrastructure rollout. It offers much lower latency to the edge device, and that gives you additional capabilities, so you don't have to send everything back to a centralized data center," Kaul said.

Admittedly, though, 5G connectivity is still nascent, as many regions of the U.S. and the globe still lack fiber lines and infrastructure to accommodate 5G.

Still, while 5G rollout is incremental, the digital imperatives of COVID-19 may accelerate this 5G reality. The need for connectivity for remote digital health, for remote working capability and for new mobile capabilities in the field indicates that 5G connectivity, edge architecture and even AI will ultimately enable one another.

"5G will not only be the protocol that launches a thousand new high-throughput applications and use cases like AVs, it will also drive important new low-bandwidth applications. Put another way, because of the new frequencies opened up by 5G, there will be new opportunities for innovation up and down the spectrum," wrote Ben Yu in VentureBeat.

"Overall, edge cloud is the big trend, and we will see more in 2021," Kaul said.


Hybrid Cloud and Hyperconverged Infrastructure (HCI) – Datamation

Please stay tuned for the recording; the podcast and video will be posted by Tuesday, March 2.

Early in the evolution of cloud computing (say, about 2012), many pundits said the public cloud would make data centers obsolete. Yet here in 2021, data centers remain very much alive, with the hybrid cloud providing the crucial link from the legacy world of hulking data centers to the next-gen environment of hyperscale cloud.

Indeed, hybrid cloud has earned a place as a default enterprise technology. This is true despite the fact that Gartner predicts that by 2025, 80% of enterprises will shut down their traditional data centers.

A key technological linchpin in hybrid cloud is hyperconverged infrastructure, or HCI. Once delivered mostly as hardware, HCI software is today undergoing rapid adoption. Software-defined and benefiting from API technology, HCI allows sophisticated and flexible cloud management.

To explore the rapid growth of hybrid cloud and HCI, I will speak with Wendy M. Pfeiffer, CIO of Nutanix. Among the questions we'll discuss:


How colocation fits alongside a cloud-native architecture – ComputerWeekly.com

Technology will have an important role to play in supporting business growth opportunities as organisations begin to claw back the ground they lost during the coronavirus crisis.

It is likely that many technology-powered growth initiatives will rely on the flexibility of cloud computing, to enable organisations to deliver new products and services quickly, while maintaining a controllable and agile cost model that allows them to expand quickly as and when they need to.

But some applications, particularly those that have been designed as single-tenanted, cannot shift easily to the cloud. They will require major reworking, which is the main reason organisations continue to bear the costs associated with running their own datacentre facilities.

Colocation datacentre providers have carved a niche, helping organisations to lower the overheads associated with running their own datacentres. Multiple companies can rent server space in large facilities that offer all the necessary power, cooling and networking required to run their customers' IT equipment.

Discussing how customer requirements for colocation are shifting, Jeff DeVerter, chief technology officer at Rackspace, says: "A big change we've observed in colocation is the proliferation of smaller, closer-to-customer locations. We continue to see growth in private cloud deployments. They are being used in conjunction with public cloud resources or datasets, creating a multicloud architecture. The lines between public and private are starting to become less visible, which is a trend we expect to continue."

Deloitte's latest technology, media and telecommunications predictions for 2021 estimate that cloud revenues will increase by 30% or more between 2021 and 2025. The coronavirus pandemic has forced businesses to look at new ways to grow.

As demand for public cloud services grows, colocation providers have carved a new niche for themselves as hybrid cloud specialists. They have needed to adapt due to shifts in the way enterprises want to deploy core business applications.

A Deloitte global survey of 50 CIOs, undertaken in April 2020, found that the proportion of total workloads done on-premise will fall to 35% in 2021 from 59% in 2019. According to the Deloitte study, CIOs expect public cloud's share to grow from about a quarter to over a third (23% to 38%), with private cloud reaching 20% and hybrid cloud accounting for 7% of workload.

As a result, along with providing datacentre hosting for customers own systems, some colocation providers have expanded into managed services, where virtualised workloads are isolated in multi-tenanted server environments. While the public cloud is largely recognised as being the more secure and cheaper option for most enterprises, IT leaders are generally risk-averse and try to avoid being locked into the technical infrastructure of a single hyperscale provider.

This has led to an opportunity for colocation providers to evolve and rebrand themselves as multicloud and hybrid cloud providers, offering fast data connectivity to public clouds such as Amazon Web Services (AWS) and Microsoft Azure.

One of the first things that IT decision-makers need to work out is whether workloads are cheaper to run in the public cloud.

Some workloads are not so well suited to public cloud deployment. An application that does not require the elasticity needed to cope with peak usage can be more cost-effectively deployed in an environment that does not charge based on use. With public cloud infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) offerings, organisations are billed continuously as consumption occurs, instead of making a one-time payment when they procure their datacentre capacity.
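A simplified comparison illustrates the trade-off; the hourly rate and fixed-capacity cost below are hypothetical figures chosen only to show the arithmetic, not prices from any provider.

```latex
\begin{aligned}
\text{Steady 24/7 workload: } & 8760\ \text{h} \times \$0.10/\text{h} = \$876 \text{ pay-as-you-go, vs. } \$500 \text{ for fixed capacity;}\\
\text{Spiky workload (10\% duty): } & 876\ \text{h} \times \$0.10/\text{h} \approx \$88 \text{ pay-as-you-go, vs. the same } \$500.
\end{aligned}
```

Under these assumed numbers, the always-on workload is cheaper on owned or reserved capacity, while the bursty one is cheaper on consumption-based billing, which is exactly the calculation the preceding paragraph says IT decision-makers need to work through.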

As analyst Gartner points out in a recent report, in cloud computing, organisations are confronted with the difficulty of creating accurate cost estimates. They are often hit by bills that they apparently can't explain and struggle to identify the items responsible for spending. "As a result, financial management is often overlooked until spending is out of control," warn Gartner senior director analysts Marco Meinardi and Traverse Clayton.

For multinational organisations and those companies with large branch networks, architecturally, the public cloud offers an elegant way to deploy a manageable IT environment. But it is not perfect.

Organisations need to consider latency in terms of how quickly data can be processed to deliver business insights. This is particularly relevant when data needs processing close to where it is generated or there are regulations that prevent data from being processed outside of the country where it is generated.

Edge devices in an industrial context, such as a wind farm, may each offer local processing in real time. Trend analysis may also need to be performed rapidly to manage unforeseen events, adapt to environmental changes and keep production optimal. To avoid latency and backhaul network congestion, it is largely accepted that such data processing needs to be run near the industrial facility, which has led some colocation providers to specialise in supporting edge computing. More advanced processing, such as predictive analytics, may require data to be uploaded to the public cloud, which usually involves connectivity to the public cloud providers.

In its "Datacentre and colocation market trends 2021" report, analyst Forrester describes the concept of "data gravity," which introduces strategic decision points on expansion plans based on where data will be in the future. "Careless enterprises can create data silos and make migrations, network access, and contracts costly," warns analyst Abhijit Sunil in the report.

Forrester urges IT leaders to consider a hybrid cloud model for deploying workloads. Here, critical applications are housed in a colocation datacentre and can have direct access to the cloud through gateways in the same datacentre.

According to ISG (Information Services Group), enterprises are embracing colocation in the datacentre market as a way to enable multicloud strategies and deal with concerns about data sovereignty, security and privacy regulations.

"In an effort to save valuable time, money and space, many large enterprises will look to move in-house IT operations to managed colocation facilities or sell their datacentres and lease the space they need to operate," says Barry Matthews, partner and leader at ISG North Europe.

The 2020 ISG provider lens next-gen private/hybrid cloud datacentre services & solutions report for the UK notes that many enterprises in the country focus on colocation because it allows them to locate and manage data closer to their cloud, network and security functions. ISG reports that UK enterprises are increasingly viewing colocation providers as an extension of their business, with providers offering services such as tracking provisioning status, interacting with customer support and monitoring system health in real time.

In spite of Deloitte's predictions of massive growth in public cloud services, ISG's research finds that about 60% of UK enterprise workloads still reside on-premise, many of them in private datacentres operated by internal staff.

Among the reasons why businesses are choosing colocation, according to ISG, are the need for auditing data location, migrating software licences to the public cloud, hyper-converged systems capacity and affordability, and improved management tools.

With the UK facing the dual challenges of Brexit and the pandemic, the report sees UK enterprises moving cautiously with new IT projects. Due to Brexit, companies are not circulating large requests for proposals and requests for information, but at the same time, many are concerned about a potential shortage of niche tech skills. The perceived skills gap, along with continuous demand for innovation, could eventually lead to more IT outsourcing deals.

This skills gap was highlighted in a recent survey of 1,870 IT decision-makers for Rackspace Technologies. The survey revealed that more than a third (35%) of artificial intelligence (AI) research and development initiatives in the UK either fail or are abandoned, with a large proportion (48%) therefore outsourcing the tech support to trusted external partners due to a lack of internal resources.

The technical know-how to deploy hardware and software infrastructure for AI also exists in the area of high-performance computing (HPC). In Forrester's report, analyst Sunil notes that almost all major IT providers now offer HPC infrastructure capabilities in varying degrees.

"HPC workloads impose intensive physical requirements that include purpose-built, high-density datacentres that vendors provide directly or through partnerships with cloud companies," says Sunil. "For instance, Cyxtera, Digital Realty and Equinix offer Nvidia DGX-ready datacentres for supporting AI-based deep learning applications."

It is largely recognised that hiring new datacentre staff is extremely difficult. The Uptime Institute recently reported that across Europe, the Middle East and Africa (EMEA), 81,500 new roles will be created for datacentre staff by 2025. This suggests a growing skills crisis, which the Uptime Institute believes will further drive enterprises to outsource at least part of their datacentre computing to public cloud or colocation providers.

While IT leaders are cognisant of the skills shortage in hot technology areas such as HPC and AI, the Uptime Institute's research shows that the skills crisis envelops all areas of the datacentre. Enterprise facilities are currently the most numerous type of datacentre and are typically smaller than many colocation and almost all cloud facilities. According to the Uptime Institute, this often means there are fewer opportunities for economies of scale, including for staff. For example, job roles such as IT hardware technicians and electricians are required in all datacentres, regardless of size.

Last year, in its Multi-tenant datacentre and services industry 2020 report, 451 Group described colocation as the obvious vehicle for connecting enterprises, service providers and cloud platforms.

UK enterprises frequently want to use more than one hyperscaler because each one has particular strengths related to vertical solutions, pricing and other factors. But as ISG points out, enterprises see some barriers to a multicloud setup, including orchestrating their workloads. This is one of the trends driving the expansion of colocation services. ISGs research found that many customers are turning to service providers to help them manage multicloud environments.


Akka, the Leading Framework for Building Cloud-Native Applications, is Now Available as a Cloud Service on AWS – GlobeNewswire

SAN FRANCISCO, Feb. 23, 2021 (GLOBE NEWSWIRE) -- Lightbend today announced the introduction of its newest offering, Akka Cloud Platform. Available now on the AWS Marketplace, Akka Cloud Platform makes it simple for enterprise development teams to quickly build and deploy cloud-native microservices on AWS and easily integrate with other AWS and third-party services such as Kafka and Cassandra. Akka Cloud Platform opens the door to more businesses in pursuit of digital transformation initiatives such as IoT, real-time financial processes and modern e-commerce where massive scale, high resiliency and low latency are crucial to success.

Akka Cloud Platform is built on Lightbend's powerful Akka Platform technology, the leading framework for building distributed applications, marking the availability of Akka for the first time on a cloud service marketplace. With more than 10 million downloads, Akka is the most used programming model for cloud-native applications running on containers on Kubernetes. Enterprises such as Starbucks, Verizon, Tesla and PayPal use Akka to develop microservices-based applications optimized for the distributed cloud computing requirements of unlimited scale and infallible resiliency to power their most mission-critical business applications.

"Akka gives developers the building blocks they need to develop cloud native applications without having to worry about the challenges of designing for a massively scalable environment," said Jonas Bonér, the creator of Akka and CTO at Lightbend. "Cloud native is about much more than just the infrastructure layer. It's also about the application architecture that handles things like performance, intelligent scaling, service failure, and streaming data pipelines. Now any developer using AWS can easily leverage the same powerful distributed computing technology that powers many of the world's most innovative companies."

Akka Cloud Platform for AWS optimizes the capabilities of Akka into the AWS ecosystem, including improved deployment in Amazon EKS. Telemetry and observability provide advanced system monitoring and management, and there are simple integration paths for relevant AWS services. Billing is managed through AWS. Akka Cloud Platform will be available on additional cloud service providers throughout the year.

Learn more about Akka Cloud Platform.

About Lightbend
Lightbend (@Lightbend) is leading the enterprise transformation toward real-time, cloud-native applications and setting the standard for cloud-native architectures. Lightbend provides scalable, high-performance microservices frameworks and streaming engines for building data-centric systems that are optimized to run on cloud-native infrastructure like Red Hat OpenShift. Lightbend powers the world's most innovative companies. For more information, visit http://www.lightbend.com.

Editorial Contact
Joanne Harris
+1 425 295 1373
joanne.harris@lightbend.com


IBM adds 10 historically Black colleges and universities to quantum computing center – TechRepublic

The IBM-HBCU Quantum Center is a research network and a hands-on learning program.

The IBM-HBCU Quantum Center announced on Monday that it is adding 10 historically Black colleges and universities to the center's 13 founding institutions. The center was launched last fall with the goal of advancing quantum information science and expanding science and technology opportunities to a broader group of students.

Kayla Lee, PhD, growth product manager for community partnerships at IBM Quantum and Qiskit, said she anticipates that new career paths such as quantum developer will become more defined as the field continues to evolve over the next few years.

"I hope that the IBM-HBCU Quantum Center accomplishes two things: inspires people to consider careers in quantum computing and provides additional support for students and faculty as they explore various research topics in quantum computing," she said. "I hope that our students participating in the center are more than equipped to thrive in this emerging industry."

The new schools joining the center are:

This multiyear investment connects researchers and students across a network of HBCUs. The program provides schools with access to IBM quantum computers via the cloud, educational support for students learning to use the Qiskit open source software development framework, and funding for undergraduate and graduate research.
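As a hedged illustration of the kind of first exercise such a curriculum might include, the following Qiskit snippet builds and simulates a two-qubit Bell-state circuit. It targets the Qiskit API of that era (circa 2021), assumes the Aer simulator component is installed, and the backend name and shot count are arbitrary choices rather than details from the program described here.

```python
from qiskit import QuantumCircuit, Aer, execute

# Two-qubit Bell-state circuit: Hadamard on qubit 0, then CNOT from qubit 0 to qubit 1.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# Run on the local simulator; on IBM hardware the backend would come from an IBMQ provider instead.
backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)  # roughly half '00' and half '11', reflecting the entangled state
```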

SEE: Quantum computing: A cheat sheet (TechRepublic)

One of the initiative's goals is to create a more diverse quantum-ready workforce from students across multiple disciplines including physics, chemistry, computer science and business.

Researchers from the HBCUs are also on the center's board, including Howard University associate professor of physics Thomas Searles; Serena Eley, an assistant professor of physics at the Colorado School of Mines and head of the Eley Quantum Materials Group; and Anderson Sunda-Meya, an associate professor of physics at Xavier University of Louisiana.

Since opening last fall, the center has hosted a community hack-a-thon and contributed to a pre-print on arXiv that investigates the use of machine learning and quantum computing to better understand unknown quantum systems. arXiv is a free distribution service and an open-access archive for scholarly articles in the fields of physics, mathematics, computer science, quantitative biology, quantitative finance, statistics, electrical engineering and systems science, and economics.

IBM is measuring the impact of the center by tracking student engagement, talent and workforce development, and research capacity. The center also plans to look for ways to support professors and students as they map out career plans that have a long-term impact on quantum computing.

SEE: To do in 2021: Get up to speed with quantum computing 101 (TechRepublic)

JPMorgan Chase also is building a pipeline of people with quantum computing experience. The banking company was one of the early customers for IBM's quantum computer and is planning a Quantum Computing Summer Associates program for 2021.

The quantum industry is supporting several initiatives to expand educational opportunities. The European Organization for Nuclear Research recently offered a series of free webinars about quantum computing. The course covers the basic concepts of the quantum circuit model, including qubits, gates, and measures, as well as quantum algorithms and protocols. Q-CTRL recently hired quantum physics professor Chris Ferrie as a quantum education adviser. Q-CTRL specializes in controls for quantum computing.



bp joins the IBM Quantum Network to advance use of quantum computing in energy – Green Car Congress

IBM announced that bp has joined the IBM Quantum Network to advance the use of quantum computing in the energy industry. IBM Quantum is an industry-first initiative to build universal quantum systems for business and science applications.

By joining the IBM Quantum Network as an Industry Partner, bp will have access to IBM's quantum expertise and software, along with cloud-based access to the most advanced quantum computers available. This includes access to a premium 65-qubit quantum computer, the largest universal quantum system available to industry today, and an important milestone on the IBM Quantum roadmap to a 1,000-plus qubit system (IBM Quantum Condor), targeted for the end of 2023.

ExxonMobil and Daimler are also IBM Quantum Network Industry Partners.

bp will work with IBM to explore using quantum computing to solve business and engineering challenges and explore the potential applications for driving efficiencies and reducing carbon emissions.

"bp's ambition is to become a net zero company by 2050 or sooner and help the world get to net zero. Next-generation computing capabilities such as quantum computing will assist in solving the science and engineering challenges we will face, enabling us to reimagine energy and design new lower carbon products."

Morag Watson, senior vice president, digital science and engineering for bp

Quantum computing has the potential to be applied in areas such as: modeling the chemistry and build-up of various types of clay in hydrocarbon wells (a crucial factor in efficient hydrocarbon production); analyzing and managing the fluid dynamics of wind farms; optimizing autonomous robotic facility inspection; and helping create opportunities not yet imagined to deliver the clean energy the world wants and needs.

In 2020, bp announced its net zero ambition and its new strategy. By the end of this decade, it aims to have developed around 50 gigawatts of net renewable-generating capacity (a 20-fold increase), increased annual low carbon investment 10-fold to around $5 billion and cut its oil and gas production by 40%.

Joining the IBM Quantum Network will enhance bp's ability to leverage quantum advances and applications as they emerge, and then influence how those breakthroughs can be applied to its industry and the energy transition.

"bp joins a rapidly growing number of clients working with IBM to explore quantum computing to help accelerate the discovery of solutions to some of today's biggest challenges. The energy industry is ripe with opportunities to see value from the use of quantum computing through the discovery of new materials designed to improve the generation, transfer, and storage of energy."

Dario Gil, Senior Vice President and Director of IBM Research


How researchers are mapping the future of quantum computing, using the tech of today – GeekWire

Pacific Northwest National Laboratory computer scientist Sriram Krishnamoorthy. (PNNL Photo)

Imagine a future where new therapeutic drugs are designed far faster and at a fraction of the cost they are today, enabled by the rapidly developing field of quantum computing.

The transformation on healthcare and personalized medicine would be tremendous, yet these are hardly the only fields this novel form of computing could revolutionize. From cryptography to supply-chain optimization to advances in solid-state physics, the coming era of quantum computers could bring about enormous changes, assuming its potential can be fully realized.

Yet many hurdles still need to be overcome before all of this can happen. This is one of the reasons the Pacific Northwest National Laboratory and Microsoft have teamed up to advance this nascent field.

The developer of the Q# programming language, Microsoft Quantum recently announced the creation of an intermediate bridge that will allow Q# and other languages to be used to send instructions to different quantum hardware platforms. This includes the simulations being performed on PNNL's own powerful supercomputers, which are used to test the quantum algorithms that could one day run on those platforms. While scalable quantum computing is still years away, these simulations make it possible to design and test many of the approaches that will eventually be used.

"We have extensive experience in terms of parallel programming for supercomputers," said PNNL computer scientist Sriram Krishnamoorthy. "The question was, how do you use these classical supercomputers to understand how a quantum algorithm and quantum architectures would behave while we build these systems?"

That's an important question given that classical and quantum computing are so extremely different from each other. Quantum computing isn't Classical Computing 2.0. A quantum computer is no more an improved version of a classical computer than a lightbulb is a better version of a candle. While you might use one to simulate the other, that simulation will never be perfect because they're such fundamentally different technologies.

Classical computing is based on bits, pieces of information that are either off or on to represent a zero or one. But a quantum bit, or qubit, can represent a zero or a one or any proportion of those two values at the same time. This makes it possible to perform computations in a very different way.
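In the standard notation used throughout quantum computing (a general textbook fact, not something specific to this article), that "proportion of both values" is written as a superposition:

```latex
|\psi\rangle \;=\; \alpha\,|0\rangle \;+\; \beta\,|1\rangle,
\qquad \alpha,\beta \in \mathbb{C},
\qquad |\alpha|^{2} + |\beta|^{2} = 1
```

Here the squared magnitudes of the amplitudes, |α|² and |β|², give the probabilities of reading out 0 or 1 when the qubit is measured, which is also why measurement destroys the superposition described next.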

However, a qubit can only do this so long as it remains in a special state known as superposition. This, along with other features of quantum behavior such as entanglement, could potentially allow quantum computing to answer all kinds of complex problems, many of which are exponential in nature. These are exactly the kind of problems that classical computers can't readily solve, if they can solve them at all.

For instance, much of the world's electronic privacy is based on encryption methods that rely on prime numbers. While it's easy to multiply two prime numbers, it's extremely difficult to reverse the process by factoring the product of two primes. In some cases, a classical computer could run for 10,000 years and still not find the solution. A quantum computer, on the other hand, might be capable of performing the work in seconds.
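A toy Python sketch makes the asymmetry tangible; the primes below are tiny compared with real cryptographic keys, which use primes hundreds of digits long, where this kind of brute-force search becomes hopeless.

```python
import math

p, q = 999_983, 1_000_003        # two small primes, chosen only for illustration
n = p * q                        # multiplying them is essentially instantaneous

def smallest_factor(n: int) -> int:
    """Naive trial division: the work grows roughly with the square root of n."""
    if n % 2 == 0:
        return 2
    for d in range(3, math.isqrt(n) + 1, 2):
        if n % d == 0:
            return d
    return n  # n itself is prime

f = smallest_factor(n)
print(f, n // f)  # recovers 999983 and 1000003, but only after ~half a million trial divisions
```

Even this 12-digit product already takes hundreds of thousands of divisions to undo; each additional digit multiplies the classical work, which is the exponential wall the article refers to.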

That doesn't mean quantum computing will replace all tasks performed by classical computers. This includes programming the quantum computers themselves, which the very nature of quantum behaviors can make highly challenging. For instance, just the act of observing a qubit can make it decohere, causing it to lose its superposition and entangled states.

Such challenges drive some of the work being done by Microsoft's Azure Quantum group. Expecting that both classical and quantum computing resources will be needed for large-scale quantum applications, Microsoft Quantum has developed a bridge they call QIR, which stands for quantum intermediate representation. The motivation behind QIR is to create a common interface at a point in the programming stack that avoids interfering with the qubits. Doing this makes the interface both language- and platform-agnostic, which allows different software and hardware to be used together.
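The role of an intermediate representation can be sketched conceptually in a few lines of Python. This is not QIR itself (which is built on LLVM); it is only an illustration of how a shared, language- and platform-agnostic layer decouples front ends from back ends.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Instruction:
    gate: str            # e.g. "h", "cnot", "measure"
    qubits: List[int]    # indices of the qubits the gate acts on

# Any front end (Q#, Python, etc.) lowers a program to the shared instruction list...
def frontend_bell_pair() -> List[Instruction]:
    return [
        Instruction("h", [0]),
        Instruction("cnot", [0, 1]),
        Instruction("measure", [0]),
        Instruction("measure", [1]),
    ]

# ...and any back end (a simulator or a hardware driver) consumes the same list.
def backend_describe(program: List[Instruction]) -> None:
    for inst in program:
        print(f"apply {inst.gate} to qubits {inst.qubits}")

backend_describe(frontend_bell_pair())
```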

"To advance the field of quantum computing, we need to think beyond just how to build a particular end-to-end system," said Bettina Heim, senior software engineering manager with Microsoft Quantum, during a recent presentation. "We need to think about how to grow a global ecosystem that facilitates developing and experimenting with different approaches."

Because these are still very early days (think of where classical computing was 75 years ago), many fundamental components still need to be developed and refined in this ecosystem, including quantum gates, algorithms and error correction. This is where PNNL's quantum simulator, DM-SIM, comes in. By designing and testing different approaches and configurations of these elements, they can discover better ways of achieving their goals.

As Krishnamoorthy explains: "What we currently lack and what we are trying to build with this simulation infrastructure is a turnkey solution that could allow, say, a compiler writer or a noise model developer or a systems architect, to try different approaches in putting qubits together and ask the question: if they do this, what happens?"

Of course, there will be many challenges and disappointments along the way, such as an upcoming retraction of a 2018 paper in the journal Nature. The original study, partly funded by Microsoft, declared evidence of a theoretical particle called a Majorana fermion, which could have been a major quantum breakthrough. However, errors since found in the data contradict that claim.

But progress continues, and once reasonably robust and scalable quantum computers are available, all kinds of potential uses could become possible. Supply chain and logistics optimization might be ideal applications, generating new levels of efficiency and energy savings for business. Since quantum computing should also be able to perform very fast searches on unsorted data, applications that focus on financial data, climate data analysis and genomics are likely uses, as well.

That's only the beginning. Quantum computers could be used to accurately simulate physical processes from chemistry and solid-state physics, ushering in a new era for these fields. Advances in materials science could become possible because we'll be better able to simulate and identify molecular properties much faster and more accurately than we ever could before. Simulating proteins using quantum computers could lead to new knowledge about biology that would revolutionize healthcare.

In the future, quantum cryptography may also become common, due to its potential for truly secure encrypted storage and communications. That's because it's impossible to precisely copy quantum data without violating the laws of physics. Such encryption will be even more important once quantum computers are commonplace, because their unique capabilities will also allow them to swiftly crack traditional methods of encryption, as mentioned earlier, rendering many currently robust methods insecure and obsolete.

As with many new technologies, it can be challenging to envisage all of the potential uses and problems quantum computing might bring about, which is one reason why business and industry need to become involved in its development early on. Adopting an interdisciplinary approach could yield all kinds of new ideas and applications and hopefully help to build what is ultimately a trusted and ethical technology.

"How do you all work together to make it happen?" asks Krishnamoorthy. "I think for at least the next couple of decades, for chemistry problems, for nuclear theory, etc., we'll need this hypothetical machine that everyone designs and programs for at the same time, and simulations are going to be crucial to that."

The future of quantum computing will bring enormous changes and challenges to our world. From how we secure our most critical data to unlocking the secrets of our genetic code, it's technology that holds the keys to applications, fields and industries we've yet to even imagine.


Physicists Need to Be More Careful with How They Name Things – Scientific American

In 2012, the quantum physicist John Preskill wrote, "We hope to hasten the day when well controlled quantum systems can perform tasks surpassing what can be done in the classical world." Less than a decade later, two quantum computing systems have met that mark: Google's Sycamore, and the University of Science and Technology of China's Jiuzhang. Both solved narrowly designed problems that are, so far as we know, impossible for classical computers to solve quickly. How quickly? How impossible? To solve a problem that took Jiuzhang 200 seconds, even the fastest supercomputers are estimated to take at least two billion years.

Describing what then may have seemed a far-off goal, Preskill gave it a name: quantum supremacy. In a blog post at the time, he explained, "I'm not completely happy with this term, and would be glad if readers could suggest something better."

We're not happy with it either, and we believe that the physics community should be more careful with its language, for both social and scientific reasons. Even in the abstruse realms of matter and energy, language matters because physics is done by people.

The word supremacy (having more power, authority or status than anyone else) is closely linked to "white supremacy." This isn't supposition; it's fact. The Corpus of Contemporary American English finds "white supremacy" is 15 times more frequent than the next most commonly used two-word phrase, "judicial supremacy." Though English is the global lingua franca of science, it is notable that the USTC team avoided "quantum supremacy" because in Chinese, the character meaning supremacy also has uncomfortable, negative connotations. The problem is not confined merely to English.

White supremacist movements have grown around the globe in recent years, especially in the United States, partly as a racist backlash to the Black Lives Matter movement. As Preskill has recently acknowledged, the word unavoidably evokes a repugnant political stance.

Quantum supremacy has also become a buzzword in popular media. Its suggestion of domination may have contributed to unjustified hype, such as the idea that quantum computers will soon make classical computers obsolete. Tamer alternatives such as "quantum advantage," "quantum computational supremacy" and even "quantum ascendancy" have been proposed, but none have managed to supplant Preskill's original term. More jargony proposals like Noisy Intermediate-Scale Quantum computing (NISQ) and tongue-in-cheek suggestions like "quantum non-uselessness" have similarly failed to displace "supremacy."

Here, we propose an alternative we believe succinctly captures the scientific implications with less hype and, crucially, no association with racism: quantum primacy.

What's in a name? It's not just that quantum supremacy by any other name would smell sweeter. By making the case for quantum primacy we hope to illustrate some of the social and scientific issues at hand. In President Joe Biden's letter to his science adviser, the biologist Eric Lander, he asks, "How can we ensure that Americans of all backgrounds are drawn into both the creation and the rewards of science and technology?" One small change can be in the language we use. GitHub, for example, abandoned the odious master/slave terminology after pressure from activists.

Were physics, computer science and engineering more diverse, perhaps we would not still be having this discussion, which one of us wrote about four years ago. But in the U.S., when only 2 percent of bachelor's degrees in physics are awarded to Black students, when Latinos comprise less than 7 percent of engineers, and women account for a mere 12 percent of full professors in physics, this is a conversation that needs to happen. As things stand, quantum supremacy can come across as adding insult to injury.

The nature of quantum computing, and its broad interest to the public outside of industry laboratories and academia means that the debate around quantum supremacy was inevitably going to be included in the broader culture war.

In 2019, a short correspondence to Nature argued that the quantum computing community should adopt different terminology to avoid overtones of "violence, neocolonialism and racism." Within days, the dispute was picked up by the conservative editorial pages of the Wall Street Journal, which attacked "quantum wokeness" and suggested that changing the term would be a slippery slope all the way down to cancelling Diana Ross' The Supremes.

The linguist Steven Pinker weighed in to argue that "the prissy banning of words by academics should be resisted. It dumbs down understanding of language: word meanings are conventions, not spells with magical powers, and all words have multiple senses, which are distinguished in context. Also, it makes academia a laughingstock, tars the innocent, and does nothing to combat actual racism & sexism."

It is true that supremacy is not a magic word, that its meaning comes from convention, not conjurers. But the context of quantum supremacy, which Pinker neglects, is that of a historically white, male-dominated discipline. Acknowledging this by seeking better language is a basic effort to be polite, not prissy.

Perhaps the most compelling argument raised in favor of quantum supremacy is that it could function to reclaim the word. Were quantum supremacy 15 times more common than white supremacy, the shoe would be on the other foot. Arguments for reclamation, however, must account for who is doing the reclaiming. If the charge to take back quantum supremacy were led by Black scientists and other underrepresented minorities in physics, that would be one thing. No survey exists, but anecdotal evidence suggests this is decidedly not the case.

To replace supremacy, we need to have a thoughtful conversation. Not any alternative will do, and there is genuinely tricky science at stake. Consider the implications of quantum advantage. An advantage might be a stepladder that makes it easier to reach a high shelf, or a small head start in a race. Some quantum algorithms are like this. Grover's search algorithm is only quadratically faster than its classical counterpart, so a quantum computer running Grover's algorithm might solve a problem that took classical computers 100 minutes in the square root of that time: 10 minutes. Not bad! That's definitely an advantage, especially as runtimes get longer, but it doesn't compare to some quantum speedups.
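In standard textbook scaling (stated here for illustration, not figures from the essay), "quadratically faster" means an unstructured search over N items drops from roughly N queries to roughly the square root of N:

```latex
T_{\text{classical}} \sim N, \qquad T_{\text{Grover}} \sim \sqrt{N}
\qquad\Longrightarrow\qquad
N = 10^{6}: \; 10^{6} \text{ queries classically vs. } \approx 10^{3} \text{ with Grover}
```

The 100-minutes-to-10-minutes figure above is this same square-root relationship.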

Perhaps the most famous quantum speedup comes from Shor's algorithm, which can find the factors of numbers (e.g. 5 and 3 are factors of 15) almost exponentially faster than the best classical algorithms. While classical computers are fine with small numbers, every digit takes a toll. For example, a classical computer might factor a 100-digit number in seconds, but a 1000-digit number would take billions of years. A quantum computer running Shor's algorithm could do it in an hour.
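The contrast comes from asymptotic cost. The standard, approximate comparison (a general result, not a figure from this essay) is sub-exponential time for the best known classical method, the general number field sieve, versus roughly polynomial time for Shor's algorithm, where n is the number being factored:

```latex
T_{\text{GNFS}}(n) \;=\; \exp\!\Big( \big(\tfrac{64}{9}\big)^{1/3} (\ln n)^{1/3} (\ln \ln n)^{2/3} \, (1 + o(1)) \Big),
\qquad
T_{\text{Shor}}(n) \;=\; O\!\big( (\log n)^{3} \big)
```

Because the classical cost grows with a fractional power of the digit count in the exponent while Shor's cost grows only polynomially in the digit count, adding digits quickly overwhelms classical machines but not, in principle, a sufficiently large quantum computer.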

When quantum computers can effectively do things that are impossible for classical computers, they have something much more than an advantage. We believe "primacy" captures much of this meaning. Primacy means preeminent position or the condition of being first. Additionally, it shares a Latin root (primus, or first) with mathematical terms such as prime and primality.

While quantum computers may be first to solve a specific problem, that does not imply they will dominate; we hope quantum primacy helps avoid the insinuation that classical computers will be obsolete. This is especially important because quantum primacy is a moving target. Classical computers and classical algorithms can and do improve, so quantum computers will have to get bigger and better to stay ahead.

These kinds of linguistic hotfixes do not reach even a bare minimum for diversifying science; the most important work involves hiring and retention and actual material changes to the scientific community to make it less white and male. But if opposition to improving the language of science is any indication about broader obstacles to diversifying it, this is a conversation we must have.

Physicists may prefer vacuums for calculation, but science does not occur in one. It is situated in the broader social and political landscape, one which both shapes and is shaped by the decisions of researchers.

This is an opinion and analysis article.
