
The global cloud computing market to grow at a CAGR of over 16% during the forecast period – Yahoo Finance UK

Global Cloud Computing Market: About this market

This cloud computing market analysis considers sales from SaaS, IaaS, and PaaS services. Our analysis also considers the sales of cloud computing in APAC, Europe, North America, South America, and MEA.

New York, Aug. 26, 2020 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Cloud Computing Market by Service and Geography - Forecast and Analysis 2019-2023" - https://www.reportlinker.com/p05816973/?utm_source=GNW In 2018, the SaaS segment had a significant market share, and this trend is expected to continue over the forecast period. The use of SaaS eliminates the expenses and complexities associated with purchasing, configuring, and managing hardware products. This will play a significant role in helping the SaaS segment maintain its market position. Also, our global cloud computing market report looks at factors such as the increased inclination toward cloud computing for cost-cutting, control of data backup and recovery, and increased use of containers boosting cloud adoption. However, system integration issues, network connectivity issues and latency, and problems associated with vendor lock-in may hamper the growth of the cloud computing industry over the forecast period.

Global Cloud Computing Market: Overview

Increased inclination toward cloud computing for cost-cutting

Enterprises are increasingly adopting cloud services for their computing needs as it helps them minimize their overall CAPEX. Similarly, organizations are utilizing public cloud resources while setting up infrastructure on-premises or in a private cloud through the deployment of hybrid cloud. Hybrid cloud solutions leverage the cost benefits and allow portability of applications between different clouds. Furthermore, SMEs and large-scale organizations are increasingly adopting cloud solutions as they provide security and reliability and result in optimum utilization of resources. Owing to the cost-saving benefits offered, the demand for cloud computing is expected to rise, which will lead to the expansion of the global cloud computing market at a CAGR of over 16% during the forecast period.

Rise in edge computing and shift toward serverless computing

Edge computing improves the server response time and ensures reduced latency. This network architecture is being implemented by large-scale enterprises with the advent of IoT devices and growing demand for the management of data generated by these devices. There is an increase in the demand for technologically advanced edge platforms due to growth in the velocity of data generation in the energy and telecommunication industries. The trend of edge data center deployments is expected to have a positive impact on the overall market growth.

Competitive Landscape

With the presence of a few major players, the global cloud computing market is moderately concentrated. This robust vendor analysis is designed to help clients improve their market position, and in line with this, the report provides a detailed analysis of a few leading cloud computing vendors, including Adobe Inc., Alibaba Cloud, Amazon Web Services Inc., Google LLC, Hewlett Packard Enterprise Development LP, IBM Corp., Microsoft Corp., Oracle Corp., Salesforce.com Inc., and SAP SE.

Also, the cloud computing market analysis report includes information on upcoming trends and challenges that will influence market growth. This is to help companies strategize and leverage all forthcoming growth opportunities. Read the full report: https://www.reportlinker.com/p05816973/?utm_source=GNW

About Reportlinker

ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.


Cloud Computing Market Witness the Growth of $832.1 Billion by 2025 – Press Release – Digital Journal

To provide detailed information about the key factors (drivers, restraints, opportunities, and industry-specific challenges) influencing the growth of the market.

This press release was originally distributed by SBWire

Northbrook, IL -- (SBWIRE) -- 08/26/2020 -- According to a research report "Cloud Computing Market by Service Model (Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS)), Deployment Model (Public and Private), Organization Size, Vertical, and Region - Global Forecast to 2025", the market size is expected to grow from USD 371.4 billion in 2020 to USD 832.1 billion by 2025, at a Compound Annual Growth Rate (CAGR) of 17.5% during the forecast period. The flexibility and agility of cloud-based models would support the IT service needs of enterprises. The leading CSPs/hyperscalers (Microsoft, Alphabet, IBM, and AWS) are expected to increase their CAPEX primarily for data center expansion to support the increasing workload for their internal and external stakeholders. The increasing volume of data generation in websites and mobile apps, the rising focus on delivering customer-centric applications for driving customer satisfaction, and the growing need to control and reduce Capital Expenditure (CAPEX) and Operational Expenditure (OPEX) are a few factors driving the growth of emerging technologies. Emerging technologies, such as big data, Artificial Intelligence (AI), and Machine Learning (ML), are gaining traction, which is ultimately leading to the growth of the cloud computing market globally.

Browse 360 market data Tables and 75 Figures spread through 320 Pages and in-depth TOC on "Cloud Computing Market - Global Forecast to 2025"

Download PDF Brochure: https://www.marketsandmarkets.com/pdfdownloadNew.asp?id=234

The sudden shutdown of offices, schools, colleges, and physical retail stores has massively disrupted operations; this has led to an increase in the demand for digital workplace tools and services, such as Zoom, Slack, Blackboard, Lynda, Canvas, Google Classroom, AnyMeeting, and Moodle. AWS, Microsoft, and Google host and manage all these applications in a public cloud environment. COVID-19 has also increased spending on cloud services by select industries: industries such as IT and ITeS, telecom, online retail/commerce, media, and BFSI are expected to increase spending on cloud-based services to sustain their business. Highly regulated and cash-rich industries, such as BFSI, are also expected to move selective workloads to public cloud environments.

Growth of IaaS to be driven by increasing need of enterprises to shift enterprise workloads to cloud

The key features of IaaS include automated administrative tasks, dynamic scaling, platform virtualization, and network connectivity. The ever-changing business environment and customer demands encourage enterprises to increase their focus on their core business operations. IaaS enables enterprises to leverage IT infrastructure without paying for the construction of the physical infrastructure. Moreover, it provides flexibility, mobility, easy and scalable access to applications, and enhanced collaboration to help enterprises focus on their core businesses.

Speak To Expert Analyst: https://www.marketsandmarkets.com/speaktoanalystNew.asp?id=234

Lower cost and increased security capabilities result in rising popularity of public cloud

The services offered over the public deployment model are either free or offered under a subscription model. The advantages of using the public cloud include simplicity and ease of deployment. Moreover, the initial investment required for deployment is minimal, and there are no responsibilities involved in managing the infrastructure.

North America to dominate the global cloud computing market in 2020

North America is a mature market in terms of cloud computing services adoption, owing to a large presence of enterprises with advanced IT infrastructure and the availability of technical expertise. The BFSI, IT and telecommunications, and government and public sector verticals are the major adopters of cloud computing services. As the benefits of adopting cloud computing services become more evident, more companies are expected to implement them. The US and Canada are the top countries contributing to the growth of the cloud computing market in North America.

The report also studies various growth strategies, such as mergers and acquisitions, partnerships and collaborations, and developments, adopted by the major players to expand their presence in the global cloud computing market. The cloud computing market includes major vendors, such as AWS (US), Microsoft (US), Google (US), Alibaba (China), SAP (Germany), IBM (US), Oracle (US), VMware (US), Rackspace (US), Salesforce (US), Adobe (US), CenturyLink (US), Fujitsu (Japan), Workday (US), Infor (US), Sage Group (UK), Intuit (US), Epicor (US), IFS (Sweden), ServiceNow (US), OpenText (US), Cisco (US), Box (US), Zoho (US), Citrix (US), Upland Software (US), DigitalOcean (US), Bluelock (US), OVH (France), Joyent (US), Skytap (US), Virtuestream (US), Tencent (China), DXC (US), NEC (Japan), and Navisite (US).

About MarketsandMarkets

MarketsandMarkets provides quantified B2B research on 30,000 high-growth niche opportunities/threats that will impact 70% to 80% of worldwide companies' revenues. It currently serves 7,500 customers worldwide, including 80% of global Fortune 1000 companies as clients. Almost 75,000 top officers across eight industries worldwide approach MarketsandMarkets for their pain points around revenue decisions.

Our 850 full-time analysts and SMEs at MarketsandMarkets are tracking global high-growth markets following the "Growth Engagement Model GEM". The GEM aims at proactive collaboration with clients to identify new opportunities, identify the most important customers, write "Attack, avoid and defend" strategies, and identify sources of incremental revenues for both the company and its competitors. MarketsandMarkets is now coming up with 1,500 MicroQuadrants (positioning top players across leaders, emerging companies, innovators, and strategic players) annually in high-growth emerging segments. MarketsandMarkets is determined to benefit more than 10,000 companies this year with their revenue planning and help them take their innovations/disruptions early to the market by providing them research ahead of the curve.

MarketsandMarkets's flagship competitive intelligence and market research platform, "Knowledgestore", connects over 200,000 markets and entire value chains for a deeper understanding of unmet insights, along with market sizing and forecasts of niche markets.

Contact:
Mr. Aashish Mehra
MarketsandMarkets INC.
630 Dundee Road, Suite 430
Northbrook, IL 60062
USA: 1-888-600-6441
sales@marketsandmarkets.com

For more information on this press release visit: http://www.sbwire.com/press-releases/cloud-computing-market-witness-the-growth-of-8321-billion-by-2025-1301371.htm


Worldwide To Maintain The Momentum In The Healthcare Cloud Computing Market Between 2025 – Scientect

The research report on the healthcare cloud computing market includes current market scenario analysis as well as a revised forecast for a period of eight years. According to a recent market report published by Persistence Market Research titled "Healthcare Cloud Computing Market: Global Industry Analysis (2012-2016) and Forecast (2017-2025)", the global healthcare cloud computing market is anticipated to be valued at US$ 7,791.4 Mn in 2025, and is expected to register a CAGR of 18.9% from 2017 to 2025.

Increasing demand for better healthcare facilities and rising investments by healthcare IT players are major factors driving the growth of the global healthcare cloud computing market.

Get Sample Copy of Report @ https://www.persistencemarketresearch.com/samples/19390

Company Profiles

Get To Know Methodology of Report @ https://www.persistencemarketresearch.com/methodology/19390

Cloud refers to a paradigm in which data is permanently stored on servers and accessed by clients through different information systems, such as computers, sensors, laptops, and others. Cloud computing refers to the process of delivering hosted services to clients.

Global Healthcare Cloud Computing Market: Segmentation & Forecast

The global healthcare cloud computing market is categorized on the basis of application, deployment model, component, service model, and region. On the basis of application, the market is segmented into CIS and NCIS. The CIS segment is anticipated to register a CAGR of 20.3% during the forecast period.

By component, the market is segmented into software, hardware, and services. On the basis of deployment model, the market is segmented into public cloud, private cloud, and hybrid cloud. The private cloud segment accounted for the highest market share and was valued at US$ 2,504 Mn in 2016.

By service model, the market is segmented into SaaS, PaaS, and IaaS. The SaaS segment is poised to be highly lucrative in the coming years. This segment is estimated to reach a value of about US$ 25.4 Bn by 2025 end and is the fastest-growing segment, registering an exponential CAGR of 19.7% throughout the period of assessment, 2017-2025.

The PaaS segment is the smallest segment, with a low estimate of US$ 360.3 Mn in 2017, and is expected to reach US$ 1.2 Bn by 2025 end. By component, the software segment is projected to grow at a higher rate, registering a value CAGR of 19.2% throughout the forecast period, and is the largest segment in terms of value share. It is estimated to reach a valuation of more than US$ 21 Bn by the end of 2025.

Access Full Report @ https://www.persistencemarketresearch.com/checkout/19390

Global Healthcare Cloud Computing Market: Regional Forecast

This report also covers drivers, restraints and trends driving each segment and offers analysis and insights regarding the potential of healthcare cloud computing market in regions including North America, Latin America, Europe, Asia Pacific, and Middle East and Africa.

Among these regions, North America accounted for the largest market share in 2016. Moreover, the North America region is also expected to register a healthy CAGR of 19.6% during the forecast period.

Explore Extensive Coverage of PMR's Life Sciences & Transformational Health Landscape

Persistence Market Research (PMR) is a third-platform research firm. Our research model is a unique collaboration of data analytics and market research methodology to help businesses achieve optimal performance.

To support companies in overcoming complex business challenges, we follow a multi-disciplinary approach. At PMR, we unite various data streams from multi-dimensional sources. By deploying real-time data collection, big data, and customer experience analytics, we deliver business intelligence for organizations of all sizes.

Our client success stories feature a range of clients, from Fortune 500 companies to fast-growing startups. PMR's collaborative environment is committed to building industry-specific solutions by transforming data from multiple streams into a strategic asset.

Contact us:

Ashish Kolte
Persistence Market Research
Address: 305 Broadway, 7th Floor, New York City, NY 10007, United States
U.S. Ph.: +1-646-568-7751
USA-Canada Toll-free: +1 800-961-0353
Sales: [emailprotected]
Website: https://www.persistencemarketresearch.com


Impact of COVID-19 Outbreak on Global Cloud Computing in K-12 Market 2020 Industry Size, Advancement Strategy, Top Players, SWOT Analysis and 2025…

The Global Cloud Computing in K-12 Market Research Report 2020-2025 is designed to cover micro-level analysis by manufacturers and key business segments. The Global Cloud Computing in K-12 Market survey analysis offers insights to determine and study market size, market prospects, and the competitive landscape. The research is derived from primary and secondary statistical sources and comprises both qualitative and quantitative detailing.

Get Sample Copy of This Report @https://www.orianresearch.com/request-sample/1525098

Impact of COVID-19 Outbreak on this Market:

The rise of COVID-19 has brought the world to a halt. We comprehend that this health crisis has had an unprecedented impact on organizations across industries. However, this too shall pass. Rising support from governments and several companies can help in the battle against this highly contagious disease. A few industries are struggling while some are thriving, but almost every organization is anticipated to be impacted by the pandemic.

We are making continuous efforts to help your business continue and grow through the COVID-19 pandemic. In light of our experience and expertise, we will offer you an impact analysis of the coronavirus outbreak across industries to help you prepare for the future.

Key players in the global Cloud Computing in K-12 market include: Adobe Systems, Blackboard, Cisco, Ellucian, Dell EMC, Instructure, Microsoft, NetApp, Oracle, Salesforce, SA

Market segmentation, by product types: SaaS, IaaS, Paa

Market segmentation, by applications: Training & Consulting, Integration & Migration, Support & Maintenanc

Target Audience:
* Cloud Computing in K-12 Manufacturers
* Traders, Importers, and Exporters
* Raw Material Suppliers and Distributors
* Research and Consulting Firms
* Government and Research Organizations
* Associations and Industry Bodies

Order Copy of this Report @https://www.orianresearch.com/checkout/1525098

Research Methodology:

The research methodology used to forecast and estimate the global Cloud Computing in K-12 market consists of primary and secondary research methods. The primary research includes detailed interviews with authoritative personnel such as directors, CEOs, executives, and VPs. Sales, values, capacity, revenue, regional market examination, segment-wise insights, and market forecasts, including the technical growth scenario, consumer behavior, end-use trends and dynamics, and production capacity, were taken into consideration. Different weightages were allotted to these parameters, and their market impacts were evaluated using weighted average analysis to derive the market growth rate.

The market estimates and industry forecast have been confirmed through exhaustive primary research with the Key Industry Participants (KIPs), which typically include:
* Manufacturers
* Suppliers
* Distributors
* Government Bodies & Associations
* Research Institutes

Customization Service of the Report:

Orian Research provides customization of reports as per your needs. This report can be personalized to meet all your requirements. If you have any questions, get in touch with our sales team, who will ensure you get a report that suits your necessities.

Seeking to initiate fruitful business relationships with you!

About Us

Orian Research is one of the most comprehensive collections of market intelligence reports on the World Wide Web. Our reports repository boasts over 500,000 industry and country research reports from over 100 top publishers. We continuously update our repository so as to provide our clients easy access to the world's most complete and current database of expert insights on global industries, companies, and products. We also specialize in custom research in situations where our syndicated research offerings do not meet the specific requirements of our esteemed clients.

Contact Us:

Ruwin Mendez

Vice President Global Sales & Partner Relations

Orian Research Consultants

US: +1 (415) 830-3727 | UK: +44 020 8144-71-27

Email: [emailprotected]


Financial Governance for Big Data in the Cloud – Express Computer

By Ashish Dubey, VP Solutions Architecture, Qubole

As more and more businesses move from on-premise infrastructure to the cloud to benefit from cost savings, efficiency, speed and a democratization of data access, CFOs are glancing nervously at the rising cost of cloud overruns. Cloud sprawl, an unfettered proliferation of your organization's cloud instances due to a lack of real-time control over cloud computing resources, is a real problem plaguing data-driven organizations.

While recent competition between the various cloud service providers (Amazon, Google and Microsoft) has benefited customers, failing financial governance of cloud computing costs threatens to dilute the delta gains companies have been experiencing as a result of harnessing the potential of big data, such as improved customer experience and product development roadmaps.

The Case for Strong Financial Governance

A marked difference between on-premise infrastructure costs (large upfront commitments for long-term savings) and cloud infrastructure is the on-demand, per-instance usage of cloud computing resources. A rather simplified comparison is signing up for a highly optimized data package from your ISP but burning through large usage pools of bandwidth without real-time checks and filters. This can lead to unwanted surprises in your cloud bill. Governance is what keeps the checks and balances in place and is essentially a series of everyday tasks that are critical to keeping accountability and control over cloud spend.

Moving to the cloud has fewer risks today. The move, done with proper planning and POCs, is easy and not very time consuming. Most cloud payment models are pay-as-you-go and on-demand, so organizations do not see a hefty upfront bill. However, as cloud projects mature, use cases and instances get layered and more complex, and the danger of a runaway cloud bill goes up. It's easy to bucket some of the reasons for this. As application requests are not known in advance, server allocations are made in advance, thereby increasing server running time. Most web applications are engineered to reduce latency (for better customer experience) rather than costs. This means we have to forego the on-demand advantage that cloud servers allow for changing workloads, and it leads to poor performance optimization.

While most applications are designed assuming a gradual increase and decrease of data processing requirements, in the real world data can exhibit burstiness, which sharply increases the need for more servers. This is complementary to the concept of idle time as well: most web applications have a steady traffic flow, but large workloads can be scattered through the day, leading to idle times when usage is much lower.
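
To make the idle-time point concrete, here is a rough, hypothetical costing of a bursty workload on a statically provisioned cluster. All figures (instance price, node count, utilization profile) are assumptions for illustration, not numbers from the article:

```python
# Hypothetical illustration: the cost of idle capacity on a bursty workload.
# All numbers (instance price, cluster size, utilization) are assumptions.

HOURLY_RATE = 2.50          # assumed on-demand price per instance-hour (USD)
PROVISIONED_NODES = 40      # nodes kept running all month to absorb bursts
HOURS_PER_MONTH = 730

busy_hours_per_day = 6      # assumed burst window
busy_utilization = 0.9      # fraction of provisioned nodes doing useful work
quiet_utilization = 0.15

def useful_fraction():
    """Average fraction of paid-for capacity that is actually used."""
    busy = busy_hours_per_day / 24
    return busy * busy_utilization + (1 - busy) * quiet_utilization

total = PROVISIONED_NODES * HOURLY_RATE * HOURS_PER_MONTH
wasted = total * (1 - useful_fraction())

print(f"Total monthly bill:  ${total:,.0f}")
print(f"Spend on idle nodes: ${wasted:,.0f} ({1 - useful_fraction():.0%} of the bill)")
```

Under these assumed numbers, roughly two-thirds of the bill pays for capacity that sits idle, which is exactly the gap governance is meant to expose.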

How Can Organisations Strengthen Financial Governance?

One IDC report showed that cloud infrastructure spending went up 23.8% in 2019 from the year before, and estimates that it could go up by as much as another 43% by 2022. Whilst traceability and predictability are important elements of financial governance policies, cost control and expense reduction are usually the starting focus of any financial governance exercise. There are ways for organizations to mitigate these costs:

* Optimize for performance while accounting for the speed of query execution as well as the timeliness of execution.
* Prioritize capacity management as an ongoing exercise.
* Build systems that let teams build faster by not worrying about unexpected costs.
* Put financial guard rails in place to maintain traceability and predictability of cost metrics at the user, cluster and job level.

Adopting Data Platforms with Built-In Financial Governance Metrics

More mature applications can leverage modern technology platforms involving AI/ML to drive stronger governance. Enterprises should look at platforms which enable Workload-Aware Autoscaling in order to strengthen financial governance within an organization. This will help support multiple teams running big data in a shared cloud environment or in separate ones which can be combined to deliver more savings without compromising performance. Additionally, it also needs to include the strong tenets of Optimized Upscaling to reclaim and reallocate unused resources, Aggressive Downscaling to prevent cost overruns due to idle nodes, Container Packing as a resource allocation strategy, and Diversified Spot, which reduces the chances of bulk interruption of Spot nodes by your cloud provider.
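
As an illustration of what an aggressive-downscaling policy can look like, here is a minimal sketch; the data structure, thresholds and selection logic are assumptions for illustration, not Qubole's actual implementation:

```python
# Hypothetical downscaling policy: release nodes that have been idle longer
# than a grace period, while never shrinking below a minimum cluster size.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: str
    idle_minutes: float      # time since the node last ran a task
    running_tasks: int

def select_nodes_to_release(nodes, min_cluster_size=2, idle_grace_minutes=10):
    """Return nodes that are safe to decommission."""
    idle = [n for n in nodes
            if n.running_tasks == 0 and n.idle_minutes >= idle_grace_minutes]
    max_removable = max(0, len(nodes) - min_cluster_size)   # respect the floor
    idle.sort(key=lambda n: n.idle_minutes, reverse=True)   # longest-idle first
    return idle[:max_removable]

cluster = [Node("n1", 25, 0), Node("n2", 3, 0), Node("n3", 0, 4), Node("n4", 40, 0)]
for node in select_nodes_to_release(cluster):
    print(f"Downscaling {node.node_id} (idle {node.idle_minutes:.0f} min)")
```

The design point is simply that the decision is made continuously from live utilization data rather than from a static capacity plan.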

To summarise, it is important to have granular visibility of your infrastructure spend at the job, cluster, or cluster-instance level. This brings myriad benefits: tracking costs, monitoring show-back, justifying business plans, preparing budgets, and building ROI analyses. The foundation of robust financial governance of data costs is built by giving data teams the tools to view and correct infrastructure needs immediately, irrespective of application complexity and data analytics requirements.



In Depth Analysis and Survey of COVID-19 Pandemic Impact on Global Cloud Computing in Retail Banking Market Report 2020 Key Players Amazon Web…

The report is a detailed study of the Cloud Computing in Retail Banking market, which covers all the essential information required by a new market entrant as well as the existing players to gain a deeper understanding of the market. The primary objective of this research report on the Cloud Computing in Retail Banking market is to help make reliable strategic decisions regarding the opportunities in the Cloud Computing in Retail Banking market. It offers business accounts, industry investors, and industry segments consequential insights that enhance decision-making ability. For queries, write to [emailprotected] or call us on +1-312-376-8303.

Request Report from CMR Website:https://cognitivemarketresearch.com/servicesoftware/cloud-computing-in-retail-banking-market-report

Global and Regional Cloud Computing in Retail Banking Market Segmentation by Type: Public Clouds, Private Clouds, Hybrid Clouds

Global Cloud Computing in Retail Banking Market Segmentation by Applications: Personal, Family, Small and Medium-Sized Enterprises (SMES)

Major Market Players with an in-depth analysis: Amazon Web Services (AWS), Ellie Mae, IBM, Infosys, Intuit, Medidata, Microsoft, Oracle, Salesforce, SAP, TCS, Veeva Systems, Wipro, Workday, BBVA, Bankinter, Intel, Google, Alibaba, Tencent, Kingsoft, Ucloud, Baidu, Huawei, China Telecom, China Unicom

Request Free Sample Copy of Cloud Computing in Retail Banking Market Research [emailprotected] https://cognitivemarketresearch.com/servicesoftware/cloud-computing-in-retail-banking-market-report#download_report

The Cloud Computing in Retail Banking market report presents the current state of the market around the world. The report starts with the market outline and key components of the Cloud Computing in Retail Banking market, which play a significant role in helping clients make business decisions. It also offers the key points needed to boost growth in the Cloud Computing in Retail Banking market. Some fundamental concepts are also covered by the report, such as product definition, applications, industry value chain structure, and segmentation, which help the client analyze the market easily. In addition, the report covers various factors, such as policy, economic, and technological factors, that are affecting the Cloud Computing in Retail Banking business and market dynamics.

Any query? Enquire here for a discount (COVID-19 Impact Analysis Updated Sample): Download Sample Report of Cloud Computing in Retail Banking Market Report 2020 (Coronavirus Impact Analysis on Cloud Computing in Retail Banking Market)

The research comprises primary information about the products. Similarly, it includes supply-demand statistics and the factors that constrain the growth of the industry. It also includes the raw materials used and the manufacturing process of the Cloud Computing in Retail Banking market. Additionally, the report provides market drivers, challenges, and opportunities for the overall market in the particular regional sections.

A competitive analysis has been done to understand the overall market, which will be helpful for decision making. Major players involved in the Cloud Computing in Retail Banking market have been completely profiled, along with their SWOT analyses. Some of the key players include Amazon Web Services (AWS), Ellie Mae, IBM, Infosys, Intuit, Medidata, Microsoft, Oracle, Salesforce, SAP, TCS, Veeva Systems, Wipro, Workday, BBVA, Bankinter, Intel, Google, Alibaba, Tencent, Kingsoft, Ucloud, Baidu, Huawei, China Telecom, China Unicom. This helps in understanding their strategies and activities, and the business strategy described for every company gives an idea of that company's current direction. The industry intelligence study of the Cloud Computing in Retail Banking market covers the estimated size of the market in terms of both value (Mn/Bn USD) and volume (tons). The report includes a detailed chapter on COVID-19 and its impact on this market. Additionally, it covers changing consumer behavior due to the outbreak of COVID-19.

Further, the report includes Porter's Five Forces and BCG matrix analyses, as well as product life cycle analysis, to help you make well-informed decisions. Additionally, this report covers an in-depth statistical examination of the market dynamics and demand, which gives a complete picture of the business.

Regional Analysis for Cloud Computing in Retail Banking:
North America (United States, Canada)
Europe (Germany, Spain, France, UK, Russia, and Italy)
Asia-Pacific (China, Japan, India, Australia, and South Korea)
Latin America (Brazil, Mexico, etc.)
The Middle East and Africa (GCC and South Africa)

DOWNLOAD FREE SAMPLE [emailprotected]: https://cognitivemarketresearch.com/servicesoftware/cloud-computing-in-retail-banking-market-report#download_report

Chapters defined in the TOC (Table of Content) of the report:
Chapter 1: Market Overview, Drivers, Restraints and Opportunities, Segmentation Overview
Chapter 2: COVID Impact
Chapter 3: Market Competition by Manufacturers
Chapter 4: Production by Regions
Chapter 5: Consumption by Regions
Chapter 6: Production, by Types, Revenue and Market Share by Types
Chapter 7: Consumption, by Applications, Market Share (%) and Growth Rate by Applications
Chapter 8: Complete Profiling and Analysis of Manufacturers
Chapter 9: Manufacturing Cost Analysis, Raw Materials Analysis, Region-wise Manufacturing Expenses
Chapter 10: Industrial Chain, Sourcing Strategy and Downstream Buyers
Chapter 11: Marketing Strategy Analysis, Distributors/Traders
Chapter 12: Market Effect Factors Analysis
Chapter 13: Market Forecast
Chapter 14: Cloud Computing in Retail Banking Research Findings and Conclusion, Appendix, Methodology and Data Source
To check the complete Table of Content, click here: https://cognitivemarketresearch.com/servicesoftware/cloud-computing-in-retail-banking-market-report#table_of_contents

The qualitative contents for geographical analysis will cover market trends in each region and country which includes highlights of the key players operating in the respective region/country, PEST analysis of each region which includes political, economic, social and technological factors influencing the growth of the market. The research report includes specific segments by Type and by Application. This study provides information about the sales and revenue during the historic and forecasted period of 2015 to 2027.

About Us: Cognitive Market Research is one of the finest and most efficient Market Research and Consulting firms. The company strives to provide research studies that include syndicated research, customized research, round-the-clock assistance service, monthly subscription services, and consulting services to our clients.

Contact Us: +1-312-376-8303
Email: [emailprotected]
Web: https://www.cognitivemarketresearch.com/

Download the Entire Report: https://cognitivemarketresearch.com/servicesoftware/cloud-computing-in-retail-banking-market-report


Fermilab to lead $115 million National Quantum Information Science Research Center to build revolutionary quantum computer with Rigetti Computing,…

One of the goals of the Superconducting Quantum Materials and Systems Center is to build a beyond-state-of-the-art quantum computer based on superconducting technologies. The center also will develop new quantum sensors, which could lead to the discovery of the nature of dark matter and other elusive subatomic particles.

The U.S. Department of Energy's Fermilab has been selected to lead one of five national centers to bring about transformational advances in quantum information science as a part of the U.S. National Quantum Initiative, the White House Office of Science and Technology Policy, the National Science Foundation and the U.S. Department of Energy announced today.

The initiative provides the new Superconducting Quantum Materials and Systems Center funding with the goal of building and deploying a beyond-state-of-the-art quantum computer based on superconducting technologies. The center also will develop new quantum sensors, which could lead to the discovery of the nature of dark matter and other elusive subatomic particles. Total planned DOE funding for the center is $115 million over five years, with $15 million in fiscal year 2020 dollars and outyear funding contingent on congressional appropriations. SQMS will also receive an additional $8 million in matching contributions from center partners.

The SQMS Center is part of a $625 million federal program to facilitate and foster quantum innovation in the United States. The 2018 National Quantum Initiative Act called for a long-term, large-scale commitment of U.S. scientific and technological resources to quantum science.

The revolutionary leaps in quantum computing and sensing that SQMS aims for will be enabled by a unique multidisciplinary collaboration that includes 20 partners: national laboratories, academic institutions and industry. The collaboration brings together world-leading expertise in all key aspects: from identifying qubit quality limitations at the nanometer scale, to fabrication and scale-up capabilities for multiqubit quantum computers, to the exploration of new applications enabled by quantum computers and sensors.

The breadth of the SQMS physics, materials science, device fabrication and characterization technology combined with the expertise in large-scale integration capabilities by the SQMS Center is unprecedented for superconducting quantum science and technology, said SQMS Deputy Director James Sauls of Northwestern University. As part of the network of National QIS Research centers, SQMS will contribute to U.S. leadership in quantum science for the years to come.

SQMS researchers are developing long-coherence-time qubits based on Rigetti Computing's state-of-the-art quantum processors. Image: Rigetti Computing

At the heart of SQMS research will be solving one of the most pressing problems in quantum information science: the length of time that a qubit, the basic element of a quantum computer, can maintain information, also called quantum coherence. Understanding and mitigating sources of decoherence that limit the performance of quantum devices is critical to engineering next-generation quantum computers and sensors.
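
For a rough sense of why coherence time matters (a textbook-style illustration, not analysis from the article): a common simplified model treats a qubit's remaining coherence as decaying exponentially with a characteristic time T2, so the longer T2 is, the more gate operations fit in before the stored information is effectively lost. The coherence times and gate duration below are assumed round numbers, not measured SQMS values:

```python
# Simplified illustration: exponential loss of qubit coherence over time.
# The coherence times and gate duration are assumed round numbers,
# not measured values from the SQMS Center.
import math

def coherence_remaining(t_us, t2_us):
    """Fraction of coherence left after t_us microseconds, for a given T2."""
    return math.exp(-t_us / t2_us)

gate_time_us = 0.05                  # assume one gate takes 50 nanoseconds
n_gates = 1000
for t2_us in (10, 100, 1000):        # assumed coherence times in microseconds
    left = coherence_remaining(n_gates * gate_time_us, t2_us)
    print(f"T2 = {t2_us:>4} us -> after {n_gates} gates, ~{left:.1%} coherence remains")
```

Under this toy model, extending coherence from tens to hundreds of microseconds is the difference between a computation that collapses almost immediately and one that survives a useful number of operations.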

Unless we address and overcome the issue of quantum system decoherence, we will not be able to build quantum computers that solve new complex and important problems. The same applies to quantum sensors with the range of sensitivity needed to address long-standing questions in many fields of science, said SQMS Center Director Anna Grassellino of Fermilab. Overcoming this crucial limitation would allow us to have a great impact in the life sciences, biology, medicine, and national security, and enable measurements of incomparable precision and sensitivity in basic science.

The SQMS Center's ambitious goals in computing and sensing are driven by Fermilab's achievement of world-leading coherence times in components called superconducting cavities, which were developed for particle accelerators used in Fermilab's particle physics experiments. Researchers have expanded the use of Fermilab cavities into the quantum regime.

We have the most coherent (by a factor of more than 200) 3-D superconducting cavities in the world, which will be turned into quantum processors with unprecedented performance by combining them with Rigetti's state-of-the-art planar structures, said Fermilab scientist Alexander Romanenko, SQMS technology thrust leader and Fermilab SRF program manager. This long coherence would not only enable qubits to be long-lived, but it would also allow them to be all connected to each other, opening qualitatively new opportunities for applications.

The SQMS Center's goals in computing and sensing are driven by Fermilab's achievement of world-leading coherence times in components called superconducting cavities, which were developed for particle accelerators used in Fermilab's particle physics experiments. Photo: Reidar Hahn, Fermilab

To advance the coherence even further, SQMS collaborators will launch a materials-science investigation of unprecedented scale to gain insights into the fundamental limiting mechanisms of cavities and qubits, working to understand the quantum properties of superconductors and other materials used at the nanoscale and in the microwave regime.

Now is the time to harness the strengths of the DOE laboratories and partners to identify the underlying mechanisms limiting quantum devices in order to push their performance to the next level for quantum computing and sensing applications, said SQMS Chief Engineer Matt Kramer, Ames Laboratory.

Northwestern University, Ames Laboratory, Fermilab, Rigetti Computing, the National Institute of Standards and Technology, the Italian National Institute for Nuclear Physics and several universities are partnering to contribute world-class materials science and superconductivity expertise to target sources of decoherence.

SQMS partner Rigetti Computing will provide crucial state-of-the-art qubit fabrication and full stack quantum computing capabilities required for building the SQMS quantum computer.

By partnering with world-class experts, our work will translate ground-breaking science into scalable superconducting quantum computing systems and commercialize capabilities that will further the energy, economic and national security interests of the United States, said Rigetti Computing CEO Chad Rigetti.

SQMS will also partner with the NASA Ames Research Center quantum group, led by SQMS Chief Scientist Eleanor Rieffel. Their strengths in quantum algorithms, programming and simulation will be crucial to use the quantum processors developed by the SQMS Center.

The Italian National Institute for Nuclear Physics has been successfully collaborating with Fermilab for more than 40 years and is excited to be a member of the extraordinary SQMS team, said INFN President Antonio Zoccoli. With its strong know-how in detector development, cryogenics and environmental measurements, including the Gran Sasso national laboratories, the largest underground laboratory in the world devoted to fundamental physics, INFN looks forward to exciting joint progress in fundamental physics and in quantum science and technology.

Fermilab is excited to host this National Quantum Information Science Research Center and work with this extraordinary network of collaborators, said Fermilab Director Nigel Lockyer. This initiative aligns with Fermilab and its mission. It will help us answer important particle physics questions, and, at the same time, we will contribute to advancements in quantum information science with our strengths in particle accelerator technologies, such as superconducting radio-frequency devices and cryogenics.

We are thankful and honored to have this unique opportunity to be a national center for advancing quantum science and technology, Grassellino said. We have a focused mission: build something revolutionary. This center brings together the right expertise and motivation to accomplish that mission.

The Superconducting Quantum Materials and Systems Center at Fermilab is supported by the DOE Office of Science.

Fermilab is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.


I confess, I’m scared of the next generation of supercomputers – TechRadar

Earlier this year, a Japanese supercomputer built on Arm-based Fujitsu A64FX processors snatched the crown of world's fastest machine, blowing incumbent leader IBM Summit out of the water.

Fugaku, as the machine is known, achieved 415.5 petaFLOPS by the popular High Performance Linpack (HPL) benchmark, which is almost three times the score of the IBM machine (148.5 petaFLOPS).

It also topped the rankings for Graph 500, HPL-AI and HPCG workloads - a feat never before achieved in the world of high performance computing (HPC).

Modern supercomputers are now edging ever-closer to the landmark figure of one exaFLOPS (equal to 1,000 petaFLOPS), commonly described as the exascale barrier. In fact, Fugaku itself can already achieve one exaFLOPS, but only in lower precision modes.

The consensus among the experts we spoke to is that a single machine will breach the exascale barrier within the next 6 - 24 months, unlocking a wealth of possibilities in the fields of medical research, climate forecasting, cybersecurity and more.

But what is an exaFLOPS? And what will it mean to break the exascale milestone, pursued doggedly for more than a decade?

To understand what it means to achieve exascale computing, it's important to first understand what is meant by FLOPS, which stands for floating point operations per second.

A floating point operation is any mathematical calculation (i.e. addition, subtraction, multiplication or division) that involves a number containing a decimal (e.g. 3.0 - a floating point number), as opposed to a number without a decimal (e.g. 3 - a binary integer). Calculations involving decimals are typically more complex and therefore take longer to solve.

An exascale computer can perform 10^18 (one quintillion, or 1,000,000,000,000,000,000) of these mathematical calculations every second.

For context, to equal the number of calculations an exascale computer can process in a single second, an individual would have to perform one sum every second for 31,688,765,000 years.

The PC I'm using right now, meanwhile, is able to reach 147 billion FLOPS (or 0.00000014723 exaFLOPS), outperforming the fastest supercomputer of 1993, the Intel Paragon (143.4 billion FLOPS).
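
These back-of-the-envelope conversions are easy to check. A small sketch using the figures quoted above (inputs are rounded, so the outputs agree only approximately with the article's numbers):

```python
# Reproduce the back-of-the-envelope figures quoted above (rounded inputs).
EXA = 10**18                       # FLOPS in one exaFLOPS
PETA = 10**15
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# One hand calculation per second versus one second of an exascale machine:
print(f"Years of one sum per second: {EXA / SECONDS_PER_YEAR:,.0f}")   # ~31.7 billion

# A desktop PC at roughly 147 billion FLOPS, expressed in exaFLOPS:
print(f"PC performance: {147e9 / EXA:.10f} exaFLOPS")                  # ~0.000000147

# Fugaku's HPL score relative to the exascale barrier and to IBM Summit:
fugaku, summit = 415.5 * PETA, 148.5 * PETA
print(f"Fugaku: {fugaku / EXA:.4f} exaFLOPS, {fugaku / summit:.1f}x Summit")
```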

This both underscores how far computing has come in the last three decades and puts into perspective the extreme performance levels attained by the leading supercomputers today.

The key to building a machine capable of reaching one exaFLOPS is optimization at the processing, storage and software layers.

The hardware must be small and powerful enough to pack together and reach the necessary speeds, the storage capacious and fast enough to serve up the data and the software scalable and programmable enough to make full use of the hardware.

For example, there comes a point at which adding more processors to a supercomputer will no longer affect its speed, because the application is not sufficiently optimized. The only way governments and private businesses will realize a full return on HPC hardware investment is through an equivalent investment in software.
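
A standard way to see these diminishing returns (a general illustration, not a model used in the article) is Amdahl's law: if some fraction of an application is inherently serial, the achievable speedup saturates no matter how many processors are added.

```python
# Amdahl's law: speedup = 1 / (serial_fraction + (1 - serial_fraction) / processors).
# The serial fraction below is an assumed figure, purely for illustration.
def amdahl_speedup(processors, serial_fraction):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

serial_fraction = 0.05   # assume 5% of the application cannot be parallelized
for p in (10, 100, 1_000, 100_000):
    print(f"{p:>7} processors -> {amdahl_speedup(p, serial_fraction):5.1f}x speedup")
# The speedup can never exceed 1 / 0.05 = 20x, no matter how many processors
# are added - which is why software optimization matters as much as hardware.
```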

Organizations such as the Exascale Computing Project (ECP) and the ExCALIBUR programme are interested in solving precisely this problem. Those involved claim a renewed focus on algorithm and application development is required in order to harness the full power and scope of exascale.

Achieving the delicate balance between software and hardware, in an energy efficient manner and avoiding an impractically low mean time between failures (MTBF) score (the time that elapses before a system breaks down under strain) is the challenge facing the HPC industry.

15 years ago as we started the discussion on exascale, we hypothesized that it would need to be done in 20 mega-watts (MW); later that was changed to 40 MW. With Fugaku, we see that we are about halfway to a 64-bit exaFLOPS at the 40 MW power envelope, which shows that an exaFLOPS is in reach today, explained Brent Gorda, Senior Director HPC at UK-based chip manufacturer Arm.

We could hit an exaFLOPS now with sufficient funding to build and run a system. [But] the size of the system is likely to be such that MTBF is measured in single digit number-of-days based on today's technologies and the number of components necessary to reach these levels of performance.
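
To see why MTBF collapses at this scale, here is a rough estimate assuming independent, random component failures; the component count and per-component MTBF are illustrative assumptions, not figures from Arm or any vendor:

```python
# Rough system-level MTBF estimate, assuming independent, random component
# failures: system MTBF is roughly a single component's MTBF divided by the
# number of components. All numbers below are illustrative assumptions.
HOURS_PER_YEAR = 8766

component_mtbf_years = 50     # assumed MTBF of a single node
component_count = 10_000      # assumed number of nodes in the system

system_mtbf_hours = (component_mtbf_years * HOURS_PER_YEAR) / component_count
print(f"System MTBF: ~{system_mtbf_hours:.0f} hours (~{system_mtbf_hours / 24:.1f} days)")
```

Even with very reliable individual nodes, the sheer component count drags the whole-system MTBF down to the order of days, which is the concern Gorda describes.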

When it comes to building a machine capable of breaching the exascale barrier, there are a number of other factors at play, beyond technological feasibility. An exascale computer can only come into being once an equilibrium has been reached at the intersection of technology, economics and politics.

One could in theory build an exascale system today by packing in enough CPUs, GPUs and DPUs. But what about economic viability? said Gilad Shainer of NVIDIA Mellanox, the firm behind the Infiniband technology (the fabric that links the various hardware components) found in seven of the ten fastest supercomputers.

Improvements in computing technologies, silicon processing, more efficient use of power and so on all help to increase efficiency and make exascale computing an economic objective as opposed to a sort of sporting achievement.

According to Paul Calleja, who heads up computing research at the University of Cambridge and is working with Dell on the Open Exascale Lab, Fugaku is an excellent example of what is theoretically possible today, but is also impractical by virtually any other metric.

If you look back at Japanese supercomputers, historically there's only ever been one of them made. They have beautifully exquisite architectures, but they're so stupidly expensive and proprietary that no one else could afford one, he told TechRadar Pro.

[Japanese organizations] like these really large technology demonstrators, which are very useful in industry because it shows the direction of travel and pushes advancements, but those kinds of advancements are very expensive and not sustainable, scalable or replicable.

So, in this sense, there are two separate exascale landmarks; the theoretical barrier, which will likely be met first by a machine of Fugaku's ilk (a technological demonstrator), and the practical barrier, which will see exascale computing deployed en masse.

Geopolitical factors will also play a role in how quickly the exascale barrier is breached. Researchers and engineers might focus exclusively on the technological feat, but the institutions and governments funding HPC research are likely motivated by different considerations.

Exascale computing is not just about reaching theoretical targets, it is about creating the ability to tackle problems that have been previously intractable, said Andy Grant, Vice President HPC & Big Data at IT services firm Atos, influential in the fields of HPC and quantum computing.

Those that are developing exascale technologies are not doing it merely to have the fastest supercomputer in the world, but to maintain international competitiveness, security and defence.

In Japan, their new machine is roughly 2.8x more powerful than the now-second place system. In broad terms, that will enable Japanese researchers to address problems that are 2.8x more complex. In the context of international competitiveness, that creates a significant advantage.

In years gone by, rival nations fought it out in the trenches or competed to see who could place the first human on the moon. But computing may well become the frontier at which the next arms race takes place; supremacy in the field of HPC might prove just as politically important as military strength.

Once exascale computers become an established resource - available for businesses, scientists and academics to draw upon - a wealth of possibilities will be unlocked across a wide variety of sectors.

HPC could prove revelatory in the fields of clinical medicine and genomics, for example, which require vast amounts of compute power to conduct molecular modelling, simulate interactions between compounds and sequence genomes.

In fact, IBM Summit and a host of other modern supercomputers are being used to identify chemical compounds that could contribute to the fight against coronavirus. The Covid-19 High Performance Computing Consortium assembled 16 supercomputers, accounting for an aggregate of 330 petaFLOPS - but imagine how much more quickly research could be conducted using a fleet of machines capable of reaching 1,000 petaFLOPS on their own.

Artificial intelligence, meanwhile, is another cross-disciplinary domain that will be transformed with the arrival of exascale computing. The ability to analyze ever-larger datasets will improve the ability of AI models to make accurate forecasts (contingent on the quality of data fed into the system) that could be applied to virtually any industry, from cybersecurity to e-commerce, manufacturing, logistics, banking, education and many more.

As explained by Rashid Mansoor, CTO at UK supercomputing startup Hadean, the value of supercomputing lies in the ability to make an accurate projection (of any variety).

The primary purpose of a supercomputer is to compute some real-world phenomenon to provide a prediction. The prediction could be the way proteins interact, the way a disease spreads through the population, how air moves over an aerofoil or electromagnetic fields interact with a spacecraft during re-entry, he told TechRadar Pro.

Raw performance such as the HPL benchmark simply indicates that we can model bigger and more complex systems to a greater degree of accuracy. One thing that the history of computing has shown us is that the demand for computing power is insatiable.

Other commonly cited areas that will benefit significantly from the arrival of exascale include brain mapping, weather and climate forecasting, product design and astronomy, but it's also likely that brand new use cases will emerge as well.

The desired workloads and the technology to perform them form a virtuous circle. The faster and more performant the computers, the more complex problems we can solve and the faster the discovery of new problems, explained Shainer.

What we can be sure of is that we will see the continuous needs or ever growing demands for more performance capabilities in order to solve the unsolvable. Once this is solved, we will find the new unsolvable.

By all accounts, the exascale barrier will likely fall within the next two years, but the HPC industry will then turn its attention to the next objective, because the work is never done.

Some might point to quantum computers, which approach problem solving in an entirely different way to classical machines (exploiting symmetries to speed up processing), allowing for far greater scale. However, there are also problems to which quantum computing cannot be applied.

Mid-term (10 year) prospects for quantum computing are starting to shape up, as are other technologies. These will be more specialized, where a quantum computer will very likely show up as an application accelerator for problems that relate to logistics first. They won't completely replace the need for current architectures for IT/data processing, explained Gorda.

As Mansoor puts it, on certain problems even a small quantum computer can be exponentially faster than all of the classical computing power on earth combined. Yet on other problems, a quantum computer could be slower than a pocket calculator.

The next logical landmark for traditional computing, then, would be one zettaFLOPS, equal to 1,000 exaFLOPS or 1,000,000 petaFLOPS.

Chinese researchers predicted in 2018 that the first zettascale system will come online in 2035, paving the way for new computing paradigms. The paper itself reads like science fiction, at least for the layman:

To realize these metrics, micro-architectures will evolve to consist of more diverse and heterogeneous components. Many forms of specialized accelerators are likely to co-exist to boost HPC in a joint effort. Enabled by new interconnect materials such as photonic crystal, fully optical interconnecting systems may come into use.

Assuming one exaFLOPS is reached by 2022, 14 years will have elapsed between the creation of the first petascale and first exascale systems. The first terascale machine, meanwhile, was constructed in 1996, 12 years before the petascale barrier was breached.

If this pattern were to continue, the Chinese researchers' estimate would look relatively sensible, but there are firm question marks over the validity of zettascale projections.
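
The naive extrapolation looks like this (a toy projection based only on the milestone years quoted above, with all the caveats that follow):

```python
# Naive extrapolation of the tera -> peta -> exa -> zetta timeline.
# The milestone years follow the article; the projection itself is guesswork.
terascale, petascale, exascale = 1996, 2008, 2022   # exascale year assumed

average_gap = ((petascale - terascale) + (exascale - petascale)) / 2   # 13 years
print(f"Projected zettascale year: ~{exascale + average_gap:.0f}")     # ~2035
```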

While experts are confident in their predicted exascale timelines, none would venture a guess at when zettascale might arrive without prefacing their estimate with a long list of caveats.

Is that an interesting subject? Because to be honest with you, it's so not obtainable. To imagine how we could go 1000x beyond [one exaFLOPS] is not a conversation anyone could have, unless they're just making it up, said Calleja, asked about the concept of zettascale.

Others were more willing to theorize, but equally reticent to guess at a specific timeline. According to Grant, the way zettascale machines process information will be unlike any supercomputer in existence today.

[Zettascale systems] will be data-centric, meaning components will move to the data rather than the other way around, as data volumes are likely to be so large that moving data will be too expensive. Regardless, predicting what they might look like is all guesswork for now, he said.

It is also possible that the decentralized model might be the fastest route to achieving zettascale, with millions of less powerful devices working in unison to form a collective supercomputer more powerful than any single machine (as put into practice by the SETI Institute).

As noted by Saurabh Vij, CEO of distributed supercomputing firm Q Blocks, decentralized systems address a number of problems facing the HPC industry today, namely surrounding building and maintenance costs. They are also accessible to a much wider range of users and therefore democratize access to supercomputing resources in a way that is not otherwise possible.

There are benefits to a centralized architecture, but the cost and maintenance barrier overshadows them. [Centralized systems] also alienate a large base of customer groups that could benefit, he said.

We think a better way is to connect distributed nodes together in a reliable and secure manner. It wouldn't be too aggressive to say that, 5 years from now, your smartphone could be part of a giant distributed supercomputer, making money for you while you sleep by solving computational problems for industry, he added.

However, incentivizing network nodes to remain active for a long period is challenging and a high rate of turnover can lead to reliability issues. Network latency and capacity problems would also need to be addressed before distributed supercomputing can rise to prominence.

Ultimately, the difficulty in making firm predictions about zettascale lies in the massive chasm that separates present day workloads and HPC architectures from those that might exist in the future. From a contemporary perspective, its fruitless to imagine what might be made possible by a computer so powerful.

We might imagine zettascale machines will be used to process workloads similar to those tackled by modern supercomputers, only more quickly. But it's possible - even likely - the arrival of zettascale computing will open doors that do not and cannot exist today, so extraordinary is the leap.

In a future in which computers are 2,000+ times as fast as the most powerful machine today, the philosophical and ethical debates surrounding the intelligence of man versus machine are bound to play out in greater detail - and with greater consequence.

It is impossible to directly compare the workings of a human brain with those of a computer - i.e. to assign a FLOPS value to the human mind. However, it is not unreasonable to ask how many FLOPS must be achieved before a machine reaches a level of performance loosely comparable to the brain.

Back in 2013, scientists used the K supercomputer to conduct a neuronal network simulation using open source simulation software NEST. The team simulated a network made up of 1.73 billion nerve cells connected by 10.4 trillion synapses.

While ginormous, the simulation represented only 1% of the human brain's neuronal network and took 40 minutes to replicate one second's worth of neuronal activity.

However, the K computer reached a maximum computational power of only 10 petaFLOPS. A basic extrapolation (ignoring inevitable complexities), then, would suggest Fugaku could simulate circa 40% of the human brain, while a zettascale computer would be capable of performing a full simulation many times over.
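The arithmetic behind that extrapolation is simple enough to write out. The figures for K and for the 1% simulation come from the paragraphs above; Fugaku's roughly 442 petaFLOPS is an assumed benchmark figure, and the linear scaling deliberately ignores memory, interconnect and algorithmic limits.

    # Back-of-the-envelope scaling of the NEST result (illustrative only).
    K_FLOPS      = 10e15      # K computer, ~10 petaFLOPS (as cited above)
    FUGAKU_FLOPS = 442e15     # Fugaku, ~442 petaFLOPS (assumed benchmark figure)
    ZETTA_FLOPS  = 1e21       # one zettaFLOPS

    BRAIN_FRACTION_ON_K = 0.01  # K simulated ~1% of the brain's network

    def brain_fraction(flops):
        """Scale the 1%-on-K result linearly with raw compute."""
        return BRAIN_FRACTION_ON_K * flops / K_FLOPS

    print(f"Fugaku:     ~{brain_fraction(FUGAKU_FLOPS):.0%} of the brain")     # ~44%
    print(f"Zettascale: ~{brain_fraction(ZETTA_FLOPS):,.0f}x the full brain")  # ~1,000x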

Digital neuromorphic hardware (supercomputers created specifically to simulate the human brain) like SpiNNaker 1 and 2 will also continue to develop in the post-exascale future. Instead of sending information from point A to B, these machines will be designed to replicate the parallel communication architecture of the brain, sending information simultaneously to many different locations.

Modern iterations are already used to help neuroscientists better understand the mysteries of the brain, and future versions, aided by advances in artificial intelligence, will inevitably be used to construct a faithful and fully functional replica.

The ethical debates that will arise with the arrival of such a machine - surrounding the perception of consciousness, the definition of thought and what an artificial uber-brain could or should be used for - are manifold and could take generations to unpick.

The inability to foresee what a zettascale computer might be capable of is also an inability to plan for the moral quandaries that might come hand-in-hand.

Whether a future supercomputer might be powerful enough to simulate human-like thought is not in question, but whether researchers should aspire to bringing an artificial brain into existence is a subject worthy of discussion.

View original post here:
I confess, I'm scared of the next generation of supercomputers - TechRadar


Q-NEXT collaboration awarded National Quantum Initiative funding – University of Wisconsin-Madison

The University of Wisconsin–Madison solidified its standing as a leader in the field of quantum information science when the U.S. Department of Energy (DOE) and the White House announced the Q-NEXT collaboration as a funded Quantum Information Science Research Center through the National Quantum Initiative Act. The five-year, $115 million collaboration was one of five Centers announced today.

Q-NEXT, a next-generation quantum science and engineering collaboration led by the DOE's Argonne National Laboratory, brings together nearly 100 world-class researchers from three national laboratories, 10 universities including UW–Madison, and 10 leading U.S. technology companies to develop the science and technology to control and distribute quantum information.

"The main goals for Q-NEXT are first to deliver quantum interconnects to find ways to quantum mechanically connect distant objects," says Mark Eriksson, the John Bardeen Professor of Physics at UW–Madison and a Q-NEXT thrust lead. "And next, to establish a national resource to both develop and provide pristine materials for quantum science and technology."

Q-NEXT will focus on three core quantum technologies:

Eriksson is leading the Materials and Integration thrust, one of six Q-NEXT focus areas that features researchers from across the collaboration. This thrust aims to: develop high-coherence materials, including for silicon and superconducting qubits, which is an essential component of preserving entanglement; develop a silicon-based optical quantum memory, which is important in developing a quantum repeater; and improve color-center quantum bits, which are used in both communication and sensing.

"One of the key goals in Materials and Integration is to not just improve the materials but also to improve how you integrate those materials together so that in the end, quantum devices maintain coherence and preserve entanglement," Eriksson says. "The integration part of the name is really important. You may have a material that on its own is really good at preserving coherence, yet you only make something useful when you integrate materials together."

Six other UW–Madison and Wisconsin Quantum Institute faculty members are Q-NEXT investigators: physics professors Victor Brar, Shimon Kolkowitz, Robert McDermott, and Mark Saffman, electrical and computer engineering professor Mikhail Kats, and chemistry professor Randall Goldsmith. UW–Madison researchers are involved in five of the six research thrusts.

"I'm excited about Q-NEXT because of the connections and collaborations it provides to national labs, other universities, and industry partners," Eriksson says. "When you're talking about research, it's those connections that often lead to the breakthroughs."

The potential impacts of Q-NEXT research include the creation of a first-ever National Quantum Devices Database that will promote the development and fabrication of next-generation quantum devices, as well as the development of the components and systems that enable quantum communications across distances ranging from microns to kilometers.

"This funding helps ensure that the Q-NEXT collaboration will lead the way in future developments in quantum science and engineering," says Steve Ackerman, UW–Madison vice chancellor for research and graduate education. "Q-NEXT is the epitome of the Wisconsin Idea as we work together to transfer new quantum technologies to the marketplace and support U.S. economic competitiveness in this growing field."

Read more from the original source:
Q-NEXT collaboration awarded National Quantum Initiative funding - University of Wisconsin-Madison


UArizona Scientists to Build What Einstein Wrote off as Science Fiction – UANews

By Daniel Stolte, University Communications


Arizona Gov. Doug Ducey today joined University of Arizona President Robert C. Robbins and leading scientists from the new University of Arizona-based Center for Quantum Networks to talk about how the center will help develop the "internet of the future."

The National Science Foundation has awarded UArizona a five-year, $26 million grant with an additional $24 million, five-year option to lead the Center for Quantum Networks, or CQN, which is a National Science Foundation Engineering Research Center. The award has placed Arizona at the forefront of quantum networking technologies, which are expected to transform areas such as medicine, finance, data security, artificial intelligence, autonomous systems and smart devices, which together are often referred to as "the internet of things."

"Arizona continues to lead the nation in innovation. Establishing the Center for Quantum Networks will position the state as a global leader in advancing this technology and developing the workforce of the future," Gov. Doug Ducey said. "We're proud of the work the University of Arizona has done to secure this grant and look forward to the scientific achievements that will result from it."

The CQN will take center stage in a burgeoning field. Companies like IBM, Microsoft and Google are racing to build reliable quantum computers, and China has invested billions of dollars in quantum technology research. The U.S. has begun a serious push to exceed China's investment and to "win" the global race to harness quantum technologies.

"Less than a year ago, a quantum computer for the first time performed certain calculations that are no longer feasible for even the largest conventional supercomputers," said Saikat Guha, CQN director and principal investigator and associate professor in the UArizona James C. Wyant College of Optical Sciences, who joined Ducey and Robbins for the virtual event. "The quantum internet will allow for applications that will never be possible on the internet as we know it."

Unlike the existing internet, in which computers around the globe exchange data encoded in the familiar 0s and 1s, the quantum internet will rely on a global network of quantum processors speaking to one another via "quantum bits," or qubits.

Qubits offer dramatic increases in processing capacity over conventional bits because they can exist in not just one state, but two at the same time. Known as superposition, this difficult-to-grasp principle was first popularized by "Schrödinger's Cat," the famous thought experiment in which an imaginative cat inside a box is neither dead nor alive until an equally imaginative observer opens the box and checks.

The key new resource that a quantum network provides, by communicating qubits from one point to another, is the ability to create "entanglement" among distant users of the network. Entanglement, another hallmark of quantum mechanics so strange that even Einstein was reluctant to accept it at first, allows a pair of particles, including qubits, to stay strongly correlated despite being separated by large physical distances. Entanglement enables communication among parties that is impossible to hack.
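For readers who want to see superposition and entanglement in miniature, the following toy state-vector calculation (plain numpy, written for this article and unrelated to any CQN software) prepares a Bell pair: a Hadamard gate puts one qubit into superposition, a CNOT gate entangles it with a second, and repeated measurements then only ever return "00" or "11" - the perfectly correlated outcomes that make entanglement useful for networking.

    import numpy as np

    ket00 = np.array([1, 0, 0, 0], dtype=complex)       # two qubits in state |00>

    H = (1 / np.sqrt(2)) * np.array([[1, 1], [1, -1]])  # Hadamard gate
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],                      # control = first qubit
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    # Hadamard on qubit 0 creates superposition; CNOT then entangles qubit 1 with it.
    state = CNOT @ np.kron(H, I) @ ket00                # Bell state (|00> + |11>) / sqrt(2)

    probs = np.abs(state) ** 2                          # measurement probabilities
    samples = np.random.choice(["00", "01", "10", "11"], size=10, p=probs)
    print(samples)   # only "00" and "11" appear: the two qubits are perfectly correlated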

One of the center's goals is to develop technologies that will put the entanglement principle to use in real-world applications - for example, to stitch together far-apart sensors, such as the radio telescopes that glimpsed the first image of a black hole in space, into one giant instrument that is far more capable than the sum of the individual sensors. Similar far-reaching implications are expected in the autonomous vehicles industry and in medicine.

"Who knows, 50 years from now, your internet service provider may send a technician to your house to install a CQN-patented quantum-enabled router that does everything your current router does, but more," Guha said. "It lets you hook up your quantum gadgets to what we are beginning to build today the new internet of the future."

A first-of-its-kind campuswide quantum networking testbed will be built at the University of Arizona, connecting laboratories across the UArizona campus, initially spanning the College of Optical Sciences, Department of Electrical and Computer Engineering, Department of Materials Science and Engineering and the BIO5 Institute.

"The next few years will be very exciting, as we are at a time when the community puts emerging quantum computers, processors, sensors and other gadgets to real use," Guha said. "We are just beginning to connect small quantum computers, sensors and other gadgets into quantum networks that transmit quantum bits."

According to Guha, quantum-enabled sensors will be more sensitive than classical ones, and will dramatically improve technologies such as microscopes used in biomedical research to look for cancer cells, sensors on low-Earth-orbit satellites, and magnetic field sensors used for positioning and navigation.

Guha says today's internet is a playground for hackers, due to insecure communication links to inadequately guarded data in the cloud. Quantum systems will provide a level of privacy, security and computational clout that is impossible to achieve with today's internet.

"The Center for Quantum Networking stands as an example for the core priorities of our university-wide strategic plan," said UArizona President Robert C. Robbins. "As a leading international research university bringing the Fourth Industrial Revolution to life, we are deeply committed to (our strategic plan to) advance amazing new information technologies like quantum networking to benefit humankind. And we are equally committed to examining the complex, social, legal, economic and policy questions raised by these new technologies.

"In addition to bringing researchers together from intellectually and culturally diverse disciplines, the CQN will provide future quantum engineers and social scientists with incredible learning opportunities and the chance to work side by side with the world's leading experts."

The center will bring together scientists, engineers and social scientists working on quantum information science and engineering and its societal impacts. UArizona has teamed up with core partners Harvard University, the Massachusetts Institute of Technology and Yale University to work on the core hardware technologies for quantum networks and create an entrepreneurial ecosystem for quantum network technology transfer.

In addition to creating a diverse quantum engineering workforce, the center will develop a roadmap with industry partners to help prioritize CQN's research investments in response to new application concepts generated by the center.

Jane Bambauer, CQN co-deputy director and professor in the James E. Rogers College of Law, who also spoke about the center, said that "the classical internet changed our relationship to computers and each other."

"While we build the technical foundations for the quantum internet, we are also building the foundation for a socially responsible rollout of the new technology," Bambauer said. "We are embedding policy and social science expertise into our center's core research activities. We're also creating effective and inclusive education programs to make sure that the opportunities for jobs and for invention are shared broadly."

This is the third National Science Foundation Engineering Research Center led by the University of Arizona. The other two are the ERC for Environmentally Benign Semiconductor Manufacturing, led by the College of Engineering, and the Center for Integrated Access Networks, led by the Wyant College of Optical Sciences. CQN will be bolstered by the Wyant College's recent endowments, including the largest faculty endowment gift in the history of the University of Arizona, and by the planned construction of the new Grand Challenges Research Building, supported by the state of Arizona.

Additional speakers at today's event included:

Original post:
UArizona Scientists to Build What Einstein Wrote off as Science Fiction - UANews
