
Alibaba Invests in Cloud Computing Business With New Campus – ETF Trends

Chinese tech giant Alibaba Group Holding Ltd. has opened a new campus for its cloud computing unit, Alibaba Cloud, in its home city of Hangzhou. Per the South China Morning Post, which Alibaba owns, the 10-building, 2.1 million-square-foot campus is roughly the size of the 2 million-square-foot campus for Google's Silicon Valley headquarters, aka the Googleplex, in Mountain View, California.

Alibaba Cloud also highlighted the campus's eco-friendly designs in a video, including a photovoltaic power generation system, flowerpots made from recycled plastic, and high-efficiency, low-energy devices in the on-site coffee shop, according to SCMP.

The new campus signals the firm's commitment to investing in its growing cloud computing business. While Alibaba's net income dropped 50% year-over-year in the second quarter to 22.74 billion yuan ($3.4 billion), Alibaba Cloud experienced the fastest growth among all of Alibaba's business segments in Q2, making up 9% of total revenue.

The new facilities also come at a time when China's economy has been facing a slowdown. Even so, Alibaba's cloud computing unit has been eyeing expansion opportunities overseas. For example, Alibaba Cloud announced last month a $1 billion commitment to upgrading its global partner ecosystem.

Alibaba is currently the third-largest holding in EMQQ Global's flagship exchange-traded fund, the Emerging Markets Internet & Ecommerce ETF (NYSEArca: EMQQ), with a weighting of 7.01% as of October 14. EMQQ seeks to offer investors exposure to the growth in internet and e-commerce activities in the developing world as middle classes expand and affordable smartphones provide unprecedentedly large swaths of the population with access to the internet for the first time, according to the issuer.

EMQQ tracks an index of leading internet and e-commerce companies that includes online retail, search engines, social networking, online video, e-payments, online gaming, and online travel.

For more news, information, and strategy, visit our Emerging Markets Channel.


Cloud Computing: A catalyst for the IoT Industry – SiliconIndia

Cloud computing is a great enabler for today's businesses for a variety of reasons. First, it helps companies, particularly small and medium enterprises, jump-start their operations sooner, as there is very little lead time needed to stand up a full-fledged in-house IT infrastructure. Second, it eases the financial requirements by avoiding heavy capex and turning IT costs into an opex model. Even more advantageous, the opex costs can be scaled up and down dynamically based on demand, thus optimizing IT spend.

I think cloud computing became a catalyst for the IoT industry; the proliferation seen today probably would not have happened in the absence of cloud integration. Typically, IoT devices like sensors generate huge amounts of data that require both storage and processing, making cloud platforms the perfect choice for building IoT-based solutions. In an IoT implementation, apart from data assimilation, there are some fundamental aspects like security and device management that need to be considered, and cloud platforms take over some of these implementation aspects, enabling the solution provider to focus on the core problem.
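To make the data flow concrete, here is a minimal sketch (in Python, with a hypothetical ingestion URL and payload schema, not any particular vendor's API) of an IoT device pushing sensor readings to a cloud endpoint for storage and processing:

```python
# Minimal sketch: an IoT device pushing sensor readings to a cloud ingestion endpoint.
# The endpoint URL and payload schema are hypothetical, for illustration only.
import json
import random
import time
import urllib.request

INGEST_URL = "https://example-cloud-endpoint.invalid/api/v1/telemetry"  # hypothetical

def read_sensor() -> dict:
    """Simulate a temperature/humidity sensor reading."""
    return {
        "device_id": "sensor-001",
        "timestamp": time.time(),
        "temperature_c": round(random.uniform(20.0, 35.0), 2),
        "humidity_pct": round(random.uniform(30.0, 90.0), 2),
    }

def push_reading(reading: dict) -> int:
    """POST one reading to the cloud; returns the HTTP status code."""
    req = urllib.request.Request(
        INGEST_URL,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

if __name__ == "__main__":
    print(push_reading(read_sensor()))
```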

An interesting case study of how IoT and cloud technologies can help create innovative solutions was presented at a Microsoft conference a few years back. It is a solution developed to monitor pollution levels in the Ganges, a project sponsored by the Central Pollution Control Board. For more information, readers can go to this link: https://azure.microsoft.com/en-us/blog/cleaning-up-the-ganges-river-with-help-from-iot/

Digital technology in financial services

When we talk about disruptive digital technologies in the financial services industry, perhaps Blockchain is the one that stands out immediately. The concept of DLT (Distributed Ledger Technology) has been around for some time, and there is a lot of interest in leveraging this technology, primarily for transparency and efficiency reasons. After an article by the Reserve Bank of India in 2020, many Indian banks responded to this initiative by starting to look at opportunities that involve DLT. For example, State Bank of India tied up with JP Morgan to use its Blockchain technology.

Adoption of Blockchain could simplify inter-bank payment settlement and could perhaps be extended in the future to cross-border payment settlements across different DLT platforms. It could also be used for settlement of securitized assets by putting them on a common ledger. Another application is using DLT for KYC, whereby multiple agencies (like banks) can access customer data from a decentralized and secure database. In fact, EQ uses Blockchain in its product offering to privately funded companies and PEs for cap table management.

The next one is probably Artificial Intelligence (AI) and Machine Learning (ML), which is predominantly being applied in the financial services industry to manage internal and external risks. AI-based algorithms now underpin risk-based pricing in the insurance sector and help reduce NPAs in the banking sector. The technology helps banks predict defaults and take proactive measures to mitigate that risk.

In the Indian context, the Unified Payments Interface (UPI) and the Aadhaar Enabled Payment System (AePS) are classic examples of disruptive products in the financial services industry.

Effective Network Security acts as a gatekeeper

In today's connected world, where much of commerce happens online, it is imperative that businesses focus on security to safeguard themselves from threats in cyberspace. The recent approach to network security is the Zero Trust model, which basically means never trust any user or device unless verified. In this model, mutual authentication happens between the two entities in multiple ways, e.g., using user credentials followed by a second factor like an OTP, and sometimes application authentication happens through a digital certificate. The process also uses analytics and log analysis to detect abnormalities in user behaviour and enforce additional authentication measures while sending alerts at the same time. This is something many of us might have come across when we try to connect to an application from a new device that the application is not aware of: the security mechanism might enforce additional authentication whilst sending an alert to us. Nowadays, businesses also use innovative methods of authentication like biometrics, voice recognition, etc., and some of these are powered by AI/ML.
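As an illustration of the OTP second factor mentioned above, here is a minimal sketch of time-based one-time password (TOTP, RFC 6238) verification using only the Python standard library; the shared secret below is a placeholder:

```python
# Minimal sketch of the "second factor" step in a Zero Trust login flow:
# verifying a time-based one-time password (TOTP, RFC 6238) with the standard library.
import base64
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute the current TOTP code for a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify_otp(secret_b32: str, submitted_code: str) -> bool:
    """Accept the login's second factor only if the submitted code matches."""
    return hmac.compare_digest(totp(secret_b32), submitted_code)

if __name__ == "__main__":
    SECRET = "JBSWY3DPEHPK3PXP"  # placeholder secret, not a real credential
    print(verify_otp(SECRET, totp(SECRET)))  # True: a correct code is accepted
```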

Fintech players leverage Artificial Intelligence to bridge the gap in MSME lending

I think MSME lending (and maybe retail lending too) is one of the segments most significantly disrupted by technology. In a way, it has opened unconventional options for MSMEs to attract capital, both for capex and working capital requirements. There are products ranging from P2P lending to invoice discounting offered by fintech companies, which are opening up a new marketplace. There are fintech players interested in lending in this space, and they use AI/ML models to predict the probability of default, assess credit risk, and hedge against it appropriately.
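The following is an illustrative sketch, not any specific lender's model, of how such a probability-of-default estimate might be produced with a simple logistic regression on synthetic MSME features:

```python
# Illustrative sketch: scoring probability of default with logistic regression
# on synthetic MSME features. Feature names and data are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic features: monthly revenue (lakhs), invoice delay (days), existing debt ratio.
X = rng.normal(loc=[10.0, 15.0, 0.4], scale=[4.0, 10.0, 0.2], size=(500, 3))
# Synthetic label: defaults become likelier with long invoice delays and high debt.
y = ((0.05 * X[:, 1] + 2.0 * X[:, 2] - 0.1 * X[:, 0] + rng.normal(0, 0.5, 500)) > 1.0).astype(int)

model = LogisticRegression().fit(X, y)

new_borrower = np.array([[8.0, 45.0, 0.7]])          # hypothetical applicant
p_default = model.predict_proba(new_borrower)[0, 1]  # probability of the "default" class
print(f"Estimated probability of default: {p_default:.2f}")
```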


Cleveland Clinic and IBM Begin Installation of IBM Quantum System One – IBM Newsroom

Key milestone in 10-Year Partnership Aimed at Accelerating Discovery in Healthcare and Life Sciences

Oct 18, 2022

Cleveland, OH and Armonk, N.Y., October 18, 2022: Cleveland Clinic and IBM have begun deployment of the first private-sector, onsite, IBM-managed quantum computer in the United States. The IBM Quantum System is to be located on Cleveland Clinic's main campus in Cleveland.

The first quantum computer in healthcare, anticipated to be completed in early 2023, is a key part of the two organizations' 10-year partnership aimed at fundamentally advancing the pace of biomedical research through high-performance computing. Announced in 2021, the Cleveland Clinic-IBM Discovery Accelerator is a joint center that leverages Cleveland Clinic's medical expertise with the technology expertise of IBM, including its leadership in quantum computing.

"The current pace of scientific discovery is unacceptably slow, while our research needs are growing exponentially," said Lara Jehi, M.D., Cleveland Clinic's Chief Research Information Officer. "We cannot afford to continue to spend a decade or more going from a research idea in a lab to therapies on the market. Quantum offers a future to transform this pace, particularly in drug discovery and machine learning."

"A step change in the way we solve scientific problems is on the horizon," said Dr. Ruoyi Zhou, Director, IBM Research - Cleveland Clinic Partnership. "At IBM, we're more motivated than ever to create with Cleveland Clinic and others lasting communities of discovery and harness the power of quantum computing, AI and hybrid cloud to usher in a new era of accelerated discovery in healthcare and life sciences."

The Discovery Accelerator at Cleveland Clinic builds upon a variety of IBM's latest advancements in high-performance computing, including:

The Discovery Accelerator also serves as the technology foundation for Cleveland Clinic's Global Center for Pathogen Research & Human Health, part of the Cleveland Innovation District. The center, supported by a $500 million investment from the State of Ohio, JobsOhio and Cleveland Clinic, brings together a team focused on studying, preparing and protecting against emerging pathogens and virus-related diseases. Through the Discovery Accelerator, researchers are leveraging advanced computational technology to expedite critical research into treatments and vaccines.

Together, the teams have already begun several collaborative projects that benefit from the new computational power. The Discovery Accelerator projects include a research study developing a quantum computing method to screen and optimize drugs targeted to specific proteins; improving a prediction model for cardiovascular risk following non-cardiac surgery; and using artificial intelligence to search genome sequencing findings and large drug-target databases to find effective, existing drugs that could help patients with Alzheimer's and other diseases.

A significant part of the collaboration is a focus on educating the workforce of the future and creating jobs to grow the economy. An innovative educational curriculum has been designed for participants from high school to professional level, offering training and certification programs in data science, machine learning and quantum computing to build the skilled workforce needed for cutting-edge computational research of the future.

About Cleveland Clinic

Cleveland Clinic is a nonprofit multispecialty academic medical center that integrates clinical and hospital care with research and education. Located in Cleveland, Ohio, it was founded in 1921 by four renowned physicians with a vision of providing outstanding patient care based upon the principles of cooperation, compassion and innovation. Cleveland Clinic has pioneered many medical breakthroughs, including coronary artery bypass surgery and the first face transplant in the United States. U.S. News & World Report consistently names Cleveland Clinic as one of the nation's best hospitals in its annual America's Best Hospitals survey. Among Cleveland Clinic's 72,500 employees worldwide are more than 5,050 salaried physicians and researchers, and 17,800 registered nurses and advanced practice providers, representing 140 medical specialties and subspecialties. Cleveland Clinic is a 6,500-bed health system that includes a 173-acre main campus near downtown Cleveland, 22 hospitals, and more than 220 outpatient facilities, including locations in northeast Ohio; southeast Florida; Las Vegas, Nevada; Toronto, Canada; Abu Dhabi, UAE; and London, England. In 2021, there were 10.2 million total outpatient visits, 304,000 hospital admissions and observations, and 259,000 surgical cases throughout Cleveland Clinic's health system. Patients came for treatment from every state and 185 countries. Visit us at clevelandclinic.org. Follow us at twitter.com/ClevelandClinic. News and resources available at newsroom.clevelandclinic.org.

Editor's Note: Cleveland Clinic News Service is available to provide broadcast-quality interviews and B-roll upon request.

About IBM

IBM is a leading global hybrid cloud and AI, and business services provider, helping clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain the competitive edge in their industries. Nearly 4,000 government and corporate entities in critical infrastructure areas such as financial services, telecommunications and healthcare rely on IBM's hybrid cloud platform and Red Hat OpenShift to affect their digital transformations quickly, efficiently and securely. IBM's breakthrough innovations in AI, quantum computing, industry-specific cloud solutions and business services deliver open and flexible options to our clients. All of this is backed by IBM's legendary commitment to trust, transparency, responsibility, inclusivity and service. For more information, visit https://research.ibm.com.

Contacts:

Alicia Reale, Cleveland Clinic, 216-408-7444, Realeca@ccf.org

Sarah Benchaita, IBM Research, 281-455-6432, sarah.benchaita@ibm.com


Revitalising data and infrastructure management through cloud – ETCIO South East Asia

The Cloud has been a significant contributor to the digital optimisation and transformation of businesses and institutions globally since the 2010s. It seems almost an eternity ago when the IT department was essentially a support function, with KRAs around the design and delivery of Information Technology architecture encompassing infrastructure, data centres and constituent servers, personal computers, software, networking and security systems, along with the associated vendor evaluation, outsourcing, contracting and commissioning, and finally aligning with business systems and goals, as this pre-millennium ResearchGate paper indicates.

The decade and a half since the turn of the millennium saw the rise of many trends besides the cloud: the shift from integrated to business-specific applications and the resulting data management and insights; globalisation; adoption of Infrastructure as a Service (IaaS) and Platform as a Service (PaaS); the explosion of telecom, mobility and Mobile Backend as a Service (MBaaS); other technologies such as social media, e-commerce, extended reality, digital twins, AI/ML, RPA, the Internet of Things, Blockchain and chatbots; and lastly, the growing skill gaps and demands for talent.

The cloud has now taken over a major chunk of responsibilities pertaining to infrastructure, data centres, hosting, SaaS and architectural applications, platforms, networking and security functions, thus freeing up IT and business teams to leverage technology for more strategic tasks related to operations, customers, R&D, the supply chain and others. The cloud has hence enabled companies and institutions to leverage the amalgamation of technology, people and processes across their extended enterprises and run ongoing digital programmes for driving revenue, customer satisfaction, and profitability. The cloud can potentially add USD 1 trillion of economic value across the Fortune 500 band of companies by 2030, as this research by McKinsey estimates.

Even before the pandemic, although the initial adoption of the cloud was in SaaS applications, IaaS and PaaS were surely catching up, shifting workloads away from the traditional data centre and on-premise infrastructure. Gartner's research way back in 2015 predicted a 30%-plus increase in IaaS spending, with public cloud IaaS workloads finally surpassing on-premise loads. In the same year, a similar Gartner paper highlighted significant growth in PaaS as well, both for infrastructure and application iPaaS.

The cloud is adding significant value across industry verticals and business functions, right from remote working with online meetings and collaboration tools to automated factory operations, extended reality, digital twins, remote field services and many others. The cloud has also been adopted as the platform for deploying other new technologies such as RPA and Artificial Intelligence/Machine Learning (AI/ML). Depending on industry best practices, business use cases and IT strategies, it became feasible to leverage infrastructure, assets, applications and software in a true hybrid/multi/industry cloud scenario with separate private and public cloud environments covering IaaS, PaaS, SaaS and MBaaS. As platforms matured, organisations were furthermore transitioning from virtual machine and even IaaS-based solutions to PaaS-based ones. Gartner had predicted in this research that by 2021, over 75% of enterprises and mid-sized organisations would adopt a hybrid or multi-cloud strategy.

There was also a clear transition from the traditional lift-and-shift approach to the cloud native approach, which makes full use of cloud elasticity and optimisation levers and, moreover, minimises technical debt and inefficiencies. This approach uses cloud computing to build and run microservices-based scalable applications running in virtualised containers, orchestrated through Container-as-a-Service offerings and managed and deployed using DevOps workflows. Microservices, container management, infrastructure as code, serverless architectures, declarative code and continuous integration and delivery (CI/CD) are the fundamental tenets of this cloud native approach. Organisations are balancing the use of containerisation with leveraging cloud hosting providers' capabilities, especially considering the extent of hybrid cloud, the effort and cost of container infrastructure, and the running of commodity applications.
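As a toy illustration of the microservices tenet described above, the sketch below (assuming Flask; the service name and endpoints are made up) shows the kind of small, stateless HTTP service that is packaged into a container, probed by the orchestrator, and scaled independently:

```python
# Minimal sketch of a single microservice: a small, stateless HTTP service that can be
# containerised and scaled independently. Service name and endpoints are illustrative only.
from flask import Flask, jsonify

app = Flask("inventory-service")

@app.route("/health")
def health():
    # Liveness/readiness probes from the container orchestrator hit this endpoint.
    return jsonify(status="ok"), 200

@app.route("/items/<item_id>")
def get_item(item_id: str):
    # In a real service this would read from the service's own datastore.
    return jsonify(item_id=item_id, stock=42), 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```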

From the architecture standpoint, cloud-based composable architectures such as MACH (Microservices-based, API-first, Cloud-native SaaS and Headless) and packaged business capabilities (PBCs) are increasingly being used in organisations to enhance Digital Experience Platforms, enabling customers, employees and the supply chain with the new-age omnichannel experience. These architectures facilitate faster deployment and time to market through quick testing with sample populations and subsequent full-fledged implementations. Composable architectures help organisations future-proof their IT investments and improve business resilience and recovery with the ability to decouple and recouple technology stacks. At the end of the first year of the pandemic, Gartner here highlighted the importance of composable architecture in its Hype Cycle of 2021, especially for business resilience and recovery during a crisis.

Intelligently deploying serverless computing in the architecture also enhances cloud native strategies immensely, enabling developers to focus on triggers and running function/event-based computing, and resulting in more optimised cloud economics. Also, access to cloud service providers' Function-as-a-Service (FaaS) and Backend-as-a-Service (BaaS) models significantly reduces IT environment transformation costs. This Deloitte research illustrates the advantages that serverless computing can bring to retail operations.
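A minimal sketch of the function/event-based model, written in the shape of an AWS-Lambda-style Python handler (the event fields shown are hypothetical), might look like this:

```python
# Sketch of a function/event-based workload: the platform invokes the handler per event
# and bills compute only for that run. The event fields are hypothetical.
import json

def handler(event, context):
    """Triggered by an event, e.g. an object upload or an HTTP request via a gateway."""
    order_id = event.get("order_id", "unknown")
    # ... business logic would go here (validate, enrich, write to a managed datastore) ...
    return {
        "statusCode": 200,
        "body": json.dumps({"processed_order": order_id}),
    }

if __name__ == "__main__":
    # Local smoke test outside the FaaS platform.
    print(handler({"order_id": "A-1001"}, None))
```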

To enhance their cloud native strategies, further encourage citizen development, reduce over-reliance on IT and bridge the IT-business gap, organisations are also making use of Low Code/No Code (LCNC) tools and assembly, clearly shifting away from traditional application development. Citizen developers are making use of LCNC functionalities such as drag and drop, pre-built user interfaces, APIs and connectors, one-click delivery and others to further augment their containerisation and microservices strategies. This Gartner research predicts that 70% of new applications developed by organisations will use LCNC by 2025, well up from less than 25% in 2020.

Infrastructure and data management in the cloud are being immensely powered up by automation and orchestration, especially in minimising manual effort and errors in processes such as provisioning, configuring, sizing and auto-scaling, asset tagging, clustering and load balancing, performance monitoring, deployment, DevOps and CI/CD testing and performance management. Further efficiencies are brought to fruition through automation in areas such as shutting down unutilised instances, backups, workflow version control, and establishing Infrastructure as Code (IaC). This in turn adds value to a robust cloud native architecture by enhancing containerisation, clustering, network configuration, storage connectivity, load balancing and workload lifecycle management, besides highlighting vulnerabilities and risks. Enterprises pursuing hybrid cloud strategies are hence driving automation in private clouds as well as integrating with public clouds by creating automation assets that perform resource codification across all private and public clouds and offer a single API. This McKinsey research highlights that companies that have adopted end-to-end automation in their cloud platforms and initiatives report a 20-40% increase in the speed of releasing new capabilities to market. A similar report by Deloitte mentions that intelligent automation in the cloud enables scale in just 4-12 months, compared to the earlier 6-24-month period, through streamlined development and deployment processes.
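One of the automation examples above, shutting down unutilised instances, can be sketched with boto3 (the AWS SDK for Python). The "AutoShutdown" tag convention is an assumption for illustration; in practice such a job would run on a schedule from an automation service:

```python
# Sketch: stop running EC2 instances that carry a hypothetical "AutoShutdown" tag.
# Requires AWS credentials at runtime; tag name and region are illustrative only.
import boto3

def stop_tagged_idle_instances(region: str = "us-east-1") -> list[str]:
    ec2 = boto3.client("ec2", region_name=region)
    resp = ec2.describe_instances(
        Filters=[
            {"Name": "instance-state-name", "Values": ["running"]},
            {"Name": "tag:AutoShutdown", "Values": ["true"]},  # hypothetical tag convention
        ]
    )
    instance_ids = [
        inst["InstanceId"]
        for reservation in resp["Reservations"]
        for inst in reservation["Instances"]
    ]
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
    return instance_ids

if __name__ == "__main__":
    print("Stopped:", stop_tagged_idle_instances())
```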

CIOs are also increasingly turning to distributed cloud models to address edge or location-based cloud use cases, especially across banks and financial institutions, healthcare, smart cities, and manufacturing. It is expected that decentralised and distributed cloud computing will move from the initial private cloud substation deployments to eventually Wi-Fi-like distributed cloud substation ecosystems, especially considering the necessary availability, bandwidth and other operational and security aspects.

These rapid developments in cloud ecosystems, especially for hybrid and multi-cloud environments, have necessitated infrastructure and data management that encompasses dashboards for end-to-end visibility of all cloud resources and usage across providers, business functions, and departments. Governance and compliance, monitoring, inventory management, patch and version control, disaster recovery, Hybrid Cloud Management Platforms (HCMP), Cloud Service Brokers (CSB) and other tools aid companies in better infrastructure management in the cloud, while catering to fluctuating demand and the corresponding under- and over-utilisation scenarios, and continuously identifying pockets for optimisation and correction. For companies with customers spread across diverse geographies, it is important to have tools for infrastructure management, global analytics, database engines and application architectures across these global Kubernetes clusters and virtual machines.

The vast increase in attack surfaces and potential breach points has necessitated that CIOs and CISOs incorporate robust security principles and tools within the cloud native ecosystem itself, through cloud security platforms such as Cloud Access Security Broker (CASB), Cloud Security Posture Management (CSPM), Secure Access Service Edge (SASE), DevSecOps and the incorporation of AI and ML in their proactive threat hunting and response systems. This is also critical for adhering to Governance, Risk and Compliance (GRC) and regulatory requirements, in line with Zero Trust architecture and cyber-resilience frameworks and strategy. This McKinsey article highlights the importance of Security as Code (SaC) in cloud native strategies and its reliance on architecture and the right automation capabilities.

This EY article highlights the importance of cybersecurity in cloud native strategies as well as the corresponding considerations in processes, cybersecurity tools, architecture, risk management, skills and competencies, and controls. Data encryption and workload protection, identity and access management, Extended Detection and Response (XDR), Security Information and Event Management (SIEM), and Security Orchestration, Automation and Response (SOAR) tools that incorporate AI/ML capabilities ensure a more proactive, rather than reactive, response. Considering the vast amount of information to ingest, store and analyse, organisations are also considering or deploying cyber data lakes, either as alternatives to or in conjunction with their SIEM ecosystems.

Financial Operations (FinOps) is increasingly popular, helping organisations gain maximum value from the cloud through the cross-functional involvement of business, finance, procurement, supply chain, engineering, DevOps/DevSecOps and cloud operations teams. Augmented FinOps was listed by Gartner in the 2022 Hype Cycle for emerging technologies here. FinOps adds immense value to infrastructure and data management in the cloud through dynamic and continuous sourcing and management of cloud consumption, demand mapping, and crystallising the total cost of ownership and operations, with end-to-end cost visibility and forecasting to make joint decisions and monitor comprehensive KPIs. Besides the cloud infrastructure management strategies listed in this section, FinOps also incorporates vendor management strategies and leverages cloud carbon footprint tools for organisations' sustainability goals.

What about Data Management through the Cloud?

The second half of the 2010s, and especially the COVID-19 period, also resulted in an explosion of IoT, social media, e-commerce and other digital transformation. This has made organisations deal with diverse data sources residing in the cloud, on-premise and on the edge; diversity in data sets across sensor, text, image, audio-visual, voice, e-commerce, social media and other data; and the volume of data that now needs to be ingested, managed and delivered in real-time and batch modes. Even before the pandemic, this explosion of unstructured data necessitated companies to leverage Hadoop and other open-source-based data lakes besides the structured data residing in their data warehouses. According to this Deloitte article, for the CXOs it surveyed, data modernisation is an even more critical driver for migrating to the cloud than cost and performance considerations.

This research by Statista estimated that the total worldwide data volume rose from 9 zettabytes in 2013 to over 27 zettabytes in 2021, and predicts that it will grow to well over 180 zettabytes by 2025. Decentralised and distributed cloud computing, Web 3.0, the Metaverse and the rise of edge computing will further contribute to this data growth.

Many organisations are looking at the cloud as the future of data management, as this article by Gartner states. As the cloud encompasses more and more data sources, it becomes more pivotal for data architects to have a deeper understanding of metadata and schema; the end-to-end data lifecycle pipeline of ingestion, cleaning, storage, analysis, delivery and visualisation; APIs; cloud automation and orchestration; data streaming; AI/ML models; analytics; data storage and visualisation; as well as governance and security.

Data architects are hence leveraging cloud computing in their strategies, including scaling, elasticity and decoupling, ensuring high availability and optimal performance in relation to bursts and shutdowns, while optimising cost at the same time. Data teams are also deploying automated and active cloud data management covering classification, validation and governance, with extensibility and decoupling. There is also careful consideration of security for data at rest and in motion, as well as seamless data integration and sharing.
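A minimal sketch of the automated validation step mentioned above, checking schema, nulls and value ranges on an ingested batch before it is promoted to a curated zone (column names and rules are illustrative only):

```python
# Sketch: validate a batch of ingested records before promoting it to the curated zone.
# Expected columns and rules are hypothetical, for illustration only.
import pandas as pd

EXPECTED_COLUMNS = {"order_id", "amount", "currency", "created_at"}

def validate_batch(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable validation failures (empty list = pass)."""
    problems = []
    missing = EXPECTED_COLUMNS - set(df.columns)
    if missing:
        problems.append(f"missing columns: {sorted(missing)}")
        return problems
    if df["order_id"].isna().any():
        problems.append("null order_id values present")
    if (df["amount"] < 0).any():
        problems.append("negative amounts present")
    if not df["currency"].isin({"USD", "EUR", "INR"}).all():
        problems.append("unexpected currency codes present")
    return problems

if __name__ == "__main__":
    batch = pd.DataFrame({
        "order_id": ["o1", "o2"],
        "amount": [120.5, -3.0],
        "currency": ["USD", "INR"],
        "created_at": ["2022-10-01T10:00:00Z", "2022-10-01T10:05:00Z"],
    })
    print(validate_batch(batch))  # -> ['negative amounts present']
```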

It is also important to choose the right metadata strategy and consider options of tiered apps with push-based services, pull-based ETL, and event-based metadata. It is also worthwhile to stress the importance of a robust data architecture and DataOps culture, especially considering the business and technology perspectives of the end-to-end data lifecycle, right from data sources and ingestion through metadata and active data management, streaming, storage, analytics and visualisation. Deploying elasticity, AI/ML and automation brings immense benefits to cloud native strategies.

Considering these aspects of data management, organisations have been looking at ML- and API-powered data fabrics, along with data lakes, warehouses and layers, to manage this end-to-end data lifecycle by creating, maintaining and providing outputs to the consumers of this data, as this Gartner article on technology trends for 2022 highlights.

This article by McKinsey summarises the major pivot points in the data architecture ethos, which are fundamentally based on the cloud with containerisation and serverless data. These cover hybrid real-time and batch data processing; the shift from end-to-end COTS applications to modular, best-in-function/industry components; the move to APIs and decoupling; the shift from centralised data warehousing to domain-based architecture; and lastly, the move from proprietary predefined datasets to data schemas that are light and flexible, especially the NoSQL family.

For BFSI, telecoms and other industry verticals that need customer data to reside locally, CXOs have been deploying hybrid data management environments that leverage cloud data management tools to also automate, orchestrate, and re-use on-premise data, thus providing a unified data model and access interface for both cloud and on-premise datasets.

Applying automation and orchestration to data storage also ensures prioritisation of processes, tasks and resources to balance speed, efficiency, usage and cost, along with eliminating security vulnerabilities. This is especially applicable for tasks such as provisioning and configuration, capacity management, workflows and data migration, resource optimisation, software updates, and data protection and disaster recovery. This World Economic Forum report from right before the pandemic highlighted that conventional optical/magnetic storage systems will be unable to handle this phenomenon for more than a century. CIOs and leaders are hence leveraging automation and the cloud, Storage-as-a-Service (STaaS), decentralised Blockchain-powered data storage and storage on the edge, besides alternatives to conventional electromagnetic/optical data storage mechanisms.

What is the role of people and culture in this cloud powered data and infrastructure management ecosystem?

People, the talent pool and organisational culture play a pivotal part in successful FinOps, cloud native and cloud data management strategies. In this dynamic and uncertain world, it is of paramount importance to have uniformity, alignment and resonance of business KPIs with best practices for enterprise and data architecture and DevOps, as well as those of engineering, finance, and procurement. This environment of continuous evolution and optimisation can only be brought about by an ethos of communication, trust, change management and business-finance-IT alignment, which are as important as the cloud native strategies, architecture, DevOps, DataOps, security and other engineering talent pools.

The continuing trends of the Great Resignation, Quiet Quitting and Moonlighting necessitate a combination of having the best employee and vendor engagement strategies, a readily available talent pool of architects, analysts, engineers and other skillsets, as well as upskilling.

Winding up?

The Cloud has deeply impacted and revitalised infrastructure and data management in all aspects of the workplace. As per this Deloitte research, it is ideal to leverage an equal mix of people, tools and approaches to address cloud complexity, and to have a powerful, agile, elastic, secure and resilient virtual business infrastructure that derives maximum value from the cloud.

Cloud-centric digital infrastructure is a bedrock of the post-COVID world, aligning technology with business to support digital transformation, resilience and governance along with business outcomes through a combination of operations, technology and deployment, as mentioned in this IDC paper. This is all the more important given the increasing complexity of today's world across public cloud infrastructure, on-premises and the edge.

With continuing business uncertainty, competitiveness, customer, supplier and employee pressures and stringent IT budgets, organisations are looking at the Cloud to revitalise their Infrastructure and Data Management and gain maximum value.


Edge computing has given wings to low Earth orbit (LEO) satellite communication, a 6G core technology! – EurekAlert


Two research teams, one led by Professor Jeongho Kwak of the Department of Electrical Engineering and Computer Science at DGIST (President Kuk Yang) and the other by Professor Jihwan Choi of the Department of Aerospace Engineering at KAIST (President Kwang Hyung Lee), have developed new edge-computing offloading and network-slicing techniques that can be utilized in next-generation low Earth orbit (LEO) satellite network systems.

LEO satellite network refers to a communication network that provides stable Internet services using satellites that orbit 300-1,500 km from Earth. Unlike base stations built on the ground, to and from which radio waves are occasionally obstructed by mountains or buildings, LEO satellites can be used to build communication networks in locations where base stations are difficult to deploy owing to low population density, by launching the satellites into orbit. Therefore, LEO satellite networks have received attention as next-generation satellite-communication systems that can rapidly provide communication services to more diverse regions.

Edge computing differs from cloud computing in that data is processed in each device in a distributed manner. Since data is processed and the computational results are applied at the edge where the data is collected, congestion in the data center can be mitigated.

Although studies on edge computing in existing terrestrial networks have been actively conducted, a different approach is needed to apply edge computing to LEO satellites. This is because all satellite components of the core networks, including LEO satellite networks, are connected wirelessly, and the satellites orbit around the Earth at a very high speed. Furthermore, the satellites have a lower power supply and computing power than terrestrial networks. Therefore, customized solutions are needed for new areas that have not been covered by terrestrial networks.

Therefore, Professor Jeongho Kwak's and Professor Jihwan Choi's research teams proposed a network slicing technique[1] that harnesses the distribution and movement characteristics of LEO satellites and the characteristics of wireless-channel environments in a scenario with several virtualized services. At the same time, they also proposed a code- and data-offloading technique[2] for satellite edge computing.

The edge-computing and slicing techniques developed for LEO satellites in this research are significant because they advance domestic satellite network technology one step further. However, in South Korea, this technology is still in the early stages compared with overseas countries, where LEO satellite Internet services such as Elon Musk's Starlink are being commercialized.

Professor Jeongho Kwak of the Department of Electrical Engineering and Computer Science at DGIST stated, "This research analyzed the effect of network slicing and the code/data offloading ratio according to the changing LEO satellite environment." He added, "Our goal is to provide a blueprint for novel applications for LEO satellites in the 6G era in the future."

Meanwhile, the research results were published in the IEEE Internet of Things Journal on August 1, 2022, with Taeyeon Kim, a Ph.D. student of the Department of Electrical Engineering and Computer Science at DGIST, as the first author.

Corresponding author's e-mail address: jeongho.kwak@dgist.ac.kr

[1] Network slicing: A technology that can provide personalized services by dividing one physical core network into multiple virtual networks.

[2] Offloading technique: Distributes rapidly increasing data traffic to other networks.
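To make the offloading idea in footnote [2] concrete, here is a toy decision rule, not the algorithm from the paper above, that compares rough latency estimates for computing a task on the device versus transmitting it to a satellite edge server (all numbers are made up):

```python
# Toy illustration of an offloading decision: run the task locally or send it to a
# satellite edge server, whichever rough latency estimate is lower. Numbers are made up.
def local_latency(task_cycles: float, device_cps: float) -> float:
    """Seconds to compute the task on the device itself."""
    return task_cycles / device_cps

def offload_latency(task_bits: float, uplink_bps: float,
                    task_cycles: float, edge_cps: float) -> float:
    """Seconds to transmit the task to the edge server and compute it there."""
    return task_bits / uplink_bps + task_cycles / edge_cps

def should_offload(task_bits=2e6, task_cycles=5e9,
                   device_cps=1e9, edge_cps=2e10, uplink_bps=2e7) -> bool:
    return offload_latency(task_bits, uplink_bps, task_cycles, edge_cps) < \
           local_latency(task_cycles, device_cps)

if __name__ == "__main__":
    # 5.0 s locally vs 0.1 s uplink + 0.25 s at the edge -> offload.
    print(should_offload())
```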

IEEE Internet of Things Journal

Satellite Edge Computing Architecture and Network Slice Scheduling for IoT Support

1-Aug-2022

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.


Interpretations of quantum mechanics – Wikipedia

Set of statements that attempt to explain how quantum mechanics informs our understanding of nature

An interpretation of quantum mechanics is an attempt to explain how the mathematical theory of quantum mechanics might correspond to experienced reality. Although quantum mechanics has held up to rigorous and extremely precise tests in an extraordinarily broad range of experiments, there exist a number of contending schools of thought over their interpretation. These views on interpretation differ on such fundamental questions as whether quantum mechanics is deterministic or stochastic, which elements of quantum mechanics can be considered real, and what the nature of measurement is, among other matters.

Despite nearly a century of debate and experiment, no consensus has been reached among physicists and philosophers of physics concerning which interpretation best "represents" reality.[1][2]

The definition of quantum theorists' terms, such as wave function and matrix mechanics, progressed through many stages. For instance, Erwin Schrödinger originally viewed the electron's wave function as its charge density smeared across space, but Max Born reinterpreted the absolute square value of the wave function as the electron's probability density distributed across space.[3]:24–33
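In modern textbook notation (a standard statement, not quoted from the sources cited here), Born's reading of the wave function for a single particle in one dimension is:

```latex
% Probability of finding the particle in [x, x+dx] at time t, with normalization:
P(x,t)\,\mathrm{d}x = |\psi(x,t)|^{2}\,\mathrm{d}x,
\qquad
\int_{-\infty}^{\infty} |\psi(x,t)|^{2}\,\mathrm{d}x = 1
```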

The views of several early pioneers of quantum mechanics, such as Niels Bohr and Werner Heisenberg, are often grouped together as the "Copenhagen interpretation", though physicists and historians of physics have argued that this terminology obscures differences between the views so designated.[3][4] Copenhagen-type ideas were never universally embraced, and challenges to a perceived Copenhagen orthodoxy gained increasing attention in the 1950s with the pilot-wave interpretation of David Bohm and the many-worlds interpretation of Hugh Everett III.[3][5][6]

The physicist N. David Mermin once quipped, "New interpretations appear every year. None ever disappear."[7] As a rough guide to development of the mainstream view during the 1990s and 2000s, a "snapshot" of opinions was collected in a poll by Schlosshauer et al. at the "Quantum Physics and the Nature of Reality" conference of July 2011.[8] The authors reference a similarly informal poll carried out by Max Tegmark at the "Fundamental Problems in Quantum Theory" conference in August 1997. The main conclusion of the authors is that "the Copenhagen interpretation still reigns supreme", receiving the most votes in their poll (42%), besides the rise to mainstream notability of the many-worlds interpretations: "The Copenhagen interpretation still reigns supreme here, especially if we lump it together with intellectual offsprings such as information-based interpretations and the Quantum Bayesian interpretation. In Tegmark's poll, the Everett interpretation received 17% of the vote, which is similar to the number of votes (18%) in our poll."

Some concepts originating from studies of interpretations have found more practical application in quantum information science.[9][10]

More or less, all interpretations of quantum mechanics share two qualities:

Two qualities vary among interpretations:

In philosophy of science, the distinction of knowledge versus reality is termed epistemic versus ontic. A general law is a regularity of outcomes (epistemic), whereas a causal mechanism may regulate the outcomes (ontic). A phenomenon can receive an interpretation that is either ontic or epistemic. For instance, indeterminism may be attributed to limitations of human observation and perception (epistemic), or may be explained as a real, existing "maybe" encoded in the universe (ontic). Confusing the epistemic with the ontic (if, for example, one were to presume that a general law actually "governs" outcomes, and that the statement of a regularity has the role of a causal mechanism) is a category mistake.

In a broad sense, scientific theory can be viewed as offering scientific realism (an approximately true description or explanation of the natural world) or might be perceived with antirealism. A realist stance seeks the epistemic and the ontic, whereas an antirealist stance seeks epistemic but not the ontic. In the 20th century's first half, antirealism was mainly logical positivism, which sought to exclude unobservable aspects of reality from scientific theory.

Since the 1950s, antirealism is more modest, usually instrumentalism, permitting talk of unobservable aspects, but ultimately discarding the very question of realism and posing scientific theory as a tool to help humans make predictions, not to attain metaphysical understanding of the world. The instrumentalist view is carried by the famous quote of David Mermin, "Shut up and calculate", often misattributed to Richard Feynman.[11]

Other approaches to resolve conceptual problems introduce new mathematical formalism, and so propose alternative theories with their interpretations. An example is Bohmian mechanics, whose empirical equivalence with the three standard formalisms (Schrödinger's wave mechanics, Heisenberg's matrix mechanics, and Feynman's path integral formalism) has been demonstrated.

The Copenhagen interpretation is a collection of views about the meaning of quantum mechanics principally attributed to Niels Bohr and Werner Heisenberg. It is one of the oldest attitudes towards quantum mechanics, as features of it date to the development of quantum mechanics during 1925–1927, and it remains one of the most commonly taught.[14][15] There is no definitive historical statement of what is the Copenhagen interpretation, and there were in particular fundamental disagreements between the views of Bohr and Heisenberg.[16][17] For example, Heisenberg emphasized a sharp "cut" between the observer (or the instrument) and the system being observed,[18]:133 while Bohr offered an interpretation that is independent of a subjective observer or measurement or collapse, which relies on an "irreversible" or effectively irreversible process which imparts the classical behavior of "observation" or "measurement".[19][20][21][22]

Features common to Copenhagen-type interpretations include the idea that quantum mechanics is intrinsically indeterministic, with probabilities calculated using the Born rule, and the principle of complementarity, which states that objects have certain pairs of complementary properties which cannot all be observed or measured simultaneously. Moreover, the act of "observing" or "measuring" an object is irreversible, and no truth can be attributed to an object except according to the results of its measurement. Copenhagen-type interpretations hold that quantum descriptions are objective, in that they are independent of physicists' mental arbitrariness.[23]:85–90 The statistical interpretation of wavefunctions due to Max Born differs sharply from Schrödinger's original intent, which was to have a theory with continuous time evolution and in which wavefunctions directly described physical reality.[3]:24–33[24]

The many-worlds interpretation is an interpretation of quantum mechanics in which a universal wavefunction obeys the same deterministic, reversible laws at all times; in particular there is no (indeterministic and irreversible) wavefunction collapse associated with measurement. The phenomena associated with measurement are claimed to be explained by decoherence, which occurs when states interact with the environment. More precisely, the parts of the wavefunction describing observers become increasingly entangled with the parts of the wavefunction describing their experiments. Although all possible outcomes of experiments continue to lie in the wavefunction's support, the times at which they become correlated with observers effectively "split" the universe into mutually unobservable alternate histories.

Quantum informational approaches[25][26] have attracted growing support.[27][8] They subdivide into two kinds.[28]

The state is not an objective property of an individual system but is that information, obtained from a knowledge of how a system was prepared, which can be used for making predictions about future measurements....A quantum mechanical state being a summary of the observer's information about an individual physical system changes both by dynamical laws, and whenever the observer acquires new information about the system through the process of measurement. The existence of two laws for the evolution of the state vector...becomes problematical only if it is believed that the state vector is an objective property of the system...The "reduction of the wavepacket" does take place in the consciousness of the observer, not because of any unique physical process which takes place there, but only because the state is a construct of the observer and not an objective property of the physical system.[31]

The essential idea behind relational quantum mechanics, following the precedent of special relativity, is that different observers may give different accounts of the same series of events: for example, to one observer at a given point in time, a system may be in a single, "collapsed" eigenstate, while to another observer at the same time, it may be in a superposition of two or more states. Consequently, if quantum mechanics is to be a complete theory, relational quantum mechanics argues that the notion of "state" describes not the observed system itself, but the relationship, or correlation, between the system and its observer(s). The state vector of conventional quantum mechanics becomes a description of the correlation of some degrees of freedom in the observer, with respect to the observed system. However, it is held by relational quantum mechanics that this applies to all physical objects, whether or not they are conscious or macroscopic. Any "measurement event" is seen simply as an ordinary physical interaction, an establishment of the sort of correlation discussed above. Thus the physical content of the theory has to do not with objects themselves, but the relations between them.[32][33]

QBism, which originally stood for "quantum Bayesianism", is an interpretation of quantum mechanics that takes an agent's actions and experiences as the central concerns of the theory. This interpretation is distinguished by its use of a subjective Bayesian account of probabilities to understand the quantum mechanical Born rule as a normative addition to good decision-making. QBism draws from the fields of quantum information and Bayesian probability and aims to eliminate the interpretational conundrums that have beset quantum theory.

QBism deals with common questions in the interpretation of quantum theory about the nature of wavefunction superposition, quantum measurement, and entanglement.[34][35] According to QBism, many, but not all, aspects of the quantum formalism are subjective in nature. For example, in this interpretation, a quantum state is not an element of realityinstead it represents the degrees of belief an agent has about the possible outcomes of measurements. For this reason, some philosophers of science have deemed QBism a form of anti-realism.[36][37] The originators of the interpretation disagree with this characterization, proposing instead that the theory more properly aligns with a kind of realism they call "participatory realism", wherein reality consists of more than can be captured by any putative third-person account of it.[38][39]

The consistent histories interpretation generalizes the conventional Copenhagen interpretation and attempts to provide a natural interpretation of quantum cosmology. The theory is based on a consistency criterion that allows the history of a system to be described so that the probabilities for each history obey the additive rules of classical probability. It is claimed to be consistent with the Schrödinger equation.

According to this interpretation, the purpose of a quantum-mechanical theory is to predict the relative probabilities of various alternative histories (for example, of a particle).

The ensemble interpretation, also called the statistical interpretation, can be viewed as a minimalist interpretation. That is, it claims to make the fewest assumptions associated with the standard mathematics. It takes the statistical interpretation of Born to the fullest extent. The interpretation states that the wave function does not apply to an individual system (for example, a single particle) but is an abstract statistical quantity that only applies to an ensemble (a vast multitude) of similarly prepared systems or particles. In the words of Einstein:

The attempt to conceive the quantum-theoretical description as the complete description of the individual systems leads to unnatural theoretical interpretations, which become immediately unnecessary if one accepts the interpretation that the description refers to ensembles of systems and not to individual systems.

Einstein in Albert Einstein: Philosopher-Scientist, ed. P.A. Schilpp (Harper & Row, New York)

The most prominent current advocate of the ensemble interpretation is Leslie E. Ballentine, professor at Simon Fraser University and author of the textbook Quantum Mechanics, A Modern Development.

The de Broglie–Bohm theory of quantum mechanics (also known as the pilot wave theory) is a theory by Louis de Broglie and extended later by David Bohm to include measurements. Particles, which always have positions, are guided by the wavefunction. The wavefunction evolves according to the Schrödinger wave equation, and the wavefunction never collapses. The theory takes place in a single spacetime, is non-local, and is deterministic. The simultaneous determination of a particle's position and velocity is subject to the usual uncertainty principle constraint. The theory is considered to be a hidden-variable theory, and by embracing non-locality it satisfies Bell's inequality. The measurement problem is resolved, since the particles have definite positions at all times.[40] Collapse is explained as phenomenological.[41]
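For a single spinless particle of mass m, the standard textbook form of the de Broglie–Bohm guidance equation (stated here for illustration, not quoted from the article) sets the particle's velocity from the phase gradient of the wave function, evaluated at its actual position Q(t):

```latex
% Guidance equation for one spinless particle of mass m:
\frac{\mathrm{d}Q(t)}{\mathrm{d}t}
  = \frac{\hbar}{m}\,
    \operatorname{Im}\!\left(\frac{\nabla\psi}{\psi}\right)\Bigg|_{x=Q(t)}
```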

Quantum Darwinism is a theory meant to explain the emergence of the classical world from the quantum world as due to a process of Darwinian natural selection induced by the environment interacting with the quantum system; where the many possible quantum states are selected against in favor of a stable pointer state. It was proposed in 2003 by Wojciech Zurek and a group of collaborators including Ollivier, Poulin, Paz and Blume-Kohout. The development of the theory is due to the integration of a number of Zurek's research topics pursued over the course of twenty-five years including: pointer states, einselection and decoherence.

The transactional interpretation of quantum mechanics (TIQM) by John G. Cramer is an interpretation of quantum mechanics inspired by the Wheeler–Feynman absorber theory.[42] It describes the collapse of the wave function as resulting from a time-symmetric transaction between a possibility wave from the source to the receiver (the wave function) and a possibility wave from the receiver to the source (the complex conjugate of the wave function). This interpretation of quantum mechanics is unique in that it not only views the wave function as a real entity, but also views the complex conjugate of the wave function, which appears in the Born rule for calculating the expected value for an observable, as real.

Objective-collapse theories differ from the Copenhagen interpretation by regarding both the wave function and the process of collapse as ontologically objective (meaning these exist and occur independent of the observer). In objective theories, collapse occurs either randomly ("spontaneous localization") or when some physical threshold is reached, with observers having no special role. Thus, objective-collapse theories are realistic, indeterministic, no-hidden-variables theories. Standard quantum mechanics does not specify any mechanism of collapse; QM would need to be extended if objective collapse is correct. The requirement for an extension to QM means that objective collapse is more of a theory than an interpretation. Examples include

In his treatise The Mathematical Foundations of Quantum Mechanics, John von Neumann deeply analyzed the so-called measurement problem. He concluded that the entire physical universe could be made subject to the Schrödinger equation (the universal wave function). He also described how measurement could cause a collapse of the wave function.[44] This point of view was prominently expanded on by Eugene Wigner, who argued that human experimenter consciousness (or maybe even dog consciousness) was critical for the collapse, but he later abandoned this interpretation.[45][46]

Quantum logic can be regarded as a kind of propositional logic suitable for understanding the apparent anomalies regarding quantum measurement, most notably those concerning composition of measurement operations of complementary variables. This research area and its name originated in the 1936 paper by Garrett Birkhoff and John von Neumann, who attempted to reconcile some of the apparent inconsistencies of classical boolean logic with the facts related to measurement and observation in quantum mechanics.

Modal interpretations of quantum mechanics were first conceived of in 1972 by Bas van Fraassen, in his paper "A formal approach to the philosophy of science". Van Fraassen introduced a distinction between a dynamical state, which describes what might be true about a system and which always evolves according to the Schrödinger equation, and a value state, which indicates what is actually true about a system at a given time. The term "modal interpretation" now is used to describe a larger set of models that grew out of this approach. The Stanford Encyclopedia of Philosophy describes several versions, including proposals by Kochen, Dieks, Clifton, Dickson, and Bub.[47] According to Michel Bitbol, Schrödinger's views on how to interpret quantum mechanics progressed through as many as four stages, ending with a non-collapse view that in respects resembles the interpretations of Everett and van Fraassen. Because Schrödinger subscribed to a kind of post-Machian neutral monism, in which "matter" and "mind" are only different aspects or arrangements of the same common elements, treating the wavefunction as ontic and treating it as epistemic became interchangeable.[48]

Time-symmetric interpretations of quantum mechanics were first suggested by Walter Schottky in 1921.[49][50] Several theories have been proposed which modify the equations of quantum mechanics to be symmetric with respect to time reversal.[51][52][53][54][55][56] (See Wheeler–Feynman time-symmetric theory.) This creates retrocausality: events in the future can affect ones in the past, exactly as events in the past can affect ones in the future. In these theories, a single measurement cannot fully determine the state of a system (making them a type of hidden-variables theory), but given two measurements performed at different times, it is possible to calculate the exact state of the system at all intermediate times. The collapse of the wavefunction is therefore not a physical change to the system, just a change in our knowledge of it due to the second measurement. Similarly, they explain entanglement as not being a true physical state but just an illusion created by ignoring retrocausality. The point where two particles appear to "become entangled" is simply a point where each particle is being influenced by events that occur to the other particle in the future.

Not all advocates of time-symmetric causality favour modifying the unitary dynamics of standard quantum mechanics. Thus a leading exponent of the two-state vector formalism, Lev Vaidman, states that the two-state vector formalism dovetails well with Hugh Everett's many-worlds interpretation.[57]

As well as the mainstream interpretations discussed above, a number of other interpretations have been proposed which have not made a significant scientific impact for whatever reason. These range from proposals by mainstream physicists to the more occult ideas of quantum mysticism.

The most common interpretations are summarized in the table below. The values shown in the cells of the table are not without controversy, for the precise meanings of some of the concepts involved are unclear and, in fact, are themselves at the center of the controversy surrounding the given interpretation. For another table comparing interpretations of quantum theory, see reference.[58]

No experimental evidence exists that distinguishes among these interpretations. To that extent, the physical theory stands, and is consistent with itself and with reality; difficulties arise only when one attempts to "interpret" the theory. Nevertheless, designing experiments which would test the various interpretations is the subject of active research.

Most of these interpretations have variants. For example, it is difficult to get a precise definition of the Copenhagen interpretation as it was developed and argued about by many people.

Although interpretational opinions are openly and widely discussed today, that was not always the case. A notable exponent of a tendency of silence was Paul Dirac who once wrote: "The interpretation of quantum mechanics has been dealt with by many authors, and I do not want to discuss it here. I want to deal with more fundamental things."[67] This position is not uncommon among practitioners of quantum mechanics.[68] Others, like Nico van Kampen and Willis Lamb, have openly criticized non-orthodox interpretations of quantum mechanics.[69][70]

Almost all authors below are professional physicists.

View original post here:

Interpretations of quantum mechanics - Wikipedia

Read More..

Quantum Entanglement Is the Strangest Phenomenon in Physics, But What Is It? – HowStuffWorks

It took until the 1960s before there were any clues to an answer. John Bell, a brilliant Irish physicist who did not live to receive the Nobel Prize, devised a scheme to test whether the notion of hidden variables made sense.

Bell derived an inequality, now known as Bell's inequality, that any local hidden variable theory must always satisfy but that quantum mechanics can violate. Thus, if Bell's inequality were found to be violated in a real-world experiment, local hidden variable theories could be ruled out as an explanation for quantum entanglement.
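The form of the inequality used in most modern experiments is the CHSH version; the statements below are standard textbook material, added here as a sketch rather than the exact expression Bell wrote down in 1964:

```latex
% CHSH form of Bell's inequality: a, a' are one party's two measurement settings,
% b, b' the other's, and E(a,b) is the measured correlation of the +/-1 outcomes.
\[
S \;=\; E(a,b) - E(a,b') + E(a',b) + E(a',b').
\]
% Any local hidden variable theory obeys |S| <= 2, whereas quantum mechanics on an
% entangled pair allows |S| up to 2*sqrt(2) (the Tsirelson bound):
\[
|S|_{\text{local hidden variables}} \;\le\; 2,
\qquad
|S|_{\text{quantum}} \;\le\; 2\sqrt{2}.
\]
```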

The experiments of the 2022 Nobel laureates, particularly those of Alain Aspect, were the first tests of the Bell inequality. The experiments used entangled photons, rather than pairs of an electron and a positron, as in many thought experiments. The results conclusively ruled out the existence of local hidden variables, mysterious attributes that would predetermine the states of entangled particles. Collectively, these and many follow-up experiments have vindicated quantum mechanics. Objects can be correlated over large distances in ways that physics before quantum mechanics cannot explain.

Importantly, there is also no conflict with special relativity, which forbids faster-than-light communication. The fact that measurements over vast distances are correlated does not imply that information is transmitted between the particles. Two parties far apart performing measurements on entangled particles cannot use the phenomenon to pass along information faster than the speed of light.
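A short calculation, added here for illustration, makes the no-signalling point concrete: whatever the distant party does, the local measurement statistics are governed by the same reduced density matrix.

```latex
% Shared Bell state between Alice (A) and Bob (B):
\[
|\Phi^{+}\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr).
\]
% Alice's local statistics come from tracing out Bob's half:
\[
\rho_{A} \;=\; \operatorname{Tr}_{B}\,|\Phi^{+}\rangle\langle\Phi^{+}|
         \;=\; \tfrac{1}{2}\bigl(|0\rangle\langle 0| + |1\rangle\langle 1|\bigr)
         \;=\; \tfrac{1}{2}\,I .
\]
% This maximally mixed state is unchanged by anything Bob measures or does locally,
% so no signal reaches Alice; the correlations only appear when the two measurement
% records are brought together and compared.
```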

Today, physicists continue to research quantum entanglement and investigate potential practical applications. Although quantum mechanics can predict the probability of a measurement with incredible accuracy, many researchers remain skeptical that it provides a complete description of reality. One thing is certain, though. Much remains to be said about the mysterious world of quantum mechanics.

Andreas Muller is an associate professor of physics at the University of South Florida. He receives funding from the National Science Foundation.

This article is republished from The Conversation under a Creative Commons license. You can find the original article here.

Here is the original post:

Quantum Entanglement Is the Strangest Phenomenon in Physics, But What Is It? - HowStuffWorks

Read More..

Quantum Entanglement Has Now Been Directly Observed at The Macroscopic Scale – ScienceAlert

Quantum entanglement is the binding together of two particles or objects: even though they may be far apart, their respective properties are linked in a way that's not possible under the rules of classical physics.

It's a weird phenomenon that Einstein described as "spooky action at a distance", but its weirdness is what makes it so fascinating to scientists. In a 2021 study, quantum entanglement was directly observed and recorded at the macroscopic scale, a scale much bigger than the subatomic particles normally associated with entanglement.

The dimensions involved are still very small from our perspective (the experiments involved two tiny aluminum drums, each one-fifth the width of a human hair), but in the realm of quantum physics they're absolutely huge.

"If you analyze the position and momentum data for the two drums independently, they each simply look hot," said physicist John Teufel, from the National Institute of Standards and Technology (NIST) in the US, last year.

"But looking at them together, we can see that what looks like random motion of one drum is highly correlated with the other, in a way that is only possible through quantum entanglement."

While there's nothing to say that quantum entanglement can't happen with macroscopic objects, before this it was thought that the effects weren't noticeable at larger scales, or perhaps that the macroscopic scale was governed by another set of rules.

The recent research suggests that's not the case. In fact, the same quantum rules apply here, too, and can actually be seen as well. Researchers vibrated the tiny drum membranes using microwave photons and kept them in a synchronized state in terms of their position and velocities.
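To picture what Teufel's "look hot individually, correlated jointly" description means for recorded position data, here is a purely classical statistical sketch in Python; the numbers are invented for illustration, and the sketch deliberately omits the genuinely quantum part, namely that the real drums' joint noise drops below the vacuum limit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# A shared (correlated) component dominates each drum's motion...
common = rng.normal(0.0, 1.0, n)
# ...plus a small independent jitter on each drum.
x1 = common + rng.normal(0.0, 0.1, n)
x2 = common + rng.normal(0.0, 0.1, n)

# Individually, each record just looks like broadband ("hot") noise.
print(f"Var(x1)      = {x1.var():.3f}")   # ~1.01
print(f"Var(x2)      = {x2.var():.3f}")   # ~1.01

# Looked at jointly, the motion of one drum tracks the other almost perfectly.
print(f"corr(x1, x2) = {np.corrcoef(x1, x2)[0, 1]:.3f}")  # ~0.99
print(f"Var(x1 - x2) = {(x1 - x2).var():.3f}")            # ~0.02, far below Var(x1)
```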

To prevent outside interference, a common problem with quantum states, the drums were cooled, entangled, and measured in separate stages while inside a cryogenically chilled enclosure. The states of the drums were then encoded in a reflected microwave field that works in a similar way to radar.

Previous studies had also reported on macroscopic quantum entanglement, but the 2021 research went further: All of the necessary measurements were recorded rather than inferred, and the entanglement was generated in a deterministic, non-random way.

In a related but separate series of experiments, researchers also working with macroscopic drums (or oscillators) in a state of quantum entanglement have shown how it's possible to measure the position and momentum of the two drumheads at the same time.

"In our work, the drumheads exhibit a collective quantum motion," said physicist Laure Mercier de Lepinay, from Aalto University in Finland. "The drums vibrate in an opposite phase to each other, such that when one of them is in an end position of the vibration cycle, the other is in the opposite position at the same time."

"In this situation, the quantum uncertainty of the drums' motion is canceled if the two drums are treated as one quantum-mechanical entity."

What makes this headline news is that it gets around Heisenberg's Uncertainty Principle, the idea that position and momentum can't be perfectly measured at the same time. The principle states that recording either measurement will interfere with the other through a process called quantum back action.

As well as backing up the other study in demonstrating macroscopic quantum entanglement, this particular piece of research uses that entanglement to avoid quantum back action, essentially investigating the line between classical physics (where the Uncertainty Principle applies) and quantum physics (where it now doesn't appear to).
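In symbols, the loophole is that the uncertainty principle constrains each drum separately, while the joint "EPR" combinations measured here commute with each other; the relations below are standard and are given only as a sketch, since normalization conventions vary between papers:

```latex
% A single drum's position and momentum do not commute:
\[
[\hat{x}_1, \hat{p}_1] = i\hbar
\quad\Longrightarrow\quad
\Delta x_1\,\Delta p_1 \;\ge\; \frac{\hbar}{2}.
\]
% Joint "EPR" variables of the two drums commute with each other,
\[
\hat{X}_- = \hat{x}_1 - \hat{x}_2, \qquad
\hat{P}_+ = \hat{p}_1 + \hat{p}_2, \qquad
[\hat{X}_-, \hat{P}_+] = 0,
\]
% so both can in principle be known at once with arbitrary precision.
% With dimensionless quadratures satisfying [x_i, p_i] = i, a Duan-type criterion
\[
\operatorname{Var}(\hat{X}_-) + \operatorname{Var}(\hat{P}_+) \;<\; 2
\]
% can only be satisfied by an entangled pair, so the small joint uncertainty
% doubles as a witness of the entanglement.
```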

One of the potential future applications of both sets of findings is in quantum networks: being able to manipulate and entangle objects on a macroscopic scale so that they can power next-generation communication networks.

"Apart from practical applications, these experiments address how far into the macroscopic realm experiments can push the observation of distinctly quantum phenomena," write physicists Hoi-Kwan Lau and Aashish Clerk, who weren't involved in the studies, in a commentary on the research published at the time.

Both the first and the second study were published in Science.

A version of this article was first published in May 2021.

Continue reading here:

Quantum Entanglement Has Now Been Directly Observed at The Macroscopic Scale - ScienceAlert

Read More..

The sparks that ignited curiosity: How quantum researchers found their path – EurekAlert

The Quantum Systems Accelerator (QSA) assembles a broad range of talent from 15 member institutions, many of whom have pioneered today's quantum information science (QIS) and technology capabilities. QSA is a National QIS Research Center funded by the United States Department of Energy Office of Science. In celebration of Hispanic Heritage Month in the U.S., five researchers affiliated with QSA shared what first sparked their interest in quantum physics. Perhaps more importantly, they all stressed the importance of being attentive to the tiny sparks of curiosity, which might come from the unlikeliest sources of inspiration: a university lecture, a chance encounter with a professor, or a book.

Ana Maria Rey

Adjoint Professor, University of Colorado Boulder; JILA Fellow; NIST Fellow

Ana Maria Rey, a world-renowned theoretical physicist from Bogota, Colombia, has built a prolific career for over two decades. Rey's research in atomic, molecular, and optical (AMO) physics contributed to the most accurate atomic clock ever developed. She continues to advance the techniques for controlling quantum systems in novel ways and applying them to quantum simulations, information, and metrology.

Rey's prominence in pushing the boundaries of theoretical physics has earned her several prestigious accolades, such as the MacArthur Fellowship and Presidential Early Career Award in 2013. In addition, Rey is the first Hispanic woman to win the Blavatnik National Award for Young Scientists in 2019. However, her early journey in physics met an unlikely source of resistance: her family.

"My fascination with physics began in high school in Colombia, thanks to a physics teacher who promoted my initial interest in using mathematical formulas to describe nature. There were few professional opportunities for the field at that time, so my parents opposed me pursuing a career in physics," said Rey.

Despite her parents objections, Rey majored in physics as an undergraduate at the Universidad de Los Andes and pursued a focus in non-linear optics and general relativity. Rey had a clear idea of what she thought she wanted to specialize in for her doctorate at the University of Maryland, but a lecture changed her direction.

"I wanted to continue studying non-linear optics, but at that time, the atomic physics that we do now was not popular. It was just starting to mature. So, for the Ph.D. program, I was offered a fellowship to study non-linear equations in plasma, which was the closest thing to non-linear optics. During my studies, though, I was struck by a lecture by Bill Phillips, a Nobel laureate in physics. As Phillips explained how he manipulated cold atoms with lasers, I changed what I wanted to do," she said.

Rey's research has been cited thousands of times in the scientific literature. She believes there are transferable methods and techniques across different quantum technologies.

She explained:

"The concepts to model a quantum system can be applied globally in different disciplines or help establish a synergy between the theories connecting completely different experiments. So, for example, even though you are talking about the same models or Hamiltonians, the language you use from one technology to another is different. Generally, this language barrier is often reduced simply by collaborating and studying systems and mathematical techniques to connect different regimes and develop unifying ways to explain specific behaviors."

Pablo Poggi

Research Assistant Professor, University of New Mexico

Born and raised in Buenos Aires, Argentina, Pablo Poggi is a theoretical physicist specializing in quantum control methods to counteract and tailor unwanted noise, environmental effects, and errors in quantum devices and atomic systems. He studies the commonalities shared by different quantum technologies and develops hardware-agnostic, unified theoretical models to build, run, and benchmark quantum simulation devices.

Poggi notes that a high school physics teacher was pivotal in encouraging him to study relativity and quantum mechanics. Nurturing a love for math and its connection with physics early on, Poggi undertook his undergraduate and doctorate degrees at the Universidad de Buenos Aires, one of Latin America's largest and most prominent public research universities.

"Latin America has a robust and important academic tradition. I was very exposed to state-of-the-art quantum physics science at my university because many groups were working on this. However, I did my doctorate in a somewhat risky way because I had no contact with any experimental group. And when I had to define which specific quantum technology to specialize in, I decided to take an alternative. That is, to learn a little about all the general and unified methods that would apply to several technologies. And that was very useful to me in my work in the U.S.," said Poggi.

Poggi moved to the U.S. to first work as a postdoctoral fellow at the University of New Mexico. He's been a research assistant professor and QSA collaborator since 2020.

Noting the difference in approaching problems and experiments in QSA's collaborative ecosystem, Poggi said: "In college, I was always used to working alone or in compact groups, putting together a vision to publish or highlight results in a conference or workshop, maybe once a year. In contrast, thanks to the frequent meetings and exchanges with QSA partner institutions, I am up to date with the critical questions and advances in the field quickly and immediately."

Sergio Cantu

Research Fellow, MIT; Research Scientist, QuEra Computing Inc.

Sergio Cantu is an experimental physicist specializing in atomic physics and quantum optics. During his doctorate studies, Cantu was a National Science Foundation graduate research fellow at the Massachusetts Institute of Technology (MIT). As part of his student fellowship, Cantu participated in QSA-funded research at MIT by using Rydberg atoms in a tightly focused optical trap to study how photon-photon interactions can be used to generate new quantum light states for quantum information processing. In addition to continuing to contribute to QSA research at MIT, Cantu also works at QuEra Computing. This Boston-based quantum computing startup evolved from the leading-edge research in neutral atoms at MIT and Harvard University. El Mundo Boston named Cantu as one of the 30 under 30 most influential Latino leaders thanks to his work inspiring young generations of scientists in underrepresented communities.

"What I like the most about my work is when I do experiments. You can do the theoretical research, but the atoms, and in my case, light, will always show you if your assumptions are correct. Atoms are more unbiased, and because they are such a basic thing that we see every day, understanding that dynamic has always fascinated me," said Cantu.

His passion for studying atoms and light using the laws of quantum mechanics dates back to his undergraduate years at the University of Texas at Brownsville, close to the border with Mexico. Cantu was one of the few students in his community pursuing degrees in math and physics.

"The journey was more or less a gamble. When I was accepted into an optics laboratory at the university, where I only worked with lasers, I thought this was like magic. But as I continued studying, quantum physics caught my attention," he said.

Cantu mentioned how QSA's organizational structure has allowed him to quickly raise questions to other experts in different areas at partner institutions for potential experimental overlap. Furthermore, being part of industry, Cantu also recognizes a tipping point in the field's growth, where engineering systems and assessing scalability come into greater focus.

"In a laboratory, I believe that the experiments are still constructed in a half Frankenstein-fashion with inherent fragility. But when conducting experiments and engineering prototypes as industry, you have to interact with vendors and consider other factors of production," said Cantu.

Elmer Guardado-Sanchez

Postdoctoral Fellow, Harvard University

Mexican-born experimental physicist Elmer Guardado-Sanchez is happiest when he's fabricating quantum systems to test novel research ideas. Currently a postdoc at Harvard University, Guardado explores ways to build quantum processors by integrating Rydberg arrays of single atoms in optical tweezers with optical cavities.

"What excites me the most when studying something I don't yet understand is the moment I first realize why our measurements might look the way they look and why it happens in a particular way. In other words, I build these complex systems that, due to the simple fact of their level of complexity, are going to exhibit different effects that are sometimes not expected," said Guardado.

Growing up in Monterrey, Mexico, Guardado often participated in high-school-level physics olympiads. He was always sure of his interest in pursuing a career in physics, but he remembered how frequently others asked him how he would find work. In the city where he grew up, a burgeoning industrial and business center, there weren't many research professors doing experimental work who would also take on a high-school student.

Guardado achieved national and international recognition in these competitions and decided to pursue his undergraduate degree at MIT. However, his path to quantum information has not been a linear one. He first studied cold atoms at MIT, but it was not until his doctorate program at Princeton University, where he started working on quantum simulation, that he considered working broadly in the quantum technology field.

"It was a bit of luck initially, as I was looking for someone to do undergraduate research with based on what caught my attention. I liked the quantum field because it is very varied and open, featuring lasers, magnetic fields, vacuum chambers, and many more tools and applications," he said.

Guardado is interested in seeing how the experimental integration of different quantum technologies, such as those being studied and developed at QSA (trapped ions, neutral atoms, and superconducting circuits), can ultimately pave the way for larger systems that may leverage modularity.

Diego Barberena

Ph.D. candidate, University of Colorado Boulder

Diego Barberena is a graduate student from Lima, Peru, who studies quantum metrology at the University of Colorado Boulder. Barberena joined Rey's group at JILA in 2017, and he's excited by the experimental and theoretical possibilities in the research and development of cold atom systems. He collaborated with researchers across disciplines and scientific institutes in the U.S. to demonstrate a quantum sensor composed of 150 beryllium ions with a record-setting sensitivity.

"From a theoretical point of view, it helps a lot to have these experimental collaborations. For example, we pay closer attention to specific problems we can test with the quantum technologies currently available. In the same way, there are many ideas that we have been working on that can serve as feedback for future experiments," said Barberena.

Barberena's scientific journey is similar to others in QSA, where an initial curiosity led him to new fields of study. For example, he was first interested in quantum physics thanks to a professor at the Universidad Católica del Perú who worked on experiments with photons and quantum entanglement. After that, Barberena continued researching the state of quantum technologies by looking up conferences and broader networks in Latin America that could give him a glimpse of the long-term research questions.

"It was not a clear path because there was no definitive guide, and I share these experiences with many international colleagues. It has been quite random, actually, because when I was studying at the university, there was not even a widespread notion of this field of research where scientists developed different models and hardware devices for quantum information and simulations," he said.

Thanks to the support of a Peruvian professor, Barberena continued to acquire more experience in this field until he applied for a doctorate at the University of Colorado. With QSA, he hopes to continue to have constant contact with the broader quantum community through regular meetings and conversations, which is quicker than waiting for a preprint.

####

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 16 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science. DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy's National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

The Quantum Systems Accelerator (QSA) is one of the five National Quantum Information Science Research Centers funded by the U.S. Department of Energy Office of Science. Led by Lawrence Berkeley National Laboratory (Berkeley Lab) and with Sandia National Laboratories as lead partner, QSA will catalyze national leadership in quantum information science to co-design the algorithms, quantum devices, and engineering solutions needed to deliver certified quantum advantage in scientific applications. QSA brings together dozens of scientists who are pioneers of many of today's unique quantum engineering and fabrication capabilities. In addition to industry and academic partners across the world, 15 institutions are part of QSA: Lawrence Berkeley National Laboratory, Sandia National Laboratories, University of Colorado at Boulder, MIT Lincoln Laboratory, Caltech, Duke University, Harvard University, Massachusetts Institute of Technology, Tufts University, UC Berkeley, University of Maryland, University of New Mexico, University of Southern California, UT Austin, and Canada's Université de Sherbrooke. For more information, please visit https://quantumsystemsaccelerator.org/

See the rest here:

The sparks that ignited curiosity: How quantum researchers found their path - EurekAlert

Read More..

Inside the Proton, the ‘Most Complicated Thing’ Imaginable – Quanta Magazine

More than a century after Ernest Rutherford discovered the positively charged particle at the heart of every atom, physicists are still struggling to fully understand the proton.

High school physics teachers describe them as featureless balls with one unit each of positive electric charge, the perfect foils for the negatively charged electrons that buzz around them. College students learn that the ball is actually a bundle of three elementary particles called quarks. But decades of research have revealed a deeper truth, one that's too bizarre to fully capture with words or images.

"This is the most complicated thing that you could possibly imagine," said Mike Williams, a physicist at the Massachusetts Institute of Technology. "In fact, you can't even imagine how complicated it is."

The proton is a quantum mechanical object that exists as a haze of probabilities until an experiment forces it to take a concrete form. And its forms differ drastically depending on how researchers set up their experiment. Connecting the particle's many faces has been the work of generations. "We're kind of just starting to understand this system in a complete way," said Richard Milner, a nuclear physicist at MIT.

As the pursuit continues, the proton's secrets keep tumbling out. Most recently, a monumental data analysis published in August found that the proton contains traces of particles called charm quarks that are heavier than the proton itself.

"The proton has been humbling to humans," Williams said. "Every time you think you kind of have a handle on it, it throws you some curveballs."

Recently, Milner, together with Rolf Ent at Jefferson Lab, MIT filmmakers Chris Boebel and Joe McMaster, and animator James LaPlante, set out to transform a set of arcane plots that compile the results of hundreds of experiments into a series of animations of the shape-shifting proton. We've incorporated their animations into our own attempt to unveil its secrets.

Proof that the proton contains multitudes came from the Stanford Linear Accelerator Center (SLAC) in 1967. In earlier experiments, researchers had pelted it with electrons and watched them ricochet off like billiard balls. But SLAC could hurl electrons more forcefully, and researchers saw that they bounced back differently. The electrons were hitting the proton hard enough to shatter it (a process called deep inelastic scattering) and were rebounding from point-like shards of the proton called quarks. "That was the first evidence that quarks actually exist," said Xiaochao Zheng, a physicist at the University of Virginia.

After SLAC's discovery, which won the Nobel Prize in Physics in 1990, scrutiny of the proton intensified. Physicists have carried out hundreds of scattering experiments to date. They infer various aspects of the object's interior by adjusting how forcefully they bombard it and by choosing which scattered particles they collect in the aftermath.
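The knobs experimenters turn in such scattering measurements can be summarized compactly; the relations below are the standard deep-inelastic-scattering kinematics, included as a brief sketch rather than anything specific to the experiments described in this article:

```latex
% An electron transfers four-momentum q to a proton of four-momentum P.
\[
Q^{2} \equiv -q^{2}
\qquad\text{(how hard the proton is struck, i.e. the resolution of the probe)},
\]
\[
x \equiv \frac{Q^{2}}{2\,P\cdot q}
\qquad\text{(Bjorken $x$: roughly the momentum fraction carried by the struck quark)}.
\]
% The measured cross sections are packaged into structure functions such as
% F_2(x, Q^2), whose dependence on x and Q^2 encodes the parton distributions,
% i.e. the different "faces" of the proton mapped out by varying the beam energy
% and the scattering angle.
```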

Here is the original post:

Inside the Proton, the 'Most Complicated Thing' Imaginable - Quanta Magazine

Read More..