
Letter | Decentralization: A path to stability and progress in Haiti – Haitian Times

I was deeply moved by the recent two-part series discussing the need for decentralization in Haiti. It echoes sentiments I have long held, but often felt were isolated. For me, decentralizing major infrastructure is not merely a matter of debate; it is a passionate advocacy grounded in a desire for a more equitable and resilient future for our nation.

For too long, those of us advocating for decentralization have faced criticism and ridicule, accused of being insensitive to the plight of those suffering from gang activities in metropolitan Port-au-Prince. However, I firmly believe that the concentration of resources and infrastructure in certain urban centers exacerbates rather than alleviates these issues. While insecurity may be concentrated in less than 10% of the country, its impact extends far beyond those boundaries, affecting a quarter of the population. We cannot afford to neglect the peaceful regions, where a majority of the people reside, while focusing solely on the concentrated hotspots of violence.

One glaring example of the consequences of centralization is the fuel crisis that regularly grips our nation. The blockade of roads leading to Varreux, Haiti's primary fuel storage location, paralyzes the entire economy and disrupts the lives of countless citizens. This is not simply a matter of gang activity; it is a systemic flaw stemming from overreliance on a single point of distribution. Imagine the resilience we could achieve if we decentralized fuel storage to at least three dispersed geographic locations across the country, mitigating the impact of such disruptions and ensuring a more reliable energy supply.

Decentralization is not a mere buzzword either. It is a fundamental shift in governance that holds the key to unlocking Haiti's full potential. By empowering local communities and distributing resources more equitably, we can foster economic development and social cohesion across the nation. By the way, the Haitian constitution mandates that the country be decentralized. Decisions have been made not to implement what is required, thus contributing to the reality we face today.

I refuse to rest until I witness a paradigm shift in policy-making towards decentralization. It is time for our leaders to prioritize the long-term interests of the entire nation over short-sighted gains. Let us work together to build a Haiti where prosperity and security are not privileges reserved for the few, but rights enjoyed by all. Decentralizing the country would contribute significantly to that better future.

Read the original post:

Letter | Decentralization: A path to stability and progress in Haiti - Haitian Times

Read More..

Paris Blockchain Week 2024: blockchain expansion and innovation – Cointribune EN

Thu 11 Apr 2024 | 4 min read | by Eddy S.

The second day of Paris Blockchain Week 2024 was marked by significant moments. Pioneers of the blockchain industry took the stage to share their visionary insights on the evolution of this rapidly growing sector. Their remarks sparked keen interest among participants, offering an exciting glimpse into future trends.

Tim Draper, the renowned investor, emphasized the critical importance of trust in the success of businesses. According to him, the greatest leaders in the world trust their citizens and set them free. He then explained how blockchain establishes a society based on trust, thereby eliminating the need for a central authority in transactions. Indeed, this technology creates a world that does not need to appeal to a centralized authority to decide whether a transaction can take place. Such a perspective highlights the potential benefits of decentralization.

Moreover, Yat Siu, co-founder of Animoca Brands, took an interesting approach by considering tokens as a multidimensional investment. He noted that every individual is an investor, investing not only money but also social connections. This expanded view of investment underscores the opportunities offered by blockchain networks, where tokens represent both a financial stake and membership in a growing community.

The intervention of Yoni Assia, founder of eToro, also captivated the audience. He addressed the issue of the increasing institutionalization of the crypto sector, pointing out that crypto is becoming more and more institutionalized. His advice to investors, particularly those interested in ICOs, is to never sell everything if they believe in the long-term potential of a project. This perspective sheds light on the structural changes taking place in the crypto ecosystem, with increased institutional participation.

Finally, the announcement of the launch of 1USD, the first stablecoin on the Aleph Zero blockchain, generated a lot of excitement. As highlighted by Christian Walker of Archblock, this stablecoin offers users the stability of an asset indexed to the dollar, the privacy expected from cash, and the assurance of regulatory compliance. This innovation illustrates how blockchain can overcome the challenge of privacy while retaining the advantages of stablecoins.

The second day of Paris Blockchain Week 2024 has clearly demonstrated that blockchain innovation is flourishing. Experts shared captivating perspectives on trust, tokens, and institutional adoption. These insights offer a fascinating preview of what's to come. Furthermore, the blockchain continues to carve out a prominent place in the global financial ecosystem. Thus, this event has once again confirmed its essential role in the transformation of our economic and social systems.


The world is evolving, and adaptation is the best weapon for surviving in this shifting universe. A crypto community manager by background, I am interested in everything that touches, closely or from afar, on blockchain and its derivatives. With the aim of sharing my experience and spreading the word about a field I am passionate about, there is nothing better than writing articles that are at once informative and relaxed.

DISCLAIMER

The views, thoughts, and opinions expressed in this article belong solely to the author, and should not be taken as investment advice. Do your own research before taking any investment decisions.

More here:

Paris Blockchain Week 2024: blockchain expansion and innovation - Cointribune EN

Read More..

Advanced Decentralized Exchange dYdX Partners With Privy to Onboard Users to Web3 – BeInCrypto

Editorial Note: The following content does not reflect the views or opinions of BeInCrypto. It is provided for informational purposes only and should not be interpreted as financial advice. Please conduct your own research before making any investment decisions.

One of the leading DeFi protocols for advanced crypto trading, dYdX, took to Twitter, now X, to announce their new collaboration with Privy. dYdX, a decentralized exchange focusing mainly on perpetual futures trading, entered the partnership to enable its users to create new accounts using their already existing social media credentials or email.

dYdX announced on 9 April 2024 that users will now have a more straightforward way to create accounts. James Hallam, Head of Business Development at the dYdX Foundation, spoke about the partnership and said the collaboration would make dYdX faster and more intuitive.

"Thanks to Privy, transitioning to dYdX Chain is now as easy as using your existing email or social media profiles, ensuring a faster, simpler, and safer experience for users. This streamlined process removes traditional barriers to entry, allowing newcomers to seamlessly enter the world of perpetuals," he said.

Furthermore, he added that one of the primary reasons for the collaboration is to attract a larger audience due to the ease of onboarding Web3 users.

"dYdX Chain now feels instantly more accessible to everyday users. This increased accessibility has the potential to attract a wider audience, promoting a more inclusive and rapidly expanding dYdX Chain user base," he added.

Indeed, Privy quoted the dYdX announcement on X, acknowledging the importance of the key partnership. First, they said they were excited to work closely with dYdX and look forward to accomplishing new things together. Moreover, the major objective is to ensure that dYdX maintains the transparency and security of a DEX while integrating the speed and usability of a CEX.

"We're so proud to support them as they work to make the experience accessible for all," Privy said.

Privy is a company dedicated to helping crypto organizations onboard and engage new users seamlessly. It offers developer libraries that help provide wallet management solutions and an efficient onboarding process. With the latest collaboration, dYdX users can now create a web3 wallet by simply logging in to their social media accounts, eliminating the need for a seed phrase when creating a crypto wallet.

dYdX is a decentralized exchange built on the dYdX blockchain, an open-source application-specific blockchain software. Furthermore, the creation of the dYdX blockchain is based on the Cosmos SDK and CometBFT proof-of-stake (PoS) consensus protocol.

This DEX's primary service is trading perpetual futures contracts for more than 62 cryptocurrencies, including Bitcoin, Ethereum, Cardano, XRP, and Solana. It was founded in August 2017 by Antonio Juliano, a California-based entrepreneur, and initially offered crypto margin trading, lending, and borrowing services. However, it transitioned to cross-margin perpetual trading in August 2021.

The dYdX DEX and Chain were created in a way that introduces interested users to web3. Alongside decentralization, they feature matching engines, order books, and consensus mechanisms. Recent data from DefiLlama indicates a total value locked (TVL) of $505 million. Furthermore, the total value of fees generated in the past year is about $50 million.

Disclaimer

This article contains a press release provided by an external source and may not necessarily reflect the views or opinions of BeInCrypto. In compliance with the Trust Project guidelines, BeInCrypto remains committed to transparent and unbiased reporting. Readers are advised to verify information independently and consult with a professional before making decisions based on this press release content. Please note that our Terms and Conditions, Privacy Policy, and Disclaimers have been updated.

Read the original post:

Advanced Decentralized Exchange dYdX Partners With Privy to Onboard Users to Web3 - BeInCrypto

Read More..

Breaking the Limits: Overcoming Heisenberg’s Uncertainty in Quantum Measurements – SciTechDaily

An artistic illustration shows how microscopic bolometers (depicted on the right) can be used to sense very weak radiation emitted from qubits (depicted on the left). Credit: Aleksandr Käkinen/Aalto University

Aalto University researchers are the first in the world to measure qubits with ultrasensitive thermal detectors, thus evading the Heisenberg uncertainty principle.

Chasing ever-higher qubit counts in near-term quantum computers constantly demands new feats of engineering.

Among the troublesome hurdles of this scaling-up race is refining how qubits are measured. Devices called parametric amplifiers are traditionally used to do these measurements. But as the name suggests, the device amplifies weak signals picked up from the qubits to conduct the readout, which causes unwanted noise and can lead to decoherence of the qubits if not protected by additional large components. More importantly, the bulky size of the amplification chain becomes technically challenging to work around as qubit counts increase in size-limited refrigerators.

Cue the Aalto University research group Quantum Computing and Devices (QCD). They have a hefty track record of showing how thermal bolometers can be used as ultrasensitive detectors, and they just demonstrated in an April 10 Nature Electronics paper that bolometer measurements can be accurate enough for single-shot qubit readout.

To the chagrin of many physicists, the Heisenberg uncertainty principle dictates that one cannot simultaneously know a signal's position and momentum, or voltage and current, with arbitrary accuracy. So it goes with qubit measurements conducted with parametric voltage-current amplifiers. But bolometric energy sensing is a fundamentally different kind of measurement, serving as a means of evading Heisenberg's infamous rule. Since a bolometer measures power, or photon number, it is not bound to add quantum noise stemming from the Heisenberg uncertainty principle in the way that parametric amplifiers are.
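A standard textbook way to quantify this trade-off (this framing is an addition, not taken from the paper, and the numerical factors depend on conventions): the two quadratures of a microwave signal form a conjugate pair, so any phase-preserving amplifier that tracks both must add at least half a photon of noise, while a detector that records only photon number faces no such floor.

```latex
% Quadrature uncertainty relation and the resulting amplifier added-noise bound
% (the Caves limit); a bolometer measures only the photon number \hat{n} (power).
\[
  \Delta X_1 \,\Delta X_2 \;\ge\; \tfrac{1}{4}
  \qquad \Rightarrow \qquad
  N_{\mathrm{added}} \;\ge\; \tfrac{1}{2}\ \text{photon for phase-preserving amplification.}
\]
```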

Unlike amplifiers, bolometers very subtly sense microwave photons emitted from the qubit via a minimally invasive detection interface. This form factor is roughly 100 times smaller than its amplifier counterpart, making it extremely attractive as a measurement device.

"When thinking of a quantum-supreme future, it is easy to imagine high qubit counts in the thousands or even millions could be commonplace. A careful evaluation of the footprint of each component is absolutely necessary for this massive scale-up. We have shown in the Nature Electronics paper that our nanobolometers could seriously be considered as an alternative to conventional amplifiers," says Aalto University Professor Mikko Möttönen, who heads the QCD research group.

"In our very first experiments, we found these bolometers accurate enough for single-shot readout, free of added quantum noise, and they consume 10,000 times less power than the typical amplifiers, all in a tiny bolometer, the temperature-sensitive part of which can fit inside of a single bacterium," Möttönen explains.

Single-shot fidelity is an important metric physicists use to determine how accurately a device can detect a qubit's state in just one measurement as opposed to an average of multiple measurements. In the case of the QCD group's experiments, they were able to obtain a single-shot fidelity of 61.8% with a readout duration of roughly 14 microseconds. When correcting for the qubit's energy relaxation time, the fidelity jumps up to 92.7%.
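As a rough illustration of how such a figure is computed, here is a minimal sketch with made-up counts, using the standard assignment-fidelity definition rather than the paper's actual analysis:

```python
# Prepare |0> or |1> many times, read out once per preparation, and average
# the two correct-assignment rates. All counts below are hypothetical.
shots = 10_000
read_0_given_prepared_0 = 9_350   # hypothetical count
read_1_given_prepared_1 = 8_870   # hypothetical count

p00 = read_0_given_prepared_0 / shots
p11 = read_1_given_prepared_1 / shots
fidelity = (p00 + p11) / 2
print(f"single-shot readout fidelity ~ {fidelity:.1%}")   # ~91.1% for these numbers

# Correcting for the qubit relaxing (decaying) during the readout window is a
# separate step, which is why the paper quotes both a raw and a corrected value.
```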

"With minor modifications, we could expect to see bolometers approaching the desired 99.9% single-shot fidelity in 200 nanoseconds. For example, we can swap the bolometer material from metal to graphene, which has a lower heat capacity and can detect very small changes in its energy quickly. And by removing other unnecessary components between the bolometer and the chip itself, we can not only make even greater improvements on the readout fidelity, but we can achieve a smaller and simpler measurement device that makes scaling-up to higher qubit counts more feasible," says András Gunyhó, the first author on the paper and a doctoral researcher in the QCD group.

Prior to demonstrating the high single-shot readout fidelity of bolometers in their most recent paper, the QCD research group first showed that bolometers can be used for ultrasensitive, real-time microwave measurements in 2019. They then published in 2020 a paper in Nature showing how bolometers made of graphene can shorten readout times to well below a microsecond.

Reference: "Single-shot readout of a superconducting qubit using a thermal detector" by András M. Gunyhó, Suman Kundu, Jian Ma, Wei Liu, Sakari Niemelä, Giacomo Catto, Vasilii Vadimov, Visa Vesterinen, Priyank Singh, Qiming Chen and Mikko Möttönen, 10 April 2024, Nature Electronics. DOI: 10.1038/s41928-024-01147-7

The work was carried out in the Research Council of Finland Centre of Excellence for Quantum Technology (QTF) using OtaNano research infrastructure in collaboration with VTT Technical Research Centre of Finland and IQM Quantum Computers. It was primarily funded by the European Research Council Advanced Grant ConceptQ and the Future Makers Program of the Jane and Aatos Erkko Foundation and the Technology Industries of Finland Centennial Foundation.

View original post here:
Breaking the Limits: Overcoming Heisenberg's Uncertainty in Quantum Measurements - SciTechDaily

Read More..

Top Academics: Here’s How We Facilitate the Next Big Leap in Quantum Computing – PCMag AU

Table of Contents: From Quantum Physics to Quantum Computing | Grand Challenges and Error Correction | The Road to Quantum Advantage | Education and Workforce Development | The Quantum Bottom Line

In advance of the ribbon-cutting for its new IBM Quantum System One quantum computer, the first on a college campus, Rensselaer Polytechnic Institute (RPI) last week hosted a quantum computing day that featured several prominent speakers who together provided a snapshot of where the field is now. I've been writing about quantum computing for a long time, and have noted some big improvements, but there are also a host of challenges that still need to be overcome.

Here are some highlights.

The first plenary speaker was Jay M. Gambetta, Vice President of Quantum Computing at IBM, who gave an overview of the history and progress of quantum computing, as well as the challenges and opportunities ahead. He explained that quantum computing is based on exploiting the quantum mechanical properties of qubits, such as superposition and entanglement, to perform computations that are impossible or intractable for classical computers. He talked about watching the development of superconducting qubits, as they moved from single qubit systems in 2007, to 3-qubit systems in 2011, and now with IBM's Eagle chip, which has 127 qubits and is the heart of the Quantum System One.

He then asked how we could make quantum computing useful. His answer: We need to keep building larger and larger systems and we need to improve error correction.

"There are very strong reasons to believe there are problems that are going to be easy for a quantum computer but hard for a classical computer, and this is why we're all excited," Gambetta said. He discussed the development of quantum circuits and that while the number of qubits was important, equally important was the "depth," detailing how many operations you can do and the accuracy of the results. Key to solving this are larger and larger systems, and also error mitigation, a topic that would be discussed in much greater detail later in the day.

To get to "quantum utility"which he said would be reached when a quantum computer is better than a brute force simulation of a quantum computer on a classical machineyou would need larger systems with at least 1000 gates, along with improved accuracy and depth, and new efficient algorithms.

He talked about quantum algorithmic discovery, which means finding new and efficient ways to map problems to quantum circuits: for instance, a new variation on Shor's algorithm, which allows for factorization far faster than would be possible on a classical computer. "The future of running error-mitigated circuits and mixing classical and quantum circuits sets us up to explore this space," he said.

In a panel discussion that followed, James Misewich from Brookhaven National Laboratory discussed his interest in using quantum computing to understand quantum chromodynamics (QCD), the theory of strong interactions between quarks and gluons. QCD is a hard problem that scales well with the number and depth of qubits, and he is looking at entanglement between jets coming out of particle collisions as a possible avenue to explore quantum advantage.

Jian Shi and Ravishankar Sundararaman from RPI's Materials Science and Engineering faculty talked about computational materials science, and applying quantum computing to discover new materials and properties. Shi noted there was a huge community now doing quantum chemistry, but there is a gap between that and quantum computing. He stressed that a partnership between the two groups will be important, so each learns the language of the other and can approach the problems from a different perspective.

One of the most interesting talks was given by Steve M. Girvin, Eugene Higgins Professor of Physics at Yale University, who discussed the challenges of creating an error-corrected quantum computer.

Girvin described how the first quantum revolution was the development of things like the transistor, the laser, and the atomic clock, while the second quantum revolution is based on a new understanding of how quantum mechanics works. He usually tells his students that they do the things that Einstein said were impossible just to make sure that we have a quantum computer and not a classical computer.

He thought there was a bit too much hype around quantum computing today: quantum is going to be revolutionary and do absolutely amazing things, but it's not its time yet. We still have massive problems to solve.

He noted that quantum sensors are extremely sensitive, which is great for making sensors, but bad for building computers, because they are very sensitive to external perturbations and noise. Therefore, error correction is important.

Among the issues Girvin discussed was making measurements to detect errors, but he said we also need calculations to decide whether something truly is an error, where it is located, and what kind of error it is. Then there is the issue of deciding what signals to send to correct those errors. Beyond that, there is the issue of putting these together in a system to reduce overall errors, perhaps borrowing from the flow-control techniques used in things like telephony.

In addition to quantum error detection, Girvin said there are "grand challenges all up and down the stack," from materials to measurement to machine models and algorithms. We need to know how to make each layer of the stack more efficient, using less energy and fewer qubits, and get to higher performance so people can use these to solve science problems or economically interesting problems.

Then there are the algorithms. Girvin noted that there were algorithms long before there were computers, but it took time to decide on the best ones for classical computing. For quantum computing, this is just the beginning, and over time, we need people to figure out how to build up their algorithms and how to do heuristics. They need to discover why quantum computers are so hard to program and to develop clever tools to solve these problems.

Another challenge he described was routing quantum information. He noted that having two quantum computers that can communicate classically is exponentially less good than having two quantum computers that can communicate with quantum information, entangling with each other.

He talked about fault tolerance, which is the ability to correct errors even when your error correction circuit makes errors. He believes the fact that it's possible to do that in a quantum system, at least in principle, is even more amazing than the fact that if you had a perfect quantum computer, you could do interesting quantum calculations.

Girvin described the difficulty in correcting errors, saying you have an unknown quantum state, and you're not allowed to know what it is, because it's from the middle of a quantum computation. (If you know what it is, you've destroyed the superposition, and if you measure it to see if there's an error, it will randomly change, due to state collapse.) Your job is that if it develops an error, please fix it.

"That's pretty hard, but miraculously it can be done in principle, and it's even been done in practice," he said. We're just entering the era of being able to do it. The basic idea is to build in redundancy, such as building a logical qubit that consists of multiple physical qubits, perhaps nine. Then you have two possible giant entangled states corresponding to a logical Zero and a logical One. Note the one and zero aren't living in any single physical qubit, both are only the superposition of multiple ones.

In that case, Girvin says, if the environment reaches in and measures one of those qubits, it doesn't actually learn the encoded state. There's an error, but the state itself is not revealed, so there's still a chance that you haven't totally collapsed anything and lost the information.

He then discussed measuring the probability of errors and then seeing whether it exceeds some threshold value, with some complex math. Then correcting the errors, hopefully quickly, something that should improve with new error correction methods and better, more precise physical qubits.
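A minimal sketch of the redundancy-plus-syndrome idea Girvin described, using the three-qubit repetition code (a simpler cousin of the nine-qubit code he mentioned); this is an illustration in Qiskit, not code from the talk:

```python
from qiskit import QuantumCircuit

qc = QuantumCircuit(5, 2)    # qubits 0-2 hold the data, qubits 3-4 are ancillas

# Encode: a state a|0> + b|1> on qubit 0 becomes a|000> + b|111> on qubits 0-2.
qc.cx(0, 1)
qc.cx(0, 2)

qc.x(1)                      # deliberately inject a bit-flip error on qubit 1

# Syndrome extraction: ancilla 3 records the parity of qubits 0 and 1,
# ancilla 4 the parity of qubits 1 and 2. The parities reveal *where* the error
# is without revealing (or collapsing) the encoded amplitudes a and b.
qc.cx(0, 3)
qc.cx(1, 3)
qc.cx(1, 4)
qc.cx(2, 4)
qc.measure(3, 0)
qc.measure(4, 1)
# A syndrome of "11" points to qubit 1; a classical decoder would then apply an
# X gate there to undo the error.
```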

All this is still theoretical. That's why fault tolerance is a journey with improvements being made continuously. (This was in opposition to Gambetta, who said systems are either fault tolerant or they aren't). Overall, Girvin said, "We still have a long way to go, but we're moving in the right direction."

Later in the morning, Austin Minnich, Professor of Mechanical Engineering and Applied Physics at Caltech, described "mid-circuit measurement" and the need for hybrid circuits as a way of finding, and thus mitigating, errors.

In a discussion that followed, Kerstin Kleese van Dam, Director of the Computational Science Initiative at Brookhaven National Laboratory, explained that her team was looking for answers to problems, whether solved on traditional or quantum machines. She said there were problems they can't solve accurately on a traditional computer, but there remains the question of whether the accuracy will matter. There are areas, such as machine learning, where quantum computers can do things accurately. She predicts that quantum advantage will come when we have systems that are large enough. But she also wondered about energy consumption, noting that a lot of power is going into today's AI models, and asked whether quantum can be more efficient.

Shekhar Garde, Dean of the School of Engineering, RPI, who moderated this part of the discussion, compared the status of quantum computing today to where traditional computing was in the late 70s or early 80s. He asked what the next 10 years would bring.

Kleese van Dam said that within 10 years, we would see hybrid systems that combine quantum and classical computing, but also hoped we would see libraries that are transferred from high-performance computing to quantum systems, so a programmer could use them without having to understand the way the gates work. Aparna Gupta, Professor and Associate Dean of RPI's Lally School of Management, would bet on the hybrid approach offering easier access and cost-effectiveness, as well as "taking away the intrigue and the spooky aspects of quantum, so it is becoming real for all of us."

Antonio Corcoles, Principal Research Scientist, IBM Quantum, said he hoped users who don't know quantum will be able to use the system because the complexity will become more transparent, but that can take a long time. In between, they can develop quantum error correction in a way that is not as disruptive as current methods. Minnich talked about "blind quantum computing" where many smaller machines might be linked together.

One of the most interesting talks came from Lin Lin, Professor of Mathematics at the University of California, Berkeley, who discussed the theoretical aspects and challenges of achieving quantum advantage for scientific computation. He defined quantum advantage as the ability to solve problems that are quantumly easy but classically hard, and proposed a hierarchy of four levels of problems.

Lin said that for the first two levels, a lot of people think quantum advantage will be achieved, as the methods are generally understood. But on the next two levels, there needs to be a lot of work on the algorithms to see if it will work. That's why this is an exciting time for mathematicians as well as physicists, chemists, and computer scientists.

This talk was followed by a panel during which Lin said that he is interested in solving quantum many-body problems, as well as applying quantum computing to other areas of mathematics, such as numerical analysis and linear algebra.

Like Garde above, Lin compared where quantum is today to the past, going even further to say it's where classical computing was 60 or 70 years ago, where error correction was still very important. Quantum computing will need to be a very interdisciplinary field, in that it will require people to be very good at building the machines, but it will always produce errors, so it will require both mathematical and engineering ways to correct these.

Ryan Sweke from IBM Research noted that one of the things that has allowed classical computing to develop to the point it is at is the various levels of abstraction, so if you want to work on developing algorithms, you don't have to understand how the compiler works. If you want to understand how the compiler works, you don't have to understand how the hardware works.

The interesting thing in the quantum regime, as seen in error mitigation for example, is that people who come out of the top level of abstraction have to interact with people who are developing the devices. This is an exciting aspect of the time we're in.

Di Fang, Assistant Professor of Mathematics, Duke University, said now was a "golden time for people who work on proving algorithms." She talked about the varying levels of complexity, and the need to see where new algorithms can solve theoretical problems, then look at the hardware and solve practical problems.

Brian McDermott, Principal R&D Engineer at the Naval Nuclear Laboratory, said he was looking at this in reverse, seeing what the problems are and then working backward toward the quantum hardware and software. His job involved matching applications of new and emerging computing architectures to the types of engineering problems that are important to the lab's mission for new nuclear propulsion.

The panelists discussed where quantum algorithms could have the most impact. McDermott talked about things like finite elements and computational fluid dynamics, going up to materials science. As a nuclear engineer, he was first attracted to the field because of the quantum properties of the nucleus itself, moving from predicting behaviors in astrophysics, such as the synthesis of nuclei in a supernova, to engineering applications in nuclear reactors and things like fusion. Lin discussed the possibilities for studying molecular dynamics.

Olivia Lanes, Global Lead and Manager for IBM Quantum Learning and Education gave the final talk of the day, where she discussed the need for workforce development in the quantum field.

Already the US is projected to face a shortfall of nearly two million STEM workers by next year. She quoted Carl Sagan, who said "We live in a society exquisitely dependent on science and technology, in which hardly anyone knows anything about science and technology," and agreed with him that this is a recipe for disaster.

She noted that not only do very few people understand quantum computing, very few actually understand how classical computers work. She cited a McKinsey study which found that there are three open jobs in quantum for every person qualified to fill those positions. It's probably just going to get worse from here to 2026.

She focused on upskilling and said it was unrealistic to expect that we'll make everyone into experts in quantum computing. But there are a lot of other jobs that are part of the quantum ecosystem that will be required, and she urged students to focus on the areas they are particularly interested in.

In general, she recommended getting a college degree (not surprising, since she was talking at a college), considering graduate school, or finding some other way to get relevant experience in the field, and building up rare skills. "Find the one thing that you can do better than anybody else and market that thing. You can make that thing applicable to any career that you really want for the most part," she said. "Stop letting the physicists hog quantum; they've had a monopoly here for too long and that needs to change."

Similar concepts were voiced in a panel that followed. Anastasia Marchenkova, Quantum Researcher, Bleximo Corporation, said that there was lots of pop science, and lots of research, but not much in the middle. She said we need to teach people enough so they can use quantum computing, even if they aren't computer scientists.

Richard Plotka, Director of Information Technology and Web Science, RPI, said it was important to create middleware tools that can be applied to quantum so that the existing workforce can take advantage of these computers. He also said it was important to prepare students for a career in the future, with foundational knowledge, so they have the ability to adapt because quantum in five or ten years won't look like it does today.

All told, it was a fascinating day of speakers. I was intrigued by software developers explaining the challenge of writing languages, compilers, and libraries for quantum. One explained that you can't use traditional structures such as "if...then" because you won't know "if." Parts of it were beyond my understanding, and I remain skeptical about how quickly quantum will become practical and how broad the applications may be.

Still, it's an important and interesting technology that is sure to get even more attention in the coming years, as researchers meet some of the challenges. It's good to see students getting a chance to try out the technology and discover what they can do with it.

Read more here:
Top Academics: Here's How We Facilitate the Next Big Leap in Quantum Computing - PCMag AU

Read More..

Next-Generation Quantum Computing Secured by Oxford Innovation – yTech

Research scientists at Oxford University have made significant strides in securing the future of quantum computing, focusing on how users can safely leverage this profound technology through cloud services using fiber optics. Their cutting-edge approach may enable individuals to perform quantum computing tasks without compromising their datas privacy and integrity.

Summary: Oxford University has broken new ground in quantum computing by developing a way to ensure data privacy and security when using quantum computers through cloud services. This innovation represents a key milestone in addressing security concerns and making quantum computing accessible to individual users.

The flagship breakthrough, reported in the journal Physical Review Letters, involves a strategy termed blind quantum computing, which establishes a secure link between personal devices and quantum servers. The team, led by Professor David Lucas, has showcased a model that not only preserves data privacy but also enables users to verify computational accuracy, a critical contribution to the growing field of quantum computation.

Quantum computing stands at the forefront of a technological revolution, with industry titans pouring resources into this sphere in anticipation of its ability to tackle complexities unfathomable to classical supercomputers. Spearheading these advances, Oxford's research introduces an innovative photonic detection system, assuring secure data interaction across networks.

This progress charts possible new territories for telecommunications providers, as these breakthroughs necessitate robust infrastructure to accommodate quantum networks. Of pivotal importance, these advancements push the boundaries of data processing and may profoundly affect sectors reliant on computational power.

The article illustrates a burgeoning opportunity that quantum computing represents, pinning down the crucial element of security that will define the technology's practical application. This comes at a critical juncture where quantum computing's rapid expansion underscores the need for enhanced data protection measures and infrastructural expansions to realize its full potential.

Quantum Computing Industry Outlook: The quantum computing industry is poised for significant growth, with some market forecasts projecting that it will reach billions of dollars in value within the next decade. Leading corporations in tech and defense, such as IBM, Google, Microsoft, and numerous startups, are competing to achieve quantum supremacy and to bring practical quantum computing solutions to the market. This technology promises to revolutionize industries ranging from pharmaceuticals, where it could expedite drug discovery, to finance, providing powerful tools for risk analysis and optimization.

Market Forecasts: According to research firms, the global quantum computing market is expected to witness a compound annual growth rate (CAGR) that might exceed 20-30% in the upcoming years. Factors such as increased investment in quantum computing ventures, partnerships between academic institutions and the private sector, and government funding in quantum technologies are fueling this growth. As quantum computings potential applications expand and as the technology becomes more accessible, even more sectors could begin to explore its advantages.

Industry Issues: However, the quantum computing industry faces several issues, notably in the aspects of security and technology standardization. The very principles that make quantum computing powerful also pose risks, such as the potential to break current encryption standards. As a result, there is a parallel pursuit for quantum-safe cryptography. Additionally, the industry is confronted with the technical challenge of achieving and maintaining quantum coherence to realize practical and reliable quantum machines.

Infrastructure and Telecommunications: The development of quantum networks for secure communication is vital. Quantum Key Distribution (QKD) and other quantum-safe communication methodologies are likely to become industry standards. Telecommunications providers will play a pivotal role in creating the necessary infrastructure. These providers need to be ready to make considerable investments to future-proof their networks against the forthcoming surge in quantum data traffic.

Research such as the work conducted by the scientists at Oxford University is essential to address these issues. Their innovation in blind quantum computing provides a new layer of security in cloud-accessible quantum systems, possibly paving the way for consumer-level quantum computing services. As this field develops, telecommunications companies and service providers would likely seek to integrate these advancements into their offerings to gain a competitive edge.

For additional information about quantum computing and the industry's developments, check out the following related resources:

IBM Quantum | Google Quantum AI | Microsoft Quantum

These links represent some of the industry leaders who are actively investing in and developing quantum computing technologies that are shaping the future of the sector.

Michał Rogucki is a pioneering figure in the field of renewable energy, particularly known for his work on solar power innovations. His research and development efforts have significantly advanced solar panel efficiency and sustainability. Rogucki's commitment to green energy solutions is also evident in his advocacy for integrating renewable sources into national power grids. His groundbreaking work not only contributes to the scientific community but also plays a crucial role in promoting environmental sustainability and energy independence. Rogucki's influence extends beyond academia, impacting industry practices and public policy regarding renewable energy.

Read more from the original source:
Next-Generation Quantum Computing Secured by Oxford Innovation - yTech

Read More..

Advancement in Quantum Computing Security through Oxford University’s Breakthrough – yTech

Researchers at Oxford University have developed a new system that could transform the security of quantum computing when performed via cloud services. By employing a technique called blind quantum computing, the team has successfully enabled computations to be done on remote servers without compromising the privacy of the data involved. This method, which utilizes fiber optics for secure communication, is poised to usher in an era where quantum computations can be safely conducted from any distance.

In an age where digital privacy is under scrutiny, the advancement by Oxford scientists represents a notable step forward. The protocol not only protects user data but also verifies the accuracy of the quantum computations being performed remotely. This means that individuals and organizations could utilize quantum computing's exceptional capabilities without risking the exposure of sensitive information.

Quantum computing presents a revolutionary opportunity across various sectors, from cryptography and pharmaceuticals to artificial intelligence and finance. By leveraging quantum mechanics, it promises to execute calculations far faster than any traditional computing system.

However, as the technology nears a more practical application, issues such as data security and the need for quantum-resistant encryption are becoming more pressing. The pioneering work by the Oxford University team is a response to such concerns and is crucial for the industry to overcome security challenges associated with quantum networks.

With government investments and academic-industry collaborations fueling its growth, quantum computing is projected to experience a significant market expansion, maintaining a CAGR of over 20% in the ensuing years. Nonetheless, the sector continues to contend with obstacles like error correction, hardware stability, scaling quantum systems, and establishing universal standards. The community anticipates that with the maturation of technologies, regulations will guide the ethical and secure deployment of quantum computing.

For those interested in this cutting-edge field, resources such as IBM's research division and market analytics from Bloomberg offer comprehensive insights. Scholarly contributions, for instance in Nature journals, provide a wealth of information on the dynamic progress and implications of quantum advancements. As investments and innovations persist, quantum computing will likely redefine digital computation's horizon and fortify data security in unprecedented ways.

Quantum computing is a topic enveloped in both excitement and complexity, as researchers and industry specialists explore its vast potential to transform computing as we know it. With its ability to perform complex calculations at speeds unmatchable by classical computers, this technology holds the key to monumental strides in myriad industries.

The implications of quantum computing ripple across numerous fields, standing to reimagine the landscape of data encryption, drug discovery, financial modeling, and weather forecasting, among others. The technology operates using the principles of quantum mechanics, such as superposition and entanglement, to process information in ways that traditional computers cannot.

However, this leap in computational capabilities brings to the forefront various challenges, including those of data privacy and cybersecurity. The advancements at Oxford University mark a significant response to these hurdles, offering enhanced protection for data during quantum computations, particularly in scenarios where such operations must be performed remotely.

As the quantum computing industry readies itself for wider adoption, the market forecasts are highly optimistic. Estimates suggest that the quantum computing market could reach billions of dollars in the coming decade, with both public and private entities increasing their stakes in this high-potential space.

Yet, this forecasted growth does not obscure the ongoing challenges facing the industry. The precision demanded by quantum computing means that developers need to solve problems related to quantum error correction, coherence times, and the development of fault-tolerant systems. Furthermore, scaling up quantum computers to be commercially viable is an intricate process that's still under intensive research.

The field of quantum computing is underpinned by significant investments from government bodies and the private sector. Companies like IBM are spearheading research, and market analytics available from entities like Bloomberg can offer well-defined projections and analyses.

Likewise, academic publishing in well-respected journals such as those found within the Nature portfolio continues to supply vital findings and discussions on quantum developments. These resources, together with the breakthroughs from institutions like Oxford University, indicate a future wherein quantum computing is not only viable but also secureholding vast potential to reshape industries and protect data in ways previously unimagined.

Leokadia Gogulska is an emerging figure in the field of environmental technology, known for her groundbreaking work in developing sustainable urban infrastructure solutions. Her research focuses on integrating green technologies in urban planning, aiming to reduce environmental impact while enhancing livability in cities. Gogulska's innovative approaches to renewable energy usage, waste management, and eco-friendly transportation systems have garnered attention for their practicality and effectiveness. Her contributions are increasingly influential in shaping policies and practices towards more sustainable and resilient urban environments.

Continue reading here:
Advancement in Quantum Computing Security through Oxford University's Breakthrough - yTech

Read More..

Japan Embarks on Quantum Computing Leap with NVIDIA-powered ABCI-Q Supercomputer – yTech

Summary: Japan aims to bolster its quantum computing prowess with the deployment of a new world-class supercomputer, ABCI-Q. To be situated at the ABCI supercomputing center, it will utilize NVIDIA's cutting-edge platforms specialized in accelerated and quantum computing to conduct high-fidelity quantum simulations aiding various industry sectors.

The realm of quantum computing is set to receive a substantial boost in Japan as the country's latest supercomputer, named ABCI-Q, draws upon NVIDIA technology to lay the groundwork for significant research advancements. This strategic move is poised to escalate Japanese capabilities in computational science and underlines the nation's commitment to its quantum computing initiative.

Powered by an array of over 2,000 NVIDIA H100 Tensor Core GPUs, the ABCI-Q machine will sport more than 500 nodes, all seamlessly connected through the NVIDIA Quantum-2 InfiniBand networking platform. NVIDIA CUDA-Q, an innovative open-source hybrid framework, will be a pivotal aspect of the system, enabling the programming of sophisticated quantum-classical systems and serving as a robust simulation toolkit.
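As an illustration of the kind of hybrid quantum-classical programming CUDA-Q enables, here is a minimal sketch based on the publicly documented cudaq Python package; the exact environment available on ABCI-Q is an assumption.

```python
import cudaq

@cudaq.kernel
def bell():
    q = cudaq.qvector(2)    # allocate two qubits
    h(q[0])                 # put qubit 0 into superposition
    x.ctrl(q[0], q[1])      # entangle qubit 1 with qubit 0
    mz(q)                   # measure both qubits

# Classical host code samples the kernel on a (GPU-accelerated) simulator.
counts = cudaq.sample(bell, shots_count=1000)
print(counts)               # expect roughly half "00" and half "11"
```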

The strategic collaboration between NVIDIA and the Global Research and Development center for Business by Quantum-AI Technology (G-QuAT) at the National Institute of Advanced Industrial Science and Technology (AIST) facilitated the creation of the ABCI-Q. This supercomputer is not just an investment in hardware; it also represents a long-term vision for Japan's scientific community, supporting quantum machine learning, quantum circuit simulation, and the pioneering development of algorithms inspired by quantum principles.

Expected to launch in the coming year, ABCI-Q will accelerate Japan's quantum technology in line with a national strategy aiming to derive societal and economic benefits from quantum advancements. Furthermore, this initiative is set to propel industrial applications in conjunction with G-QuAT/AIST and NVIDIA, laying the foundation for future hybrid quantum-classical systems and breakthroughs in a variety of fields, including AI, energy, and biosciences.

Japan's Quantum Computing Ambitions with ABCI-Q

Japan's commitment to advancing its position in quantum computing is epitomized by the ABCI-Q supercomputer. This latest quantum initiative aligns with the broader industry's trajectory towards creating computers that leverage the principles of quantum mechanics to process information at speeds unachievable by classical computers.

Quantum Computing Industry and Market Forecasts

Quantum computing is not only a scientific endeavor but also a burgeoning industry with significant commercial potential. Global market forecasts suggest that quantum computing could evolve into a multi-billion-dollar industry over the next decade. According to market experts, this growth is likely to be driven by increasing investments in quantum technologies by nations and private entities, in sectors ranging from cryptography and optimization to drug discovery and material science.

The market expectations indicate that while we may still be in the early stages of quantum computing commercialization, the acquisition of quantum capabilities is set to be a strategic focus for countries and corporations aiming to secure a competitive edge in the high-tech landscape.

Industry and Research Applications

The ABCI-Q supercomputer's powerful NVIDIA H100 Tensor Core GPUs and Quantum-2 InfiniBand networking capabilities place it at the forefront of quantum simulation. Such simulations are crucial for advancing research into complex molecular interactions, material properties, and energy-efficient solutions, thereby aiding industries like pharmaceuticals, automotive, and clean energy.

Furthermore, the ability to perform quantum machine learning and circuit simulations is expected to open new avenues for artificial intelligence, allowing for the exploration of quantum-inspired algorithms that could dramatically enhance AI's problem-solving abilities.

Issues and Challenges

Despite the promise of quantum computing, the industry faces significant challenges. The technology is still in development, and practical, scalable quantum computers are yet to become mainstream. Issues such as error correction, qubit coherence, and the integration of quantum processors with classical systems need to be resolved before the full potential of quantum computing can be realized.

Additionally, there is a looming need for quantum-skilled workforce development. Education and training programs will be crucial to prepare the next generation of scientists and engineers, equipping them with the skills necessary to foster quantum innovation.

As quantum technology advances, there are implications for cybersecurity, as traditional encryption methods may become vulnerable to quantum attacks. This has led to a growing field of quantum cryptography, which aims to develop security protocols resistant to quantum computing threats.

Conclusion

Japan's investment in ABCI-Q exemplifies a strategic approach to harnessing quantum computing's potential. The collaboration with NVIDIA and the focus on building quantum-classical hybrid systems indicate that Japan is positioning itself as a global leader in the quantum race.

For further information on the state of the global quantum computing industry and market trends, respected sources can be consulted, such as IBM, a pioneer in the field of quantum computing, and NVIDIA, a leader in accelerated computing platforms. These resources offer insights into the latest developments, research, and forecasts for quantum computings evolution, shaping the future across diverse sectors.

Roman Perkowski is a distinguished name in the field of space exploration technology, specifically known for his work on propulsion systems for interplanetary travel. His innovative research and designs have been crucial in advancing the efficiency and reliability of spacecraft engines. Perkowski's contributions are particularly significant in the development of sustainable and powerful propulsion methods, which are vital for long-duration space missions. His work not only pushes the boundaries of current space travel capabilities but also inspires future generations of scientists and engineers in the quest to explore the far reaches of our solar system and beyond.

Excerpt from:
Japan Embarks on Quantum Computing Leap with NVIDIA-powered ABCI-Q Supercomputer - yTech

Read More..

Breakthrough promises secure quantum computing at home – University of Oxford

The full power of next-generation quantum computing could soon be harnessed by millions of individuals and companies, thanks to a breakthrough by scientists at Oxford University Physics guaranteeing security and privacy. This advance promises to unlock the transformative potential of cloud-based quantum computing and is detailed in a new study published in the influential U.S. scientific journal Physical Review Letters.

Never in history have the issues surrounding privacy of data and code been more urgently debated than in the present era of cloud computing and artificial intelligence. As quantum computers become more capable, people will seek to use them with complete security and privacy over networks, and our new results mark a step change in capability in this respect.

Quantum computing is developing rapidly, paving the way for new applications which could transform services in many areas like healthcare and financial services. It works in a fundamentally different way to conventional computing and is potentially far more powerful. However, it currently requires controlled conditions to remain stable and there are concerns around data authenticity and the effectiveness of current security and encryption systems.

Several leading providers of cloud-based services, like Google, Amazon, and IBM, already separately offer some elements of quantum computing. Safeguarding the privacy and security of customer data is a vital precursor to scaling up and expanding its use, and for the development of new applications as the technology advances. The new study by researchers at Oxford University Physics addresses these challenges.

"We have shown for the first time that quantum computing in the cloud can be accessed in a scalable, practical way which will also give people complete security and privacy of data, plus the ability to verify its authenticity," said Professor David Lucas, who co-heads the Oxford University Physics research team and is lead scientist at the UK Quantum Computing and Simulation Hub, led from Oxford University Physics.

In the new study, the researchers use an approach dubbed blind quantum computing, which connects two totally separate quantum computing entities, potentially an individual at home or in an office accessing a cloud server, in a completely secure way. Importantly, their new methods could be scaled up to large quantum computations.

"Using blind quantum computing, clients can access remote quantum computers to process confidential data with secret algorithms and even verify the results are correct, without revealing any useful information. Realising this concept is a big step forward in both quantum computing and keeping our information safe online," said study lead Dr Peter Drmota, of Oxford University Physics.

The results could ultimately lead to commercial development of devices to plug into laptops, to safeguard data when people are using quantum cloud computing services.

Researchers exploring quantum computing and technologies at Oxford University Physics have access to the state-of-the-art Beecroft laboratory facility, specially constructed to create stable and secure conditions including eliminating vibration.

Funding for the research came from the UK Quantum Computing and Simulation (QCS) Hub, with scientists from the UK National Quantum Computing Centre, the Paris-Sorbonne University, the University of Edinburgh, and the University of Maryland, collaborating on the work.

The study "Verifiable blind quantum computing with trapped ions and single photons" has been published in Physical Review Letters.

See the rest here:
Breakthrough promises secure quantum computing at home - University of Oxford

Read More..

Oxford University’s Research Paves the Way for Secure, Cloud-Based Quantum Computing – yTech

Summary: A breakthrough in quantum computing research at Oxford University could revolutionize how individual users utilize quantum computing through cloud services over fiber optic connections. The study addresses the pivotal concern of maintaining data privacy and security in such a sensitive computational environment.

In an era where supercomputers and personal devices have dominated computational tasks, the potential for quantum computing to surpass these capacities is within reach. Notably, tech giants like Google, Amazon, and IBM are already incorporating aspects of quantum technology in their operations. The inherent challenge that arises with quantum computing is the delicate nature of quantum interactions, where minor disturbances may lead to the collapse of the quantum state, a hurdle that must be overcome to fully harness its capabilities.

Oxford University's Physics Department has made a significant advance in this field, targeting the crucial aspect of secure quantum computing. As the findings published in Physical Review Letters suggest, the future might see the introduction of devices that connect to personal laptops, safeguarding data during the use of quantum cloud computing services.

The research introduces a novel concept of blind quantum computing. This involves a secured connection between a quantum computing server and an independent client device via a fiber network, ensuring complete data privacy and the ability to verify the accuracy and integrity of the information. The system employs an apparatus capable of detecting photons, which plays a key role in achieving the desired security during computations that depend on real-time corrections.

Under the guidance of Professor David Lucas, the Oxford team successfully demonstrated the practical application of cloud-accessible quantum computing, holding promises of full data security, privacy, and authenticity verification. This advance also opens up potential opportunities for telecom providers to support the infrastructure required for quantum networks.

Quantum Computing Industry Overview

The quantum computing field is experiencing rapid growth, fueled by its potential to solve complex problems far beyond the capacity of classical computers. Leading tech companies such as Google, Amazon, and IBM are investing heavily in quantum technology, aiming to harness it for various applications including cryptography, drug discovery, financial modeling, and climate research.

Market Forecasts

Financially, the quantum computing industry is projected to expand substantially in the coming years. Market research forecasts that the global quantum computing market, which includes hardware, software, and services, could surpass tens of billions of dollars by the end of this decade. This growth is anticipated as advancements in the field unlock new commercial uses and as industry and government investments continue to pour into research and development.

Industry Issues

However, the industry faces several challenges, with data security and privacy ranking as critical concerns. Quantum computing's ability to potentially break current encryption standards poses substantial risks to data security. The Oxford University breakthrough, which addresses security concerns, is a crucial step towards countering such threats.

The sensitivity of quantum states to external disturbances is another significant issue, often referred to as quantum decoherence. Achieving long-term stability of quantum information requires technological innovations to isolate qubits (quantum bits) from environmental interference, a feat that researchers across the globe are striving to accomplish.

Additionally, building a scalable quantum computing infrastructure involves creating new standards, protocols, and devices that can operate in extreme physical conditions, usually at near absolute zero temperatures. The real-world application of quantum computing also necessitates a skilled workforce proficient in quantum mechanics and its computational applications, signaling a need for focused education and training programs.

Potential Opportunities and Advancements

With the progress at Oxford University, the concept of blind quantum computing provides a promising avenue for secure quantum data processing. The potential expansion of cloud-based quantum computing services may necessitate updates to telecommunications infrastructure, presenting opportunities for telecom companies to facilitate quantum networks.

The development reported by Oxford's team not only captures the essence of these technological breakthroughs but also highlights the potential for wide-ranging impacts across sectors that depend on computing power. As research like this develops, companies and governments must collaborate to ensure responsible stewardship of quantum computing technology and address societal and ethical implications of its deployment.

The quantum computing revolution offers an array of possibilities across industries. By overcoming the intrinsic challenges related to security, coherence, and infrastructure, the industry stands poised to redefine our capabilities in data processing and technological innovation.

Roman Perkowski is a distinguished name in the field of space exploration technology, specifically known for his work on propulsion systems for interplanetary travel. His innovative research and designs have been crucial in advancing the efficiency and reliability of spacecraft engines. Perkowski's contributions are particularly significant in the development of sustainable and powerful propulsion methods, which are vital for long-duration space missions. His work not only pushes the boundaries of current space travel capabilities but also inspires future generations of scientists and engineers in the quest to explore the far reaches of our solar system and beyond.

The rest is here:
Oxford University's Research Paves the Way for Secure, Cloud-Based Quantum Computing - yTech

Read More..