Category Archives: Quantum Computing
Exploring new frontiers with Fujitsu’s quantum computing research and development – Fujitsu
Fujitsu and RIKEN have already successfully developed a 64-qubit superconducting quantum computer at the RIKEN-RQC-Fujitsu Collaboration Center, which was jointly established by the two organizations (*1). Our interviewee, researcher Shingo Tokunaga, is currently participating in a joint research project with RIKEN. He majored in electronic engineering at university and worked on microwave-related research topics. After joining Fujitsu, he worked in a variety of software fields, including network firmware development as well as platform development for communication robots. Currently, he is applying his past experience in the Quantum Hardware Team at the Quantum Laboratory to embark on new challenges.
In what fields do you think quantum computing can be applied?
Shingo: Quantum computing has many potential applications, such as finance and healthcare, but especially in the quantum chemistry calculations used in drug development. If we can use it for these calculations, we can realize efficient, high-precision simulations in a short period of time. Complex calculations that traditionally take a long time to solve on conventional computers are expected to be solved quickly by quantum computers. One such example is finding solutions for combinatorial optimization problems such as molecular structure patterns. The spread of the novel coronavirus has made the development of vaccines and therapeutics urgent, and in such situations where rapid responses are needed, I believe the time will come when quantum computers can be utilized.
Fujitsu is collaborating with world-leading research institutions to advance research and development in all technology areas, from quantum devices to foundational software and applications, with the aim of realizing practical quantum computers. Additionally, we are also advancing the development of hybrid technologies (*2) for quantum computers and high-performance computing technologies, represented by the supercomputer Fugaku, which will be necessary for large-scale calculations until the full practicality of quantum computers is achieved.
What themes are you researching? What are your challenges and goals?
Shingo: One of the achievements of our collaborative research with RIKEN is the construction of a 64-qubit superconducting quantum computer. Superconducting quantum computers operate by manipulating quantum bits on quantum chips cooled to under 20 mK using ultra-low-temperature refrigerators, driving them with microwave signals of around 8 GHz, and reading out the state of the bits. However, since both bit operations and readouts are analog operations, errors are inherent. Our goal is to achieve higher fidelity in the control and readout of quantum bits, providing an environment where quantum algorithms can be executed with high computational accuracy, ultimately solving our customers' challenges.
What role do you play in the team?
Shingo: The Quantum Hardware Team consists of many members responsible for tasks such as designing quantum chips, improving semiconductor manufacturing processes, designing and constructing components inside refrigerators, and designing and constructing control devices outside refrigerators. I am responsible for building control devices and controlling quantum bits. While much attention is often given to the development of the main body of quantum computers or quantum chips, my role is to deliver the results of the development team to users by controlling and reading out quantum bits with high precision.
How do you control quantum bits, and in what sequence or process?
Shingo: The first step is the basic evaluation of the quantum chip, followed by calibration for controlling the quantum bits. First, we receive the quantum chip from the manufacturing team and perform performance measurements. To evaluate the chip, it is placed inside the refrigerator, and after closing the refrigerator's multilayered insulating cover, the inside is evacuated and cooling begins. It usually takes about two days to cool from room temperature to 20 mK. In the basic evaluation, we confirm parameters such as the resonance frequency of the quantum bits and the coherence time called T1 (the time over which a qubit relaxes back to its ground state). Then, we perform calibration for quantum bit operations and readouts. Bit operations and readouts may not always yield the desired results, because there are interactions between the bits. The bit to be controlled may be affected by neighboring bits, so it is necessary to control it based on the overall situation of the bits. Therefore, we investigate why the results did not meet expectations, consult with researchers at RIKEN, and make further efforts to minimize errors.
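As a rough illustration of this kind of analysis, here is a minimal Python sketch of a generic T1 fit; this is not Fujitsu's actual tooling, and all values below are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical T1 data: excited-state population measured after increasing delays.
# On real hardware these points come from repeated excite-wait-readout cycles.
delays_us = np.linspace(0, 300, 31)          # delay times in microseconds
rng = np.random.default_rng(seed=0)
populations = np.exp(-delays_us / 85.0) + rng.normal(0, 0.02, delays_us.size)

def relaxation(t, t1, amplitude, offset):
    # Standard T1 model: P(t) = A * exp(-t / T1) + c
    return amplitude * np.exp(-t / t1) + offset

params, _ = curve_fit(relaxation, delays_us, populations, p0=(100.0, 1.0, 0.0))
print(f"Fitted T1 = {params[0]:.1f} microseconds")
```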
How do you approach the challenge of insufficient accuracy in bit operations and readouts?
Shingo: There are various approaches we can try, such as improving semiconductor processes, implementing noise reduction measures in control electronics, and changing the method of microwave signal irradiation. Our team conducts studies on the waveform, intensity, phase, and irradiation timing of the microwave signals necessary to improve the accuracy of quantum bit control. Initially, we try existing methods described in papers on our quantum chip and then work to improve accuracy further from there.
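To give a flavor of the knobs being tuned, here is a minimal sketch of a Gaussian drive envelope with adjustable amplitude, width, and phase; the sample values are purely illustrative, not the team's actual calibration settings:

```python
import numpy as np

def gaussian_drive(n_samples, amplitude, sigma, phase):
    """Complex Gaussian microwave envelope; amplitude, width (sigma),
    and phase are the kinds of parameters swept during calibration."""
    t = np.arange(n_samples)
    center = (n_samples - 1) / 2
    envelope = amplitude * np.exp(-((t - center) ** 2) / (2 * sigma ** 2))
    return envelope * np.exp(1j * phase)   # phase rotates the drive quadratures

# Example: a 160-sample pulse; in practice amplitude and phase would be swept
# and the qubit response measured after each setting.
pulse = gaussian_drive(n_samples=160, amplitude=0.5, sigma=40.0, phase=np.pi / 4)
print(pulse[:2])
```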
What other areas do you focus on or innovate in, outside of your main responsibilities? Can you also explain the reasons for this?
Shingo: I am actively advancing tasks that contribute to further improving the performance of quantum computer hardware. The performance of a newly created quantum chip can only be evaluated by cooling it in a refrigerator and conducting measurements. Based on these results, it is important to determine what is needed to improve the performance of quantum computer hardware and provide feedback to the quantum chip design and manufacturing teams.
For Fujitsu, the development of quantum computers marks a first-time challenge. Do you have any concerns?
Shingo: I believe that venturing into unknown territories is precisely where the value of a challenge lies, presenting opportunities for new discoveries and growth. Fujitsu is tackling quantum computer research and development by combining various technologies it has cultivated over the years. I aim to address challenges one by one and work towards achieving stable operation. Once stable operation is achieved, I hope to conduct research on new control methods.
What kind of activities are you undertaking to accelerate your research on quantum computers?
Shingo: Quantum computing is an unknown field even for me, so I am advancing development while consulting with researchers at RIKEN, our collaborative research partner. I aim to build a give-and-take relationship, so I actively strive to cooperate wherever I can contribute to RIKEN's research.
What is your outlook for future research?
Shingo: Ultimately, our goal is to utilize quantum computers to solve societal issues, but quantum computing is still in its early stages of development. I believe it is the urgent responsibility of our Quantum Hardware Team to provide application development teams with large numbers of qubits and high-fidelity quantum gates. In particular, improving the fidelity of two-qubit gate operations is a challenge in the field of control, and I aim to work on it. Additionally, I want to explore the development of a quantum platform that allows customers to maximize their utilization of quantum computers.
We use technology to make people's lives happier. Guided by this belief, we have created various technologies and contributed to the development of society and our customers. At the Fujitsu Technology Hall located in the Fujitsu Technology Park, you can see mock-ups of Fujitsu's quantum computers and experience the latest technologies such as AI.
Mock-up of a quantum computer exhibited at the Fujitsu Technology Hall
Get ready for the 2024 IBM Quantum Challenge – IBM
2024 IBM Quantum Challenge: June 5-14, 2024. Sign up here.
Earlier this year, we debuted the first stable release of the Qiskit SDK, the IBM software for programming utility-scale quantum computers. Now, we challenge you to put it to work.
We're excited to introduce the 2024 IBM Quantum Challenge. This annual coding challenge is an educational event focused on teaching the world how quantum computational scientists use Qiskit. This year's challenge is about Qiskit 1.0 and working toward utility-scale quantum experiments.
Read the Qiskit 1.0 release summary
It will begin on 5 June and run until 14 June. Sign up here.
As with previous challenges, the 2024 IBM Quantum Challenge is tailored for anyone to join, regardless of their experience: whether you're a newcomer or a seasoned veteran, there is something here for you. It consists of a series of Jupyter notebooks that contain tutorial material, code examples, and auto-graded coding challenges. We call each of these notebooks a lab. While the first lab can be completed by beginners, the final labs will test your Qiskit knowledge. This is, after all, a challenge!
This year's challenge will showcase the new features of Qiskit 1.0, while demonstrating the differences from previous versions. We hope it will help you better understand what it means to do utility-scale experiments with Qiskit (those with 100 or more qubits) and practice the steps to get there. And, to add a fun twist, the labs follow a mystery story in the world of the birds that inspired the nomenclature of IBM's quantum hardware.
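As a taste of the SDK the labs are built around (this is our own minimal example, not one of the challenge labs), Qiskit 1.0 lets you prepare and simulate a small entangled state in a few lines:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Prepare a two-qubit Bell state with the Qiskit 1.0 circuit API
qc = QuantumCircuit(2)
qc.h(0)      # put qubit 0 into superposition
qc.cx(0, 1)  # entangle qubit 1 with qubit 0

# Exact statevector simulation -- no hardware queue required
state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # expect {'00': 0.5, '11': 0.5}
```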
For the first time in history, quantum computers are demonstrating the ability to solve problems at a scale beyond brute force classical simulation. Read more about how to harness these capabilities here.
This challenge is also an opportunity to get a sneak peek at some of the new cutting-edge features and developments in the quantum stack. That includes new integrations with AI, such as the Qiskit code assistant powered by IBM watsonx™.
We're also making some changes to accommodate our ever-growing quantum community. Events like this typically attract thousands of users running the same circuits multiple times, making queue times much longer than normal. Therefore, we will not be requiring hardware use as part of the Quantum Challenge. The labs in this year's Challenge will focus on helping people build intuition and make progress without waiting to run on real hardware.
However, as always, you are encouraged to run any of the code here on any of our devices you have access to, such as those available through the IBM Quantum Open, Pay-as-you-go, or Premium Plans. That's the beauty of IBM's offerings: you can easily transition your code from testing straight to running on actual hardware.
The workflow you'll learn in this challenge is a shift away from IBM Quantum Lab, which will be sunset on 15 May, to focus on utility-scale workloads. We will offer tutorial content and hands-on support to assist you with this new workflow.
Finally, we encourage participants to host or attend a Challenge Party. In recent years we've seen an exciting increase in the number of local Challenge get-togethers held by Qiskit community members. This year, IBM will host one in New York City, while some of our partners will host events as well. More details will be released closer to the start date.
We're thrilled to have you join us on this adventure. Until then, get ready by learning all about Qiskit 1.0 and quantum computing on the IBM Quantum Learning Platform. Make sure to check your email as we get closer to 5 June, and note that the official communications will come from [emailprotected].
Sign up here: https://challenges.quantum.ibm.com/2024
History Shows How to Win the Quantum Computing Race – TIME
In 1981, physicist Richard Feynman first theorized the creation of quantum computers that harnessed the principles of quantum physics to process calculations that would take standard computers millennia or longer to compute. Over the next four decades, however, research failed to advance significantly enough for the machines to have much impact on society.
But breakthroughs in 2023 signaled that quantum computers have embarked on a new era, one that may unleash a technological revolution full of possibilities, some good and some bad. On the positive side, quantum computers could lead to the development of new drugs to combat cancer. On the negative side, however, they could break the encryption we use multiple times per day for everything from sending texts to financial transactions.
But this isn't the first quantum race in history to pit the U.S. against its adversaries, and the past provides a guide for how the U.S. can win the coming computing revolution. In the 1940s, a quantum race produced the creation of nuclear weapons and unleashed a technology explosion. Crucially, the U.S. won the competition to harness the new technology. Not only did American scientists create the first nuclear weapons, but advancements in lasers and in chips for computers made the U.S. the home for global innovation.
That only happened, however, because policymakers supplied the funding and support necessary to ensure superiority. In 2024, by contrast, a key quantum funding bill has stalled while allies and adversaries are sinking billions into quantum research and development. Without action, history shows that the U.S. risks falling behind, especially in leadership of the revolutionary power of quantum technologies.
Quantum physics developed in Europe in the 1920s and 1930s. As World War II erupted in the 1930s and 1940s, German, Hungarian, and Italian physicists escaped to the U.S. Many of them joined J. Robert Oppenheimer and his American colleagues in the Manhattan Project, which birthed the atomic bomb and simultaneously elevated the U.S. as the home for quantum science.
In the ensuing decades, Feynman and other scientists who cut their teeth on the Manhattan Project inspired profound innovation from quantum physics that became woven into the fabric of American life. The first quantum revolution created nuclear weapons and energy, global positioning systems, lasers, magnetic resonance imaging, and the chips that would power the rise of the personal computer.
Although many countries like the Soviet Union built nuclear weapons, none rivaled the U.S. in pioneering innovation. The Soviet launch of Sputnik in 1957 and the space race produced an explosion of federal funding for science and education that was at the root of American success. Further, the Department of Defense provided crucial sponsorship for visionary, but risky, research that developed the internet, stealth capabilities, and voice assistants like Siri. This combination propelled the U.S. to unparalleled innovation heights in the decades after World War II.
The technologies born from the first quantum revolution were at the core of American national defense, and also reshaped civilian life in the U.S., most especially through the development of personal computers and the Information Revolution.
But even as personal computers were beginning to revolutionize American life in 1981, Feynman insisted in a pivotal lecture that something more was possible. He argued that a quantum computer, with processing power orders of magnitude greater than even the highest-performing computer then in existence, offered the only way to unlock the true knowledge of the world. Feynman admitted, however, that building such a machine involved staggering complexity.
The ensuing four decades have proved him correct on the obstacles involved. Designing a quantum computer required tremendous advances in theory as well as materials and components. Since the 1980s, progress has crept along, and many joked that quantum computers would always be 10 to 20 years away.
In 1994, mathematician Peter Shor discovered an algorithm that gave quantum computers a method for finding the large prime factors of the numbers used in encryption. Despite this breakthrough, the pace of developments since Shor's discovery has remained glacial. Persistent funding from the National Security Agency and the Department of Defense, especially the former, has sustained innovation, but the results have been uneven, because scientists have been unable to build a quantum computer that wasn't plagued by errors.
In the past 10 years, private technology companies such as IBM, Google, and Microsoft have made significant investments in quantum computing, which have pushed the field to new heights of maturity and accelerated a global race for quantum dominance, one with major national security and cybersecurity implications.
Even so, today's quantum computers still have yet to outperform standard computers due to regular errors caused by radiation, heat, or improper materials. These errors make quantum computers useless for tasks such as designing new drugs, because scientists can't replicate an experiment accurately. But all of that is changing quickly.
Advances by IBM and a Harvard team in 2023 demonstrated that error correction is on the horizon and the era of quantum utility has arrived. In July 2023, IBM announced peer-reviewed evidence from experiments indicating the company had made strides in mitigating the errors that have long plagued quantum computing. A few months later in December, a Harvard team and the company QuEra published encouraging results from experiments showing they too had developed a quantum process with enhanced error correction.
But it's not only American companies and universities trying to figure out how to mitigate the errors that have limited the possibilities of quantum computers. Over the last 15 years, Chinese physicists have undertaken an ambitious program aimed at making their country the world leader in quantum technologies. One estimate pegs China, which has invested over $15 billion in the project, as a leader or near equal to the U.S. in this new realm of science. In 2023, results from experiments suggested that Chinese physicists were notching impressive achievements that may enable them to construct a quantum computer that could outpace those developed in the U.S.
The consequences of Chinese superiority in this realm would be seismic. The U.S.'s foremost adversary would then be able to crack the encryption Americans use every day for secure internet traffic and messaging, and which the U.S. government and its allies use to protect secret communications. One organization projects that the world has a mere six years before this capacity exists. Other estimates put that date as far as 10 years away. But it is coming fast.
That means the U.S. has to get out ahead of this impending technology to forestall disastrous consequences in every realm of American life. In May 2022 the White House announced plans to prepare the nation for post-quantum encryption alongside efforts being undertaken by private companies like Apple and Google. But Congress failed to renew a landmark federal funding bill for quantum research and development in 2023. Meanwhile, China and European countries are not flinching at devoting billions to quantum.
Quantum computing breakthroughs in 2023 herald a bright future that will transform life and economics. Technology sits on the cusp of fulfilling Feynman's vision and understanding the world and universe as never before. An error-correcting quantum computer would launch the second quantum revolution, and a race is on to preserve the U.S.'s leadership in science for one of the 21st century's most prized technologies. To win that race, the federal government needs to make a concerted push to sustain American preeminence in quantum computing and other quantum technologies like sensing. That's how the U.S. won the first quantum revolution, and the stakes are too high not to learn from this past triumph.
The opinions are those of the author and do not necessarily represent the opinions of LLNL, LLNS, DOE, NNSA, or the U.S. government.
Brandon Kirk Williams is a senior fellow at the Center for Global Security Research at Lawrence Livermore National Laboratory.
Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME here. Opinions expressed do not necessarily reflect the views of TIME editors.
Intel Takes Next Step Toward Building Scalable Silicon-Based Quantum Processors – Investor Relations :: Intel Corporation (INTC)
A photo shows a 300-millimeter Intel silicon spin qubit wafer. In May 2024, Nature published an Intel research paper, "Probing single electrons across 300-mm spin qubit wafers," demonstrating state-of-the-art uniformity, fidelity and measurement statistics of spin qubits. (Credit: Intel Corporation)
Research published in Nature demonstrates high qubit control fidelity and uniformity in single-electron control.
SANTA CLARA, Calif.--(BUSINESS WIRE)-- Today, Nature published an Intel research paper, "Probing single electrons across 300-mm spin qubit wafers," demonstrating state-of-the-art uniformity, fidelity and measurement statistics of spin qubits. The industry-leading research opens the door for the mass production and continued scaling of silicon-based quantum processors, all of which are requirements for building a fault-tolerant quantum computer.
Quantum hardware researchers from Intel developed a 300-millimeter cryogenic probing process to collect high-volume data on the performance of spin qubit devices across whole wafers using complementary metal oxide semiconductor (CMOS) manufacturing techniques.
The improvements to qubit device yield combined with the high-throughput testing process enabled researchers to obtain significantly more data to analyze uniformity, an important step needed to scale up quantum computers. Researchers also found that single-electron devices from these wafers perform well when operated as spin qubits, achieving 99.9% gate fidelity. This fidelity is the highest reported for qubits made with all-CMOS-industry manufacturing.
The small size of spin qubits, measuring about 100 nanometers across, makes them denser than other qubit types (e.g., superconducting), enabling more complex quantum computers to be made on a single chip of the same size. The fabrication approach was conducted using extreme ultraviolet (EUV) lithography, which allowed Intel to achieve these tight dimensions while also manufacturing in high volume.
Realizing fault-tolerant quantum computers with millions of uniform qubits will require highly reliable fabrication processes. Drawing upon its legacy in transistor manufacturing expertise, Intel is at the forefront of creating silicon spin qubits similar to transistors by leveraging its cutting-edge 300-millimeter CMOS manufacturing techniques, which routinely produce billions of transistors per chip.
Building on these findings, Intel plans to continue to make advances in using these techniques to add more interconnect layers to fabricate 2D arrays with increased qubit count and connectivity, as well as demonstrating high-fidelity two-qubit gates on its industry manufacturing process. However, the main priority will continue to be scaling quantum devices and improving performance with its next generation quantum chip.
Read the complete findings in Nature.
About Intel
Intel (Nasdaq: INTC) is an industry leader, creating world-changing technology that enables global progress and enriches lives. Inspired by Moore's Law, we continuously work to advance the design and manufacturing of semiconductors to help address our customers' greatest challenges. By embedding intelligence in the cloud, network, edge and every kind of computing device, we unleash the potential of data to transform business and society for the better. To learn more about Intel's innovations, go to newsroom.intel.com and intel.com.
© Intel Corporation. Intel, the Intel logo and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.
View source version on businesswire.com: https://www.businesswire.com/news/home/20240501215284/en/
Laura Stadler 1-619-346-1170 laura.stadler@intel.com
Source: Intel
Released May 1, 2024 11:00 AM EDT
Seize the Quantum Opportunity: Why You Should Buy IONQ Stock Before May 8 – InvestorPlace
Very soon, IonQ stock could take a post-earnings quantum leap forward
Admit it. You want to find the next Nvidia (NASDAQ:NVDA) stock before it takes a rocket ride. We all do! Right now, there's a needle-in-a-haystack opportunity with IonQ (NYSE:IONQ) stock as IonQ gears up to release its quarterly results.
Just to recap, IonQ opened the first dedicated quantum-computing manufacturing facility in the U.S. This company is a quantum-computing innovator, but it remains unrecognized among investors. That's actually a good thing, though, as IonQ could easily be the next Nvidia and should be top-of-mind on Wall Street soon enough.
Why did Nvidia stock soar in 2023 and 2024? This happened because the demand for artificial intelligence processors exploded, and Nvidia specializes in AI-enabled processors.
Similarly, IonQ specializes in quantum-computing technology, so IonQ stock offers a pure play in this emerging field. Granted, you might not be aware of the various use cases for quantum computing. For one thing, it can help to provide the vast computing power thats required for AI applications.
There are other use cases, as well. For example, IonQ is teaming up with German-based science research center Deutsches Elektronen-Synchrotron. Together, they will advance quantum-computing technology in the field of flight-gate optimization.
Furthermore, IonQ is collaborating with Oak Ridge National Laboratory (ORNL) to explore how quantum technology can be used to modernize the power grid. This crucial research project is funded by none other than the U.S. Department of Energy.
As you can see, IonQ is a highly active specialist in an exciting technology field. Is IonQ able to generate substantial revenue, though? The answer, as the data will show, is definitely yes.
Here's the evidence. In 2023's fourth quarter, IonQ generated $6.106 million in revenue. That's up 60% year over year compared to the $3.807 million in revenue that IonQ reported in the year-earlier quarter. Plus, this result beat Wall Street's consensus call for $5.8 million in revenue.
Due to the company's expenses, IonQ recorded a net earnings loss of 20 cents per share in 2023's fourth quarter. Interestingly, the analysts' consensus estimate calls for IonQ to generate $6.6 million in first-quarter 2024 revenue.
IonQ's guidance predicts that the company will generate Q1-2024 revenue of $6.5 million to $7.5 million. The midpoint of that guidance range ($7 million) is higher than what Wall Street expects ($6.6 million). Is it possible that the company knows something that the analysts don't?
Additionally, the analysts' consensus estimate calls for IonQ to lose 25 cents per share in Q1 of 2024. That's a low bar to clear, since IonQ only lost 20 cents per share in 2023's third quarter and 14 cents per share in the year-earlier quarter.
So, mark your calendar for May 8. That's when IonQ will publish its first-quarter 2024 results and, potentially, deliver not-as-bad-as-expected results, i.e., a positive surprise.
As we've seen, IonQ is discovering new and important use cases for quantum-computing technology. Hence, for a pure play on the quantum-computing revolution, forward-thinking investors should take a share position in IonQ.
Moreover, the time to take action is now. IonQ has a prime opportunity to beat the Street's muted expectations on May 8. With that in mind, I encourage you to get a head start in May and buy IonQ stock today.
On the date of publication, David Moadel did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.
David Moadel has provided compelling content and crossed the occasional line on behalf of Motley Fool, Crush the Street, Market Realist, TalkMarkets, TipRanks, Benzinga, and (of course) InvestorPlace.com. He also serves as the chief analyst and market researcher for Portfolio Wealth Global and hosts the popular financial YouTube channel Looking at the Markets.
The $1 Billion Bet on Quantum Computers That Process Light – DISCOVER Magazine
In the battle to build the world's first useful quantum computers, one company has taken an entirely different approach from the other frontrunners. The conventional approach is to gradually increase the size and power of these devices and test as you go.
But PsiQuantum, a startup based in Palo Alto, California, is gambling on the opposite approach. The company is investing heavily in quantum technologies that are compatible with chip-making fabrication plants that already exist. By using these facilities, their goal is to mass-produce powerful silicon-based quantum computers from the very beginning.
This week, they reveal how well this approach is going and discuss the challenges that still lie ahead.
Founded in 2016, PsiQuantum hit the headlines in 2021 when it raised $700 million to pursue its goal of building useful quantum computers within a decade. This week, it announced a similar injection from the Australian government, bringing its total funding to some $1.3 billion. That makes it one of the best-funded startups in history.
The excitement is largely because of PsiQuantum's unique approach. A key decision is its choice of quantum bits, or qubits. Other companies are focusing on superconducting qubits, ion traps, neutral atoms, quantum dots and so on.
PsiQuantum has opted to use photons. The advantage is that photons do not easily interact with the environment, so their quantum nature is relatively stable. That's important for computation.
Paradoxically, this reluctance to interact is also the main disadvantage of photons. It's hard to make them interact with each other in a way that processes information.
But various groups have demonstrated optical quantum computing and PsiQuantum was founded by researchers in this area from Imperial College London and the University of Bristol.
Optical quantum computing works by creating photons or pairs of them, guiding them through channels carved into silicon where they can interact and then measuring their properties with highly specialized detectors.
PsiQuantum intends to do all this with silicon wafers. Its bold idea rests on the fact that we already know how to make silicon chips on a vast scale by mass production. Chip fabrication plants cost billions to build, so there is a significant advantage in being able to use this existing technology.
And by making bigger, more densely packed chips, optical quantum computers can scale relatively easily, unlike other designs, where scaling will be much harder.
So all the focus has been on how to make the manufacture of optical quantum computing chips compatible with conventional fabrication plants.
That's not as easy as it sounds, so this week's paper outlining the company's advances has been eagerly awaited.
The team have reached numerous goals. "We modified an established silicon photonics manufacturing flow to include high-performance single photon detection and photon pair generation," they say. "To our knowledge, this is the first realization of an integrated photonic technology platform capable of on-chip generation, manipulation, and detection of photonic qubits."
But there are significant steps ahead. PsiQuantum still needs to develop a variety of next-generation technologies to make large-scale photonic quantum computation feasible. "It will be necessary to further reduce Silicon Nitride materials and component losses, improve filter performance, and increase detector efficiency to push overall photon loss and fidelity," say the team.
For example, the on-chip photon detectors that are built into waveguides need to be able to count individual photons. The on-chip photon waveguides need to be lower loss. And perhaps the biggest challenge is in developing high-speed optoelectronic switches that can rapidly reconfigure optical circuits.
PsiQuantum is making these switches out of barium titanate (BTO), a material that must be incorporated into the fabrication process. "We have developed a proprietary process for the growth of high-quality BTO films using molecular beam epitaxy, compatible with foundry processes," they say.
All that looks impressive, but the paper does not include a demonstration of quantum computing itself.
Perhaps it's too early to expect that. To be fair, basic quantum computation with photons has long been possible with these kinds of systems at a small scale.
"The singular intent of our development is a useful fault-tolerant quantum computer," they say. PsiQuantum has also said elsewhere that its goal is to achieve this by 2029.
Of course, it faces stiff competition from other manufacturers of quantum computers. It'll be an exciting race, and the (quantum) clock is ticking.
Ref: A manufacturable platform for photonic quantum computing: arxiv.org/abs/2404.17570
JPMorgan Chase and AWS study the prospects for quantum speedups with near-term Rydberg atom arrays | Amazon … – AWS Blog
This post was contributed by Martin Schuetz, Ruben Andrist, Grant Salton, and Helmut Katzgraber from the Amazon Quantum Solutions Lab, and Pierre Minssen, Romina Yalovetzky, Shouvanik Chakrabarti, Dylan Herman, Niraj Kumar, Ruslan Shaydulin, Yue Sun, and Marco Pistoia from JPMorgan Chase
Many companies face combinatorial optimization problems, and across both science and industry there are prominent examples in areas like transportation and logistics, telecommunications, manufacturing, and finance. Analog neutral-atom quantum machines can provide a novel platform to design and implement quantum optimization algorithms, with scientists in both industry and academia searching for the most promising types of problems for which an early quantum advantage could be demonstrated.
Over the past year, the Amazon Quantum Solutions Lab (QSL) has worked together with the Global Technology Applied Research team at JPMorgan Chase to conduct a systematic study of the hardness of certain optimization problems, inspired by real-world use cases in finance.
In this post, we'll describe our project and summarize the key results. Motivated by recent experiments reporting a potential super-linear quantum speedup [2], we studied the problem native to Rydberg atom arrays (the maximum independent set problem on unit-disk graphs). We identified a class of problem instances with controllable hardness and minimal overhead for neutral-atom quantum hardware.
We think this work sets the stage for potentially impactful future experiments towards quantum advantage. For further technical details, feel free to check out our original work published in Physical Review Research [1].
Given its potentially far-reaching impact, the potential demonstration of quantum speedups for practically relevant, computationally hard problems has emerged as one of the greatest milestones in quantum information science. Over the last few years, Rydberg atom arrays have established themselves among the leading contenders for the demonstration of such quantum speedups; see this blog post for more details. In particular, in Ref. [2] a potential super-linear quantum speedup over classical simulated annealing has been reported for the maximum independent set problem (MIS) on unit-disk graphs (MIS-UD), based on variational quantum algorithms run on Rydberg atom arrays with up to 289 qubits arranged in two spatial dimensions.
This work focused on benchmarking quantum variational algorithms against a specific implementation of simulated annealing (representing a classical analogue of the adiabatic algorithm), yet it left open the question of benchmarking against other state-of-the-art classical solvers. In our work, we study the MIS-UD problem in detail (see Figure 1 for a schematic illustration), with a broader range of classical solvers beyond the scope of the original paper. Our main goal is to empirically assess the hardness of the MIS-UD problem, to help zero in on problem instances and system sizes where quantum algorithms could ultimately be useful, and thus identify the most promising directions for future experiments with Rydberg atoms.
Figure 1. Schematic illustration of the problem. (a) We consider unit-disk graphs with nodes arranged on a two-dimensional square lattice with side length L and ~80% of all lattice sites filled, and edges connecting all pairs of nodes within a unit distance (illustrated by the circle). (b) Our goal is to solve the MIS problem on this family of Union-Jack-like instances (as depicted here with nodes colored in red in the right panel) and assess the hardness thereof using both exact and heuristic algorithms.
The MIS problem is an important combinatorial optimization problem with practical applications in network design, vehicle routing, and finance, among others, and is closely related to the maximum clique, minimum vertex cover, and set packing problems [3]. Here we provide two complementary problem formulations, one based on an integer linear program and one based on an Ising-type Hamiltonian compatible with Rydberg atom arrays. We discuss important figures of merit to assess problem hardness.
Consider a graph G = (V, E) with vertex set V and edge set E, with an independent set defined as a subset of vertices that are not connected with each other. The MIS problem is then the task of finding the largest independent set. Introducing a binary variable xi for every node (with xi = 1 if node i is in the independent set, and xi = 0 otherwise), the MIS problem can formally be expressed as a compact integer linear program of the form:

maximize Σ_{i ∈ V} xi, subject to xi + xj ≤ 1 for every edge (i, j) ∈ E, with xi ∈ {0, 1},

with the objective of maximizing the number of marked vertices while adhering to the independence constraint.
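For intuition, here is a minimal Python sketch (our own toy example with made-up coordinates, not an instance from the paper) that builds a small unit-disk graph and brute-forces this integer program:

```python
import itertools
import networkx as nx

# Toy unit-disk graph: nodes at 2D points, edge whenever distance <= 1
points = {0: (0.0, 0.0), 1: (0.8, 0.0), 2: (0.0, 0.9), 3: (1.6, 0.1), 4: (2.4, 0.2)}
G = nx.Graph()
G.add_nodes_from(points)
for i, j in itertools.combinations(points, 2):
    (x1, y1), (x2, y2) = points[i], points[j]
    if (x1 - x2) ** 2 + (y1 - y2) ** 2 <= 1.0:
        G.add_edge(i, j)

# Enumerate all subsets (fine at toy scale) and keep the largest independent one
best = max(
    (s for r in range(len(G) + 1) for s in itertools.combinations(G.nodes, r)
     if not any(G.has_edge(u, v) for u, v in itertools.combinations(s, 2))),
    key=len,
)
print("Maximum independent set:", best)  # -> (1, 2, 4) for these points
```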
Alternatively, a problem formulation commonly used in the physics literature expresses this program in terms of a Hamiltonian that includes a soft penalty for non-independent configurations (that is, when two vertices in the set are connected by an edge) to model the hard independence constraint. This Hamiltonian is given by

H = -Σ_{i ∈ V} xi + V Σ_{(i, j) ∈ E} xi xj,

with a negative sign in front of the first term because the largest independent set is searched for within an energy minimization problem, and where the penalty parameter V > 1 enforces the constraints.
Energetically, this Hamiltonian favors having each variable in the state xi = 1 unless a pair of vertices are connected by an edge. This second, unconstrained formulation provides a straightforward connection to Rydberg atom arrays. Specifically, by mapping the binary variables xi to two-level Rydberg atoms, MIS-UD problems can be encoded efficiently with Rydberg atoms placed at the vertices of the target problem graph. Strong Rydberg interactions between atoms (as described by the second term in the Hamiltonian) then prevent two neighboring atoms from being simultaneously in the excited Rydberg state. Using a coherent drive with Rabi frequency Ω and detuning Δ, one can then search for the ground state of the Hamiltonian H (encoding the MIS) via, for example, quantum-annealing-type approaches. See this blog post for more details on annealing-type quantum optimization algorithms running on Rydberg hardware.
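A quick numerical check (reusing the toy graph above, with an illustrative penalty V = 2) confirms that the minimum-energy bitstring of this penalized Hamiltonian is exactly the maximum independent set:

```python
import itertools

# H(x) = -sum_i x_i + V * sum_{(i,j) in E} x_i * x_j  on the toy graph above
edges = [(0, 1), (0, 2), (1, 3), (3, 4)]
V, n = 2.0, 5  # V > 1: violating an edge costs more than the vertex it gains

def energy(x):
    return -sum(x) + V * sum(x[i] * x[j] for i, j in edges)

ground = min(itertools.product([0, 1], repeat=n), key=energy)
print("Minimum-energy bitstring:", ground,
      "-> MIS nodes:", [i for i, xi in enumerate(ground) if xi])  # nodes 1, 2, 4
```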
To assess the practical hardness of the MIS-UD problem and compare the performance of various algorithms, we consider key figures of merit such as the time-to-solution (TTS), i.e., the runtime needed to find an optimal solution with 99% probability, and the hardness parameter HP, which accounts for the degeneracy of the ground and first excited states.
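As a reference point, a common definition of TTS (the convention we sketch here; Ref. [1] gives the precise formulation used in the study) scales a single run's wall-clock time by the number of repetitions needed to reach the 99% target:

```python
import math

def time_to_solution(t_run, p_success, target=0.99):
    """Expected runtime to hit the optimum at least once with probability
    `target`, given a per-run success probability `p_success`."""
    if p_success >= target:
        return t_run  # one run already suffices
    repeats = math.log(1.0 - target) / math.log(1.0 - p_success)
    return t_run * repeats

# Example: a 10 ms heuristic run that finds the optimum 30% of the time
print(f"TTS(99%) = {time_to_solution(0.010, 0.30):.3f} s")  # ~0.129 s
```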
In our work, we studied the MIS-UD problem described earlier using both exact and heuristic algorithms. Here we provide a brief overview of our tool suite; for more technical details, we refer to Ref. [1].
We now turn to our numerical results. Here we will highlight just a few selected results; for more details, we refer to Ref. [1].
TTS as a function of system size. First, we study TTS as a function of system size (given by the number of nodes in the graph). Our results are displayed in Fig. 2. We find that typical quasi-planar instances with Union-Jack-like connectivity (as studied in Ref. [2]) can be solved to optimality for up to thousands of nodes within minutes, with both custom and generic commercial solvers on commodity hardware, without any instance-specific fine-tuning. For most instances (taken as the 98th percentile) we can upper-bound the TTS needed by classical B&B or SA solvers through a runtime scaling of the form TTS = O(2^(aN)); we find a = 0.0045 and a = 0.0128 for our B&B and SA solvers, respectively. These results set an interesting, putative bar for quantum algorithms to beat. In addition, we observe a relatively large spread spanning several orders of magnitude in TTS, displaying significant instance-to-instance variations even for fixed system size, thus motivating a more detailed analysis of problem hardness, as discussed next.
Figure 2. TTS as a function of system size. (Left) B&B solver: Problems with hundreds (thousands) of nodes can be solved to optimality in subsecond (minute) timescales. The solid line is the linear regression over instances whose TTS are in the highest 2%. (Right) SA solver: Time required to reach 99% success probability for the heuristic SA solver as a function of system size (how long the solver should run for a 99% chance of finding the optimal solution). For every system size, 1000 random UD instances have been considered.
Hardness parameter. We now consider algorithmic performance in terms of the hardness parameter HP, which accounts for the degeneracy of both the ground and first excited states. Our results for both the exact SLA and the heuristic SA solvers are displayed in Fig. 3, showing remarkably different behavior. The SA solver displays a strong dependence on the hardness parameter. Conversely, virtually no dependence is observed for the exact SLA solver, thereby demonstrating that the conductance-like hardness parameter HP successfully captures hardness for algorithms undergoing Markov-chain dynamics, while alternative algorithmic paradigms (like sweeping line) may require a different notion of hardness.
In particular, we find that for the SA solver the success probability P_MIS fits well to the functional form P_MIS = 1 - exp(-C * HP^(-a)), where C is a positive fitted constant and smaller values of a yield larger success rates. We find a ≈ 0.66 for our implementation of SA, competitive with the a ≈ 0.63 reported for the optimized quantum algorithm demonstrated in Ref. [2] (for smaller systems than studied here, and when restricting the analysis to graphs with minimum energy gaps sufficiently large to be resolved in the duration of the noisy quantum evolution). This is much better than the SA baseline results in Ref. [2] with a ≈ 1.03. As such, the quantum speedup reported in Ref. [2] could be classified as a limited sequential quantum speedup, based on comparing a quantum-annealing-type algorithm with a particular implementation of classical SA, while our analysis points at a potential next milestone, in the form of the experimental demonstration of a (more general) limited non-tailored quantum speedup, by comparing the performance of the quantum algorithm to the best-known generic classical optimization algorithm.
Figure 3. Dependence on hardness parameter HP (for different system sizes, from lattices with side length L=13 and about 135 nodes up to lattices with L=33 and about 870 nodes). (Left) Time-to-solution (TTS) for the exact SLA solver as a function of the hardness parameter HP. Virtually no dependence on HP is observed, showing that TTS is fully determined by the system size N ~ L^2. (Right) Conversely, for the Markov-chain-based SA solver, we observe a strong correlation between algorithmic performance and the hardness parameter HP. Here we plot log(1 - P_MIS) for UD graphs selected from the top two percentiles of the hardness parameter for each system size. Power-law fits of the form ~HP^(-a) are used to extract scaling performance with graph hardness.
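To illustrate how such a scaling exponent can be extracted from data of this kind (the numbers below are invented, not measurements from Ref. [1]), one can fit the functional form directly:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical (HP, P_MIS) pairs spanning several decades of hardness
hp = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
p_mis = np.array([0.95, 0.70, 0.35, 0.12, 0.04])

def model(hp, c, a):
    # Success probability ansatz: P_MIS = 1 - exp(-C * HP^(-a))
    return 1.0 - np.exp(-c * hp ** (-a))

(c_fit, a_fit), _ = curve_fit(model, hp, p_mis, p0=(10.0, 0.5))
print(f"Fitted scaling exponent a = {a_fit:.2f}")  # ~0.47 for this fake data
```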
Tuning problem hardness. We now study hardness as we gradually change the topology of the problem instances. Specifically, we analyze TTS following two protocols: either (i) systematically tuning the blockade radius or (ii) randomly rewiring edges of the graph. While protocol (i) prepares UD graphs (with varying connectivity), protocol (ii) explicitly breaks the UD structure via random (potentially long-range) interactions, ultimately preparing random, structure-less Erdős-Rényi (ER) graphs. The results of this analysis are shown in Fig. 4. We find that TTS for the established B&B solver can be tuned systematically over several orders of magnitude. As such, these two protocols suggest a potential recipe to benchmark quantum algorithms on instances orders of magnitude harder for established classical solvers than previously studied, and they motivate interesting future experiments towards quantum advantage; in particular, our protocols help identify small but hard instances, as needed for thorough scaling analyses.
Figure 4. Hardness transition. (Left) Hardness transition as a function of the disk radius (in units of the lattice spacing), as given by the time-to-solution (TTS) for the B&B solver, shown here for systems with about 350 nodes, with 100 random seeds per radius. (Right) Hardness transition from unit-disk to random Erdős-Rényi (ER) graphs (denoted by the red shaded bands). Here TTS is given as a function of the fraction of edges rewired. Starting from Union-Jack-type UD graphs (left), edges are randomly selected and rewired, thereby gradually breaking the UD connectivity and ultimately generating random ER graphs (right). While the original UJ graphs can be solved to optimality in ~10^(-2) s, we observe TTS potentially orders of magnitude larger in both plots.
Our work provides an in-depth look into the hardness of the maximum independent set problem on unit-disk graphs, the problem native to Rydberg atom arrays. Our results establish well-defined goalposts for quantum algorithms to beat. In particular, we have shown that the hardness parameter put forward in Ref. [2] captures problem hardness for a certain class of Markov chain Monte Carlo solvers, while virtually no dependence between time-to-solution and this parameter is observed for alternative solvers. Finally, we have identified protocols to systematically tune time-to-solution over several orders of magnitude, pinpointing problem instances orders of magnitude harder for established classical solvers than previously studied.
These results should help identify the most promising directions for applications of Rydberg devices and direct the community's ongoing efforts towards quantum advantage, hopefully inspiring many interesting future experiments further exploring the hardness of the MIS problem with Rydberg atom arrays.
The content and opinions in this blog are those of the third-party author and AWS is not responsible for the content or accuracy of this blog.
This blog post is for informational purposes only and is not intended as legal, tax, financial, investment, accounting or regulatory advice. Opinions expressed herein are the personal views of the individual(s) and do not represent the views of JPMorgan Chase & Co. The accuracy of any statements, linked resources, reported findings or quotations is not the responsibility of JPMorgan Chase & Co.
[1]R. S. Andrist, M. J. A. Schuetz, P. Minssen, R. Yalovetzky, S. Chakrabarti, D. Herman, N. Kumar, G. Salton, R. Shaydulin, Y. Sun, M. Pistoia, and H. G. Katzgraber, Hardness of the Maximum Independent Set Problem on Unit-Disk Graphs and Prospects for Quantum Speedups, Phys. Rev. Research 5, 043277 (2023); arXiv:2307.09442.
[2]S. Ebadi, A. Keesling, M. Cain, T. T. Wang, H. Levine, D. Bluvstein, G. Semeghini, A. Omran, J.-G. Liu, R. Samajdar, et al., Quantum optimization of Maximum Independent Set using Rydberg atom arrays, Science 376, 1209 (2022).
[3]J. Wurtz, P. L. S. Lopes, N. Gemelke, A. Keesling, and S. Wang, Industry applications of neutral-atom quantum computing solving Independent Set problems, arXiv:2205.08500 (2022).
3 Quantum Computing Stocks That Could Be Multibaggers in the Making: April Edition – InvestorPlace
Quantum computing technology and applications across sectors are advancing rapidly; the market is predicted to grow over 32% from 2023 to 2030, attracting attention toward multibagger quantum computing stocks.
Furthermore, industry developments such as IBM's (NYSE:IBM) 433-qubit Osprey processor represent a substantial advancement in quantum computing, enabling complicated calculations beyond traditional computers.
Amidst this, an investor may wonder how to buy multibagger quantum computing stocks. Compared to IBM and Microsoft (NASDAQ:MSFT), penny stocks offer huge potential reward for a small commitment. Moreover, the three multibagger quantum computing stocks we shall examine promise up to triple-digit potential. Today we're looking at stocks that offer strong gross margins, solid pioneering achievements and major quarterly performance advances.
Quantum Computing (NASDAQ:QUBT) creates low-cost nanophotonics-based quantum-tech industrial products.
The penny play, one of the most inexpensive multibagger quantum computing stocks, is down 15% in 2024. However, its strong gross margin of 45% is greater than that of 84% of companies in its industry, even though the company is losing money.
Most notably, through its quantum optimization platform, Dirac-3, QUBT is a leader in the field. Another creation of QUBT is the Quantum Photonic Vibrometer. Better vibration detection, perception and analysis are possible with the first quantum-accelerated photonics vibrometer. This business and military gadget detects explosives and hidden weapons up to 30 inches below the surface.
Furthermore, the company has expanded its cooperative research and development agreement (CRADA) to include use-case testing using the Dirac-1 Entropy Quantum Computer at Los Alamos National Laboratory.
The company was subcontracted by the Bay Area Environmental Research Institute to develop and test a photonic sensor for NASA Ames. This sensor, which measures the size, form and chemistry of clouds, aerosols and volcanic ash, deepens the company's relationship with NASA.
A fourth NASA grant was given to QUBT to use entropy quantum computing to denoise LiDAR spectrum data. This technology is needed for NASA missions in order to provide accurate measurements for observation and exploration at any time of day.
Quantum computing gear, software and services provider D-Wave Quantum (NYSE:QBTS) reported strong revenue and bookings last quarter.
Sales rose 21% in Q4 and 22% for full-year 2023. Q4 bookings were up 34% and full-year bookings up 89% year over year. Like QUBT, this penny stock will fluctuate.
Even though sales and bookings grew by more than 10%, QBTS didn't do better than experts expected. Earnings per share came in at a loss of $0.09, about 10% wider than expected. There was a selloff because sales were 38% lower than expected, at $2.9 million instead of $4.7 million.
Moving on to its latest tech moves, the 1,200+ qubit D-Wave Advantage2 prototype is now available via its Leap quantum cloud service. This prototype's stacked superconducting integrated-circuit fabrication stack reduces noise. Complex applications like machine learning benefit from 20 times quicker optimization.
Moreover, D-Wave demonstrated quantum spin-glass dynamics on over 5,000 qubits, advancing both its annealing and gate-model quantum computing programs. Advantage2 should have over 7,000 qubits.
D-Wave says no superconducting qubit on the market has better coherence than its fluxonium qubits, which it is developing to improve its gate-model quantum computing systems. This will shape D-Wave's quantum technology going forward.
The consensus on QBTS stock is Strong Buy, and its 80% growth potential points to a rebound.
The last of our top quantum computing multibaggers is Rigetti Computing (NASDAQ:RGTI). Similar to the previous two penny stocks, however, its last quarter was better than expected, giving it some brownie points.
RGTI's sales of $3.38 million exceeded estimates by 10%. Its EPS came in at a loss of $0.04 instead of $0.06, 32% better than anticipated.
In terms of innovation, Rigetti's 84-qubit Ankaa-2 quantum computer makes mistakes 2.5 times less often than older QPUs, which is what made it a success in operation and sales. The company achieved 98% two-qubit gate fidelity, meaning pairs of quantum bits worked well together.
Rigetti also wants to make quantum computing easier to access by combining the 9-qubit Novera QPU with existing systems.
Internationally, Oxford Instruments NanoScience and Rigetti UK recently completed a project to construct one of the first quantum computers in the UK. Horizon Quantum Computing also acquired a Rigetti Novera QPU for a hardware testbed in Singapore, positioning it to grow in the Asia-Pacific region, the fastest-growing quantum computing market.
The upside potential is significant at 170%, with a consensus recommendation of Strong Buy.
On the date of publication, Faizan Farooque did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.
Faizan Farooque is a contributing author for InvestorPlace.com and numerous other financial sites. Faizan has several years of experience in analyzing the stock market and was a former data journalist at S&P Global Market Intelligence. His passion is to help the average investor make more informed decisions regarding their portfolio.
Quix checks off another condition to build universal quantum computer – Bits&Chips
Researchers using Quix Quantum's technology have successfully demonstrated the on-chip generation of so-called Greenberger-Horne-Zeilinger (GHZ) states, a critical component for the advancement of photonic quantum computing. The Dutch startup focusing on photonics-based quantum computing hails the result as a breakthrough that validates the company's roadmap towards building a scalable universal quantum computer.
The creation of GHZ states is necessary for photonic quantum computers. In a matter-based quantum computer, qubits are stationary, typically positioned on a specialized chip. By contrast, a photonic quantum computer uses flying qubits of light to process and transmit information. This information is constantly passed from one state to another through a process called quantum teleportation. The GHZ states, entanglements across three photonic qubits, are the crucial resource enabling the computer to maintain this information.
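For readers more familiar with gate-model systems, here is a minimal sketch of the circuit analogue of a three-qubit GHZ state; Quix generates the photonic equivalent on-chip, not with these gates:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Gate-model analogue of a 3-qubit GHZ state: (|000> + |111>) / sqrt(2)
qc = QuantumCircuit(3)
qc.h(0)      # superposition on the first qubit
qc.cx(0, 1)  # spread the entanglement...
qc.cx(1, 2)  # ...across all three qubits
print(Statevector.from_instruction(qc).probabilities_dict())
# expect {'000': 0.5, '111': 0.5}
```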
"This milestone demonstrates the capability of photonic quantum computers to generate multi-photon entanglement in a way that advances the roadmap toward large-scale quantum computation. The generation of GHZ states is evidence of the transformative potential of Quix Quantum's photonic quantum computing technology," commented CEO Stefan Hengesbach of Quix.
Quix's next challenge is making many of these devices. "When comparing one GHZ state to a million GHZ states, think of it as the spark needed to create a blazing fire. The more GHZ states a photonic quantum computer contains, the more powerful it becomes," added Chief Scientist Jelmer Renema.
Japan to expand export restrictions on semiconductor and quantum computing technology – DatacenterDynamics
The Japanese government has announced plans to expand export restrictions on technologies related to semiconductors and quantum computing.
According to a Bloomberg report, impacted technologies include scanning electron microscopes and gate-all-around transistors, which companies including Samsung Electronics have been using to improve semiconductor design.
The report added that the Japanese government will also start requiring licenses for the shipment of quantum computers and cryogenic CMOS circuits, which are used to control the input and output signals of qubits in quantum computers.
Favored trading partners of Japan, including South Korea, Singapore, and Taiwan, will not be exempt from the new rules, which are expected to come into force in July following a period of public consultation.
At the start of 2023, it was reported that Japan, alongside the Netherlands, had agreed to comply with a number of US-led restrictions relating to the exportation of high-tech chipmaking technology to China.