
3 Quantum Computing Stocks to Tap into the Future in 2024 – InvestorPlace

Invest in the quantum leap in 2024, uncovering tech stocks driving the quantum computing industry's explosive growth

As we usher in 2024, quantum computing stocks are not just buzzwords but pivotal players in a technological revolution. Quantum computing, a field brewing for decades, currently stands at the forefront of innovation. It's a realm where the peculiarities of quantum mechanics converge to forge computing power that effectively dwarfs traditional methods.

Moreover, though quantum computing still dances mostly within the experimental stages in commercial settings, its promise remains undeniable. Functional quantum systems are no longer a figment of science fiction; they are a reality. The implications of this technology are vast and varied, stretching from societal advancements to inevitable security challenges. Yet, the promise held within these quantum computing stocks is palpable, a promise of a future where the benefits far surpass the risks.


IonQ (NYSE:IONQ) stands out in the quantum computing space as a dedicated player, distinct from the sprawling tech giants which traditionally dominate the sector. This focus gives IonQ an edge, primarily considering its smaller market cap, which hints at a robust upside potential for investors. As the first pure-play quantum computing company to go public, IonQ doesn't just participate in the quantum computing conversation; it leads it. With the industry still in its early stages, IonQ's role in the sector is critical to the quantum computing narrative.

A significant draw for IonQ is its impressive collaborations with all three major cloud providers. Notably, its Aria quantum computer integrates seamlessly with Amazon's (NASDAQ:AMZN) cloud platform, enabling advanced tasks, including testing quantum circuits. This accessibility is a big leap forward for quantum computing applications. Financially, IonQ's trajectory has been remarkable. Surpassing its $100 million cumulative bookings target since 2021 and accumulating $58.4 million in bookings in 2023 alone, IonQ demonstrates potent growth. Despite its unprofitability in terms of cash flow, the company's revenue for the third quarter surged by 122% year-over-year (YOY), a clear indicator of its mushrooming potential in a nascent yet rapidly evolving market.


Nvidia (NASDAQ:NVDA) has established itself as a titan in the tech sphere, particularly in 2023, with its groundbreaking H100 chips leading the charge in artificial intelligence (AI) applications. The anticipation for 2024 is already high as Nvidia gears up to unveil the H200, the successor to the H100. The H200 is poised to elevate Nvidia's status even further, reinforcing its position as a frontrunner in the tech world.

Beyond its AI prowess, Nvidia is making significant strides in quantum computing. Its cuQuantum project, aimed at simulating quantum circuits, has broken new ground in the simulation of ideal and noisy qubits. Nvidia's expertise in simulating quantum computing environments is another compelling reason for investors to take note. Moreover, Nvidia's projections for a potential quadrupling by 2035 indicate a promising path for long-term investment.


Alphabet (NASDAQ:GOOG, NASDAQ:GOOGL) is emerging as a powerhouse in the quantum computing sphere, achieving a pivotal breakthrough in February by reducing computational errors in its quantum bits. This advancement is a critical step towards making quantum computers not only usable but commercially viable. Alphabet's dedication to overcoming one of the major hurdles in quantum computing commercialization highlights its commitment to leading in this innovative field.

Financially, Alphabet is on a strong footing, bolstered by its decision to efficiently reorganize its advertising business, which represents a staggering 80% of its total revenue. This reorganization comes on the heels of a remarkable $54.4 billion in ad sales in the recent quarter. Such strategic shifts could further enhance the company's robust financial performance. Additionally, Alphabet's foray into AI with the launch of Gemini, an AI model designed to rival Microsoft-backed OpenAI, showcases its ambition to convert technological prowess into tangible sales growth. The company's impressive top and bottom lines, with sales of $297.13 billion and net income of $66.73 billion, further solidify its position as a robust contender in the tech arena, poised for continued growth and innovation.

On the date of publication, Muslim Farooque did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Read more from the original source:
3 Quantum Computing Stocks to Tap into the Future in 2024 - InvestorPlace


The AI–quantum computing mash-up: will it revolutionize science? – Nature.com

Call it the Avengers of futuristic computing. Put together two of the buzziest terms in technology, machine learning and quantum computers, and you get quantum machine learning. Like the Avengers comic books and films, which bring together an all-star cast of superheroes to build a dream team, the result is likely to attract a lot of attention. But in technology, as in fiction, it is important to come up with a good plot.

If quantum computers can ever be built at large-enough scales, they promise to solve certain problems much more efficiently than can ordinary digital electronics, by harnessing the unique properties of the subatomic world. For years, researchers have wondered whether those problems might include machine learning, a form of artificial intelligence (AI) in which computers are used to spot patterns in data and learn rules that can be used to make inferences in unfamiliar situations.

Now, with the release of the high-profile AI system ChatGPT, which relies on machine learning to power its eerily human-like conversations by inferring relationships between words in text, and with the rapid growth in the size and power of quantum computers, both technologies are making big strides forwards. Will anything useful come of combining the two?

Many technology companies, including established corporations such as Google and IBM, as well as start-up firms such as Rigetti in Berkeley, California, and IonQ in College Park, Maryland, are investigating the potential of quantum machine learning. There is strong interest from academic scientists, too.

CERN, the European particle-physics laboratory outside Geneva, Switzerland, already uses machine learning to look for signs that certain subatomic particles have been produced in the data generated by the Large Hadron Collider. Scientists there are among the academics who are experimenting with quantum machine learning.

"Our idea is to use quantum computers to speed up or improve classical machine-learning models," says physicist Sofia Vallecorsa, who leads a quantum-computing and machine-learning research group at CERN.

The big unanswered question is whether there are scenarios in which quantum machine learning offers an advantage over the classical variety. Theory shows that for specialized computing tasks, such as simulating molecules or finding the prime factors of large whole numbers, quantum computers will speed up calculations that could otherwise take longer than the age of the Universe. But researchers still lack sufficient evidence that this is the case for machine learning. Others say that quantum machine learning could spot patterns that classical computers miss even if it isn't faster.

Researchers' attitudes towards quantum machine learning shift between two extremes, says Maria Schuld, a physicist based in Durban, South Africa. Interest in the approach is high, but researchers seem increasingly resigned about the lack of prospects for short-term applications, says Schuld, who works for quantum-computing firm Xanadu, headquartered in Toronto, Canada.

Some researchers are beginning to shift their focus to the idea of applying quantum machine-learning algorithms to phenomena that are inherently quantum. Of all the proposed applications of quantum machine learning, "this is the area where there's been a pretty clear quantum advantage," says physicist Aram Harrow at the Massachusetts Institute of Technology (MIT) in Cambridge.

Over the past 20 years, quantum-computing researchers have developed a plethora of quantum algorithms that could, in theory, make machine learning more efficient. In a seminal result in 2008, Harrow, together with MIT physicists Seth Lloyd and Avinatan Hassidim (now at Bar-Ilan University in Ramat Gan, Israel), invented a quantum algorithm [1] that is exponentially faster than a classical computer at solving large sets of linear equations, one of the challenges that lie at the heart of machine learning.

But in some cases, the promise of quantum algorithms has not panned out. One high-profile example occurred in 2018, when computer scientist Ewin Tang found a way to beat a quantum machine-learning algorithm [2] devised in 2016. The quantum algorithm was designed to provide the type of suggestion that Internet shopping companies and services such as Netflix give to customers on the basis of their previous choices, and it was exponentially faster at making such recommendations than any known classical algorithm.

Tang, who at the time was an 18-year-old undergraduate student at the University of Texas at Austin (UT), wrote an algorithm that was almost as fast, but could run on an ordinary computer. Quantum recommendation was a rare example of an algorithm that seemed to provide a significant speed boost in a practical problem, so her work "put the goal of an exponential quantum speed-up for a practical machine-learning problem even further out of reach than it was before," says UT quantum-computing researcher Scott Aaronson, who was Tang's adviser. Tang, who is now at the University of California, Berkeley, says she continues to be "pretty sceptical" of any claims of a significant quantum speed-up in machine learning.

A potentially even bigger problem is that classical data and quantum computation don't always mix well. Roughly speaking, a typical quantum-computing application has three main steps. First, the quantum computer is initialized, which means that its individual memory units, called quantum bits or qubits, are placed in a collective entangled quantum state. Next, the computer performs a sequence of operations, the quantum analogue of the logical operations on classical bits. In the third step, the computer performs a read-out, for example by measuring the state of a single qubit that carries information about the result of the quantum operation. This could be whether a given electron inside the machine is spinning clockwise or anticlockwise, say.
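As a minimal sketch of those three steps (a toy, single-qubit classical simulation in plain NumPy; no quantum SDK or real hardware is assumed), the workflow looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Initialization: the qubit starts in the state |0>.
state = np.array([1.0, 0.0])

# 2. Quantum operations: apply a Hadamard gate, creating an equal
#    superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = H @ state

# 3. Read-out: measurement is probabilistic, so the procedure is repeated
#    over many "shots" and the outcomes are averaged.
probabilities = np.abs(state) ** 2                  # [0.5, 0.5]
shots = rng.choice([0, 1], size=1000, p=probabilities)
print("estimated P(measure 1):", shots.mean())      # ~0.5
```

A real device applies these steps to many entangled qubits at once; the sketch only shows where the initialization and read-out costs discussed below enter the workflow.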

Algorithms such as the one by Harrow, Hassidim and Lloyd promise to speed up the second step, the quantum operations. But in many applications, the first and third steps could be extremely slow and negate those gains [3]. The initialization step requires loading classical data on to the quantum computer and translating it into a quantum state, often an inefficient process. And because quantum physics is inherently probabilistic, the read-out often has an element of randomness, in which case the computer has to repeat all three stages multiple times and average the results to get a final answer.

Once the quantumized data have been processed into a final quantum state, it could take a long time to get an answer out, too, according to Nathan Wiebe, a quantum-computing researcher at the University of Washington in Seattle. "We only get to suck that information out of the thinnest of straws," Wiebe said at a quantum machine-learning workshop in October.

When you ask almost any researcher what applications quantum computers will be good at, "the answer is, 'Probably not classical data,'" says Schuld. "So far, there is no real reason to believe that classical data needs quantum effects."

Vallecorsa and others say that speed is not the only metric by which a quantum algorithm should be judged. There are also hints that a quantum AI system powered by machine learning could learn to recognize patterns in the data that its classical counterparts would miss. That might be because quantum entanglement establishes correlations among quantum bits and therefore among data points, says Karl Jansen, a physicist at the DESY particle-physics lab in Zeuthen, Germany. "The hope is that we can detect correlations in the data that would be very hard to detect with classical algorithms," he says.

Quantum machine learning could help to make sense of particle collisions at CERN, the European particle-physics laboratory near Geneva, Switzerland. Credit: CERN/CMS Collaboration; Thomas McCauley, Lucas Taylor (CC BY 4.0)

But Aaronson disagrees. Quantum computers follow well-known laws of physics, and therefore their workings and the outcome of a quantum algorithm are entirely predictable by an ordinary computer, given enough time. Thus, "the only question of interest is whether the quantum computer is faster than a perfect classical simulation of it," says Aaronson.

Another possibility is to sidestep the hurdle of translating classical data altogether, by using quantum machine-learning algorithms on data that are already quantum.

Throughout the history of quantum physics, a measurement of a quantum phenomenon has been defined as taking a numerical reading using an instrument that lives in the macroscopic, classical world. But there is an emerging idea involving a nascent technique, known as quantum sensing, which allows the quantum properties of a system to be measured using purely quantum instrumentation. Load those quantum states on to a quantum computer's qubits directly, and then quantum machine learning could be used to spot patterns without any interface with a classical system.

When it comes to machine learning, that could offer big advantages over systems that collect quantum measurements as classical data points, says Hsin-Yuan Huang, a physicist at MIT and a researcher at Google. "Our world inherently is quantum-mechanical. If you want to have a quantum machine that can learn, it could be much more powerful," he says.

Huang and his collaborators have run a proof-of-principle experiment on one of Google's Sycamore quantum computers [4]. They devoted some of its qubits to simulating the behaviour of a kind of abstract material. Another section of the processor then took information from those qubits and analysed it using quantum machine learning. The researchers found the technique to be exponentially faster than classical measurement and data analysis.

Doing the collection and analysis of data fully in the quantum world could enable physicists to tackle questions that classical measurements can only answer indirectly, says Huang. One such question is whether a certain material is in a particular quantum state that makes it a superconductor able to conduct electricity with practically zero resistance. Classical experiments require physicists to prove superconductivity indirectly, for example by testing how the material responds to magnetic fields.

Particle physicists are also looking into using quantum sensing to handle data produced by future particle colliders, such as at LUXE, a DESY experiment that will smash electrons and photons together, says Jansen, although the idea is still at least a decade away from being realized, he adds. Astronomical observatories far apart from each other might also use quantum sensors to collect data and transmit them by means of a future quantum internet to a central lab for processing on a quantum computer. The hope is that this could enable images to be captured with unparalleled sharpness.

If such quantum-sensing applications prove successful, quantum machine learning could then have a role in combining the measurements from these experiments and analysing the resulting quantum data.

Ultimately, whether quantum computers will offer advantages to machine learning will be decided by experimentation, rather than by giving mathematical proofs of their superiority or lack thereof. "We can't expect everything to be proved in the way we do in theoretical computer science," says Harrow.

"I certainly think quantum machine learning is still worth studying," says Aaronson, whether or not there ends up being a boost in efficiency. Schuld agrees. "We need to do our research without the confinement of proving a speed-up, at least for a while."

Read the original:
The AI–quantum computing mash-up: will it revolutionize science? - Nature.com


NIU STEM Café: Quantum Computing: Next Big Thing or Next Big Flop? – Northern Public Radio (WNIJ)

Quantum computing has experienced tremendous growth in the last ten years. But will it be the next big thing or the next big flop?

Join us to learn the basics of quantum computing, find out where the field stands today and learn what else needs to happen before quantum computing can live up to its potential.

Laurence Lurio, Ph.D., NIU professor of physics, will discuss the basic principles of quantum physics which make quantum computers possible. He'll explain why these quantum computers could significantly outperform even the best current computers and discuss some of the fundamental problems that have to be solved if quantum computers are to ever really work.

Kirk Duffin, Ph.D., NIU associate professor of computer science, will talk about how quantum computers fit into the overall computing paradigm and their strengths and weaknesses. He'll also discuss a few of the most important algorithms discovered to date in quantum computing.

We hope you'll leave this talk better able to distinguish between the potential of quantum computing and the hype surrounding it!

Northern Illinois University STEM Cafés are part of NIU STEAM and are designed to increase public awareness of the critical role that STEM fields play in our everyday lives. They are offered in partnership with the NIU Alumni Association and made possible with support from Bayer Fund.

Fatty's Pub and Grille

Free. Registration encouraged.

06:30 PM - 08:30 PM on Wed, 17 Jan 2024

View post:
NIU STEM Café: Quantum Computing: Next Big Thing or Next Big Flop? - Northern Public Radio (WNIJ)


Quantum Computing Meets AI: What Happens Next | by Anshul Kummar | Jan, 2024 – Medium

What will happen if we combine Quantum Computing with Artificial Intelligence?

What you're going to read in this blog might sound like the brainchild of a sci-fi novelist on a caffeine binge, but here is the kicker: while these visions might seem far-fetched now, many leading experts are nodding along with the marriage of quantum computing and AI.

The lines between reality and fiction blur to the point where distinguishing one from the other could be our next big challenge.

Here's what will happen when we combine Quantum Computing with AI:

Tasks that take years will be done in seconds.

Think about the time it takes for your computer to start up. Remember dial-up internet, that painful wait for a single web page to load?

Yep, that was top tech in its time.

Fast forward to today's supercomputers, which can process vast data in seconds. Impressive, right?

But what if I told you quantum computers scoff at these advanced machines? Classical computers work with bits. Think of them as light switches, either on or off.

Quantum computers, on the other hand, utilize qubits. Thanks to superposition, these qubits can be on, off, or both simultaneously.

A qubit, or quantum bit, is the basic unit of information in quantum computing. It's the quantum version of the classic binary bit, and it's physically realized with a two-state device.

The power grows exponentially with each added qubit.
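In the standard textbook notation (added here for concreteness, not part of the original post), a single qubit is a weighted combination of the two basis states, and describing n qubits requires exponentially many amplitudes:

\[
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\]
\[
n \text{ qubits:}\quad |\psi\rangle = \sum_{x \in \{0,1\}^n} c_x\,|x\rangle \quad\Rightarrow\quad 2^n \text{ complex amplitudes.}
\]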

Nobel laureate Richard Feynman famously said,

"If you think you understand quantum mechanics, you don't understand quantum mechanics."

True, it's mind-boggling. But for a quick analogy, consider reading all the books in a library simultaneously instead of one by one.

That's the potential speed of a quantum machine.

Read more:
Quantum Computing Meets AI: What Happens Next | by Anshul Kummar | Jan, 2024 - Medium


Podcast with Joe Fitzsimons, CEO of Horizon Quantum Computing – Quantum Computing Report

Joe Fitzsimons, CEO of Horizon Quantum Computing, is interviewed by Yuval Boger. Joe describes the company's approach of building software development tools that aim to accelerate classical code and make it run more efficiently on quantum hardware. They discuss the advantages and disadvantages of abstraction layers, the potential for quantum computing in chemistry, and much more.

Yuval Boger: Hello, Joe, and thank you for joining me today.

Joe Fitzsimons: Thank you very much. Very happy to be here.

Yuval: So, who are you, and what do you do?

Joe: I'm the CEO of a company called Horizon Quantum Computing. Before I started Horizon, I was a professor of quantum computing for nearly 20 years now. At Horizon, we're focused on building software development tools to make it easier to build programs that take advantage of quantum computing.

Yuval: At a high level, there are several companies that build software for quantum computers. What makes Horizon unique or what makes your approach unique?

Joe: The approach we've been taking is to recognize that it's going to be very hard to take advantage of quantum computers if you don't have a really in-depth knowledge of quantum algorithms and how to construct them. If you look at the numbers, really only a few hundred people do have that kind of level of knowledge. So what we've been doing is trying to build up tools to make it both easier to program the systems from a technical point of view, being able to do more with less code, but also being able to enable domain experts to take advantage of quantum computing in different domains like finance, pharma, but also things like the energy sector, automotive, aerospace, and so on. For us, what that has meant, our kind of North Star, is that we are building towards being able to accelerate classical code, code written to run on a conventional computer. We want to be able to take legacy code, code that has been written for systems that have nothing to do with quantum computing, and make it run faster on quantum hardware.

At the moment, I think we're probably the only ones that have capabilities in that direction. We've put quite a lot of effort into being able to, for example, accelerate a subset of MATLAB code: to break it apart, automatically construct quantum algorithms from that classical code, and then, the intention is to be able to compile that all the way down to run on hardware. Now, where we are at the moment, the first tools you'll see coming out from us are a little bit lower down the chain than that. We have tech demos. You may have seen us at Q2B last year, for example, or the year before, where we've had demonstrations of accelerating MATLAB code. But what our focus is on right now is getting to early access with our integrated development environment that allows users to program at a somewhat higher level of abstraction than existing frameworks, but still not quite with classical code. What that means for us is programming in a quantum programming language that looks a little bit like BASIC. We call it Helium. It's fully Turing-complete, so you're not programming circuits, you're writing programs that may have some indefinite runtime. And you're doing it in a way where you can write subroutines, for example, in C or C++, and compile those directly down to extremely efficient quantum circuits. So that's kind of what we've been building. It's coming up to early access now, so there'll be more updates at Q2B this year.

Yuval: If I were to play devil's advocate on abstraction layers, I would say that abstraction layers are the best way to get code to be equally bad on all hardware platforms. How do you respond to that?

Joe: I think with a smile. So in some sense, you're right. And if the approach we had been taking was to, for example, build up libraries for optimization algorithms or something like that, then I would 100% agree with you. But that's not what we're doing. And we're not focused on those kinds of black-box algorithms. Rather, we're focused on the way conventional compilers work. So we are taking source code and building an optimizing compiler that not only does the classical optimizations but also does quantum optimizations on the way down to construct a quantum program for the same task. At every layer it's passing through, it is getting optimized for the processor closer and closer and closer to the hardware. So we've had to put in a lot of effort. We've built an entire stack. We don't rely on any of the existing frameworks at any point in our code. So going from C or going from Helium, compiling that down, that process that it goes through, everything from constructing quantum circuits to converting between instruction sets, doing the gate synthesis, compiling down to target particular hardware, and also taking things that are maybe loops, like while loops and things like this, that you cannot run on current hardware and turning those into hybrid programs: all of that is us. So we're doing all of this without any other quantum computing frameworks in there except when it comes to export time. So if you want to export in QASM, for example, to target an IBM system or something like that, then, of course, we give you framework code that you can run on an IBM system. But all of the generation behind the scenes, that's not based on any of the existing frameworks or anything like that. So we've built our entire stack to go the whole way down.

Yuval: If we look at one of the biggest computing revolutions, that was the transition from CPU only to CPU plus GPU. And when you look at a GPU, it is programmed similarly but still different than a CPU. You have to think about the cores; you have to think about some local processing and so on. So, what have we learned from that transition from CPU to GPU, and how does it apply to the QPU transition?

Joe: That's a great question. So what I would say is that there's different ways you can think about this. If you are a developer working at a low level with GPUs, for example, then you need to be writing GPU-specific code. If you are trying to implement faster linear algebra algorithms, you need to be very close to the hardware. If, however, you are writing machine learning models, you don't need to worry about the GPUs. Not really. You just work within whatever Python libraries you're working in, and it's taken care of for you. So there are different layers of abstraction going on in the classical world as well.

We've been building our system in such a way that it has kind of layered abstraction. At the lowest layer, you can work directly with the hardware, with the native instruction set for it, constrained by the connectivity graph of the hardware and so on. But you can also work at a layer that is hardware-agnostic, where you can write kind of general-purpose quantum assembly code but that also allows for arbitrary flow control, loops, and so on, which can then be compiled down to target particular systems. Or you can work above that. You can work with Helium and with subroutines written in C and C++. And where we're going is we're going to classical code. There'll be several other layers above where we're currently at. The intention here is that depending on your expertise, depending on the place you're contributing, you can dive in at whatever level of abstraction you want, make changes at that level of abstraction, develop at that level of abstraction, and leave all of the other layers as automatically compiled. So, if you're a quantum algorithms designer, maybe you don't want to be all that close to the hardware. Maybe you want to be a little bit higher up in the abstraction layers but not so high that it's classical. You still want to be doing your quantum Fourier transforms and having full control over the system. If you're working on quantum error correction, you may want to be a little lower down the stack. And if you're a domain expert in the oil and gas industry, for example, then you probably don't want to deal with quantum code at all. So we've been trying to build a system where there is the flexibility to dive in at the layer that you care about, the layer that you can contribute at, and leave what's below it automatic so that you do not need to worry about those lower levels of abstraction.

Yuval: Let's talk a little bit about marketing this platform to customers. When you go to customers, I mean, I think it's easy to get into the technical details. Well, what would you say are the top three benefits that a customer would have with your platform? Is it hardware independence? Is it the ability of non-domain experts to code? Is it something else? How would you pitch this to customers?

Joe: Sure. What I can say at the moment in terms of how I view the market is that actually, what is critically important at this point in time is technical lead and getting to quantum advantage as soon as we can. For quantum advantage, talking about approaching particular customer groups on what we can do for them, what everyone can do for them is extremely limited. Until we're at a point where quantum computing is affecting these customers' bottom line, it's going to significantly affect willingness to pay, but we're not really contributing to them. So really for us, our goal is to get to useful quantum computing as soon as possible.

In terms of what makes our system different, why we think it contributes to that goal, it starts to allow new capabilities that are not possible in existing frameworks, and it starts to make it much easier to do quite complex things. If you want to program a really large quantum program, and I would say the largest ones we've explored so far have been at the range of about 50 trillion gates, then there are not very many options in terms of how you develop that kind of complex software. So we've been trying to build a system that is capable both of developing for systems today, but also far into the future, so that we're building a framework that will stand the test of time and that starts enabling new capabilities. For example, within our system, it's very easy to make programs that have indefinite runtime to directly simulate a quantum Turing machine, for example. And that is just something that's extremely difficult. If you want to do it, just construct it from scratch as some kind of hybrid program, unless you have mid-circuit measurements, it's not really going to be possible. Unless you think about how to do it with postselection and all of these other things, for us it's trivial. You express it in our language. We just write a repeat-until loop, and it's going to run through that loop until it sees a particular value from a measurement and it will stop. Even though not all hardware can do that today, we compile that down to a hybrid program. And that's completely abstracted from the user. They don't need to worry about that hybrid program. It already converts it to do the postselection for you to run it as a series of circuits rather than a single circuit and so on.
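As a purely illustrative sketch (this is not Horizon's Helium language or its compiler output, just a guess at the pattern expressed in Python), the "repeat until a measurement gives a particular value" idea, and the shot-based postselection a compiler might emit for hardware without mid-circuit feedback, could look like:

```python
import random

def run_circuit_once() -> tuple[int, float]:
    """Stand-in for one shot of a quantum circuit: returns a flag-qubit
    measurement and a payload result. Here the flag is 1 with probability 0.3."""
    flag = 1 if random.random() < 0.3 else 0
    payload = random.random()            # whatever the circuit computes
    return flag, payload

# (a) What the programmer conceptually writes: repeat until the flag is 1.
def repeat_until_success() -> float:
    while True:
        flag, payload = run_circuit_once()
        if flag == 1:
            return payload

# (b) What a compiler might emit for hardware without feedback:
#     run a fixed batch of independent shots and postselect on flag == 1.
def postselected_batch(shots: int = 1000) -> list[float]:
    results = [run_circuit_once() for _ in range(shots)]
    return [payload for flag, payload in results if flag == 1]
```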

So I would say, a big part of what we've been doing is ease of use, ease of writing more complex systems. This is true both from the development perspective, but also from the deployment perspective. For us, the end point of compilation should be a deployed program. It shouldn't be a single-shot run or 10,000-shot run or whatever it is on that piece of hardware. It should be an API that the user can call with whatever inputs are used to describe that problem that will run the code on whatever hardware backend that has been compiled for and return the results to the user through a standard API interface so that they can build whatever frontend they want to process their results in whatever way they want. So if they want to incorporate it, they don't need to be working in Python. They can incorporate it straight into JavaScript. They can incorporate it into MATLAB. They can incorporate it into whatever technology they're building.

Yuval: How does the platform deal with hybrid algorithms where part is classical and part is quantum? Do you expect users to use your quantum version of BASIC to write hybrid algorithms as well?

Joe: What I would say is there's perhaps two different categories that you can fall into here. There are hybrid algorithms where you're thinking about doing many different shots. You have some classical logic that is processing the statistics from the previous set of shots, that were run to determine the next circuit to run. And this is the variational algorithms that we're all very familiar with at this point, I would say. But also, if you think about things like error correction or anything like that, you also need to think about classical processing that is happening concurrently with the quantum circuit. And that's somewhat different because your classical processing has to be able to feed back into the circuit. That means thinking about code that is running maybe locally on FPGAs rather than a nearby GPU system or something like that. We think about both within our system, how this is built. There is a simple way to implement basic functions classically concurrently with your quantum algorithm, but you're also able to include classical code that computes classical functions that run in a sandboxed way. That allows for the development of both types of algorithms, but importantly, it allows for the development of these more advanced algorithms where you need concurrent classical processing happening live with the quantum processing that's going on, which is clearly something we need as we move to fault tolerance, as we move to more complex quantum programs. Now, if you talk about how you would do a variational eigensolver or QAOA or something like this, what I would tell you is that our system is really designed for programming the quantum backend. By the quantum backend, I mean the quantum processor itself, as well as any classical control that sits with it. It's not intended for running large compute loads. It's intended for very fast functions. So it's intended for pure quantum backend development, but what you would do if you were developing a circuit, we have a way of specifying inputs to be read at call time. So what you would do is you would specify each of the parameters that can be varied within your circuit as an input. You would compile that program and deploy that program with those inputs marked. And then from your front-end code that's implementing stochastic gradient descent in whatever framework you want, whatever technology you want, whatever hardware you want, you're implementing that, calling this API in the background with the specified parameters. Now, I will freely admit our system has been built to target more structured algorithms. My view has always been that there's unlikely to be a big advantage in NISQ, except perhaps for chemistry. Now, I could be proven wrong, and I am not saying that it is not worthwhile for people to be exploring variational algorithms. It's just personally, I don't think that's the direction of the future, and we have never been working toward that goal. You don't see Horizon QAOA or Horizon VQE implementations. That's not our core competency.

Yuval: You mentioned getting to quantum advantage, and some people talk about the quantum equivalent of the GPT moment, where all of a sudden it's clear that there's something there for general use. What is that going to look like in your opinion, and how soon will it come?

Joe: That's a tricky question, and I don't think anyone really knows the answer to this. What I think is pretty clear is that the first real advantage is likely to be seen in chemistry. There are numerous reasons why you would believe this, but one of them is just that chemistry looks a lot like what's happening in a quantum computer. They're both quantum mechanical systems, they're both obeying the same equations. You might also think that you can get away with a higher error rate for chemistry calculations if you can make the errors in the quantum computer look a little bit like the noisy environment that a molecule is experiencing. You may not need to cancel out noise, it may just be a process of shaping the noise to look like the natural world because the natural world is just not that clean. But yet chemistry works even in the complex environments of the real world. So that's why I think there will be a first narrow advantage for chemistry.

But for us, we also need to care about getting from that first narrow advantage to a broader-based advantage, where you start to see an advantage across a large number of applications. And I think a lot of it depends not just on hardware advance but also on advances in algorithms and advances in dev tools. You can, of course, speculate what timelines look like on this, but what I would say that is maybe not so obvious is that we're in an interesting time now where it is both advances in hardware and advances in software that can independently lead to a real-world quantum advantage. We seem to have convincing arguments at this point that there are at least a small number of quantum processors that are hard to simulate. With that being the case, the fact that we cannot yet make use of them to do real work means there are a couple of possibilities. Either they're not capable of doing real work, but that could be because they're easily simulable. But if they're not simulable, then why can't they do real work? Maybe that's just a gap in our understanding. So advances on the algorithm side can help bring us closer there. Some of that will be theoretical advances on algorithms, and some of that will be advances in compilation where we are just getting better at harnessing those systems, getting better at echoing out the noise, getting better at taking the problem we care about and making it as small as we possibly can. At the same time, advances are happening in hardware. So you've got these two things that are going on in parallel. Over the last year, we've seen quite a few interesting demonstrations and quite a lot of progress around error correction and fault tolerance. It's clear we're getting much closer to the goal of seeing a real proper demonstration of complete fault tolerance where you're doing useful computation, you're getting a performance that is improved compared to the physical qubits and so on, and where that encoding as it grows suppresses error further and further. We've seen all of these components demonstrated individually, and in many cases, we've seen collections of these demonstrated. So we're still just getting there to the first real fault-tolerant quantum computers. You can see that starting to be on the horizon, even if there are only two-qubit systems, three-qubit systems at first.

Yuval: Going back to customers, when you go to customers today with the products that you're releasing that are now available for early access, do you tell them, hey, you could run this? Have you done benchmarking? I mean, have you published benchmarking? Can you say, well, you can run this, this code is faster with our framework, or is it that this algorithm is much easier to program this way? Lots of customers use Qiskit, and so I think that's sort of their frame of reference. How do you compare to what they're using today?

Joe: So look, I'd say there's two different ways you can make comparisons. But the correct point of comparison for us I do not think are other quantum programming frameworks, because we're enabling functionality that is just not possible within those frameworks. For a start, those are circuit frameworks. They generate circuits that run, you get results back. They don't generate Turing machines. They also don't have the capability of compiling classical code. So there's not really a place where I can compare our performance on C compilation to anything else. There have been a couple of demonstrations where you see people maybe implementing a unitary that implements some small function written in Python or something like that, but those are usually using cosine-sine decompositions or something like this, which are exponentially bad. To give you an idea of how well we've been doing in terms of generated C, we tried a couple of problems. We talked about this, I'd say, maybe two years ago at Q2B. You may remember that Goldman Sachs had a paper out on options pricing, maybe, I think, 2021. The hard part of that, the bottleneck in that algorithm using Monte Carlo methods, is actually just a classical computation, a classical subroutine that computes e to the minus x. So there's an analysis that they did in terms of how many T gates it needs, how many Toffoli gates it needs. There was a subsequent paper about a year later that showed improved results. We tried compiling this from about 15 lines of C. So, we just implemented that inverse exponential in a fixed-point manner. Okay, there's some boilerplate code as well, but it's about 15 non-trivial lines of C. We compiled it through our system with our default settings. What we found was that we outperformed the code that had been in both papers by a large margin, and in some parameter ranges up to a factor of 112 times in terms of the reduction of the number of T gates, or Toffoli gates. So this is a really large difference in performance.
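For readers who want a feel for the kind of routine being described, here is a hypothetical fixed-point implementation of e^(-x), sketched in Python rather than C; it is not Horizon's code or the Goldman Sachs subroutine, only an illustration of what roughly 15 lines of fixed-point arithmetic can look like:

```python
FRAC_BITS = 16           # fractional bits in the fixed-point format
ONE = 1 << FRAC_BITS     # fixed-point representation of 1.0

def fx_mul(a: int, b: int) -> int:
    """Multiply two fixed-point numbers, keeping FRAC_BITS fractional bits."""
    return (a * b) >> FRAC_BITS

def exp_neg(x_fx: int, terms: int = 12) -> int:
    """Approximate e^(-x) for x >= 0 via the truncated Taylor series
    sum_k (-x)^k / k!, using only integer arithmetic."""
    result = ONE
    term = ONE
    for k in range(1, terms):
        term = fx_mul(term, x_fx) // k            # term *= x / k
        result += term if k % 2 == 0 else -term   # alternating signs
    return result

print(exp_neg(ONE) / ONE)   # x = 1.0 gives ~0.3679, close to e^(-1)
```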

For the actual number of gates for the level of precision involved, it's pretty close to the square root of the number of the original gates. So we're clearly getting good performance out of this, but it's also limited by how good your C algorithm is. So if you use good C code, it's better than bad C code. But that's a classical problem. So we have a lot of trade-offs you can make, and some of these do extremely well. So we have, for example, special constructions that we can use if we're targeting low-depth circuits or low T-depth. So we use different constructions for different types of gates. If you're using it as part of an oracle, then again, we use special constructions that take into account that the phase that's incurred on each of the computational basis states doesn't matter because you're just going to compute this thing; you're going to do some controlled operations of it, and then you're going to uncompute it. So we're trying to take into account these kinds of optimizations, and we're trying to come up with the right kind of structures for being able to actively reuse qubits, for example, uncomputing them, making use of them again, recomputing things, and so on. As you can imagine, that's a fairly complicated process, but we've been getting good performance out of it.

In terms of what benchmarks will look like overall, as I say, we're about to start early access. We'll start seeing some examples there of how this looks like applied to real code. But what I can say is that there really aren't good points of comparison in terms of other quantum frameworks because what we're doing is quite different from what many of the other frameworks are doing.

Yuval: And as we reach the end of our conversation, I wanted to ask you a hypothetical. If you could have dinner with one of the quantum greats, dead or alive, who would that person be?

Joe: So I've been really fortunate. I've worked in this area for a long time. So I've had dinner with some really impressive people over the years, with Artur Ekert, with Anton Zeilinger, with Frank Wilczek, with quite a few very eminent physicists. And I guess if I was allowed to pick someone from the past, maybe it would be Richard Feynman or something like that. Not just because of being a great physicist, but because I think he'd be quite funny at dinner, and I kind of appreciate that in a dinner companion. But if we're looking to today who it would be, then I have two answers for you. One is whoever runs @QuantumMemeing on Twitter. Because, again, I'd like a bit of humor with dinner. I think that's, you know, I kind of enjoy that. And the other is Ewin Tang. So I'm not sure if you know Ewin. I think she's a postdoc at Berkeley at the moment. But she's had a huge string of incredibly impressive results on the theory of quantum computing, particularly in relation to dequantization of machine learning algorithms. I think she thinks about quantum computing in a different way than I think about quantum computing, and I think I'd have something to gain from being more exposed to that. I've never met her before, so I think that would be beneficial.

Yuval: Excellent. Joe, thank you so much for joining me today.

Joe: Sure, no problem. Thanks for having me. Thank you.

Yuval Boger is the chief marketing officer for QuEra, the leader in neutral-atom quantum computers. Known as the Superposition Guy as well as the original Qubit Guy, he can be reached on LinkedIn or at this email.

January 1, 2024

See the original post:
Podcast with Joe Fitzsimons, CEO of Horizon Quantum Computing - Quantum Computing Report


Beyond Binary: The Convergence of Quantum Computing, DNA Data Storage, and AI – Medium

Exploring the convergence of quantum computing, DNA data storage, and AI how these technologies could revolutionize computing power, memory, and information handling if challenges around implementation and ethics are overcome.

Check out these two books for a deeper dive and to stay ahead of the curve.

Computing technology has advanced in leaps and bounds since the early days of Charles Babbage's Analytical Engine in the 1800s. The creation of the first programmable computer in the 1940s ushered in a digital revolution that has profoundly impacted communication, commerce, and scientific research. But the binary logic that underlies modern computing is nearing its limits. Exploring new frontiers in processing power, data storage, and information handling will enable us to tackle increasingly complex challenges.

The basic unit of binary computing is the bit, either a 0 or 1. These bits can be manipulated using simple logic gates like AND, OR, and NOT. Combined, these gates can perform any logical or mathematical operation. This binary code underpins everything from representing the notes in a musical composition to the pixels in a digital photograph. However, maintaining and expanding today's vast computational infrastructure requires massive amounts of energy and resources. And binary systems struggle to efficiently solve exponentially complex problems like modeling protein folding.
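To make the composition claim concrete, here is a small illustrative sketch (in Python for brevity) that builds XOR and a one-bit half adder, the first step of binary arithmetic, from nothing but AND, OR, and NOT:

```python
def AND(a: int, b: int) -> int:
    return a & b

def OR(a: int, b: int) -> int:
    return a | b

def NOT(a: int) -> int:
    return 1 - a

def XOR(a: int, b: int) -> int:
    # a XOR b = (a OR b) AND NOT(a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits, returning (sum, carry)."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```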

In the quest to surpass the boundaries of binary computing, quantum computing emerges as a groundbreaking solution. It leverages the enigmatic and powerful principles of quantum mechanics, fundamentally different from the classical world we experience daily.

Quantum Mechanics: The Core of Quantum Computing

Quantum computing is rooted in quantum mechanics, the physics of the very small. At this scale, particles like electrons and photons behave in ways that can seem almost magical. Two key properties leveraged in quantum computing are superposition and entanglement.

Superposition allows a quantum bit, or qubit, to exist in multiple states (0 and 1) simultaneously, unlike a binary bit which is either 0 or 1. This means a quantum computer can process a vast array of possibilities at once.

Entanglement is a phenomenon where qubits become interlinked in such a way that the state of one (whether it's a 0, a 1, or both) can depend on the state of another, regardless of the distance between them. This allows for incredibly fast information processing and transfer.

Exponential Growth in Processing Power

A quantum computer with multiple qubits can perform many calculations at once. For example, 50 qubits can simultaneously exist in over a quadrillion possible states. This exponential growth in processing power could tackle problems that are currently unsolvable by conventional computers, such as simulating large molecules for drug discovery or optimizing complex systems like large-scale logistics.
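The "over a quadrillion" figure is simply the number of basis states available to 50 qubits (a worked number added for clarity):

\[
2^{50} = 1{,}125{,}899{,}906{,}842{,}624 \approx 1.1 \times 10^{15}.
\]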

Revolutionizing Fields: Cryptography and Beyond

Quantum computing holds the potential to revolutionize numerous fields. In cryptography, it could render current encryption methods obsolete, as algorithms like Shor's could theoretically break them in mere seconds. This presents both a risk and an opportunity, prompting a new era of quantum-safe cryptography.

Beyond cryptography, quantum computing could advance materials science by accurately simulating molecular structures, aid in climate modeling by analyzing vast environmental data sets, and revolutionize financial modeling through complex optimization.

Key Quantum Algorithms

Research in quantum computing has already produced notable algorithms. Shor's algorithm, for instance, can factor large numbers incredibly fast, a task that's time-consuming for classical computers. Grover's algorithm, on the other hand, can rapidly search unsorted databases, demonstrating a quadratic speedup over traditional methods.
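To put the quadratic speedup in numbers (a standard back-of-the-envelope comparison, not from the original article): unstructured search over N items needs about N/2 queries on average classically, while Grover's algorithm needs roughly (π/4)√N. For example,

\[
N = 10^{6}: \quad \text{classical} \approx 5 \times 10^{5} \text{ queries}, \qquad \text{Grover} \approx \tfrac{\pi}{4}\sqrt{10^{6}} \approx 785 \text{ queries}.
\]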

The Road Ahead: Challenges and Promises

Despite its potential, quantum computing is still in its infancy. One of the major challenges is maintaining the stability of qubits. Known as quantum decoherence, this instability currently limits the practical use of quantum computers. Keeping qubits stable requires extremely low temperatures and isolated environments.

Additionally, error rates in quantum computations are higher than in classical computations. Quantum error correction, a field of study in its own right, is crucial for reliable quantum computing.

Quantum computing, though still in the developmental stage, stands at the forefront of a computational revolution. It promises to solve complex problems far beyond the reach of traditional computers, potentially reshaping entire industries and aspects of our daily lives. As research and technology advance, we may soon witness the unlocking of quantum computing's full potential, heralding a new era of innovation and discovery.

DNA data storage emerges as a paradigm shift, harnessing the building blocks of life to revolutionize how we store information.

Unprecedented Storage Capabilities

DNA's storage density is unparalleled: one gram can store up to 215 petabytes of data. In contrast, traditional flash memory can hold only about 128 gigabytes per gram. This immense capacity could fundamentally change how we manage the world's exponentially growing data.
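Taking the two quoted figures at face value, the density gap works out to roughly a factor of 1.7 million:

\[
\frac{215\ \text{PB/g}}{128\ \text{GB/g}} = \frac{215 \times 10^{15}\ \text{bytes}}{128 \times 10^{9}\ \text{bytes}} \approx 1.7 \times 10^{6}.
\]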

Longevity and Reliability

DNA is not only dense but also incredibly durable. It can last thousands of years, far outstripping the lifespan of magnetic tapes and hard drives. Its natural error correction mechanisms, rooted in the double helix structure, ensure data integrity over millennia.

DNA for Computation and Beyond

Beyond storage, DNA holds potential for computation. Researchers are exploring DNA computing, where biological processes manipulate DNA strands to perform calculations. This could lead to breakthroughs in solving complex problems that are infeasible for conventional computers.

Challenges in Practical Implementation

Despite its promise, DNA data storage is not without challenges. Synthesizing and sequencing DNA is currently expensive and time-consuming. Researchers are working on methods to streamline these processes and reduce error rates, which are crucial for making DNA a practical medium for everyday data storage.

While quantum computing offers exponential speedups on specialized problems, its broader applicability and scalability remain uncertain. And both quantum and DNA computing currently require extremely low operating temperatures only possible with expensive equipment. They also consume large amounts of energy, though less than traditional data centers. However, both offer inherent data security advantages. Quantum computations cannot be copied, while DNA data storage is dense and hard to access. We may see hybrid deployments that apply these technologies to niche applications. For generalized workloads, traditional binary computing will likely dominate for the foreseeable future.

The integration of AI with quantum computing and DNA data storage represents a leap forward in computational capability.

AI and Quantum Computing: A Synergy for Complex Problems

AI algorithms can leverage the immense processing power of quantum computers to analyze large datasets more efficiently than ever before. This synergy could lead to breakthroughs in fields like drug discovery, where AI can analyze quantum-computed molecular simulations.

AI and DNA Data Storage: Managing Massive Databases

With DNAs vast storage capacity, AI becomes essential in managing and interpreting this wealth of information. AI algorithms can be designed to efficiently encode and decode DNA-stored data, making it accessible for practical use.

Ethical and Societal Implications

As highlighted in The Coming Wave by Mustafa Suleyman, the intersection of these technologies raises significant ethical questions. The use of genetic data in AI models, for instance, necessitates stringent privacy protections and considerations of genetic discrimination.

Looking Ahead: AI as the Conductor

The future sees AI not just as a tool but as a conductor, orchestrating the interplay between quantum computing and DNA data storage. This involves developing new algorithms tailored to the unique properties of quantum and DNA-based systems.

Google AI recently demonstrated a program that can autonomously detect and correct errors on a quantum processor, a major milestone. On the DNA computing front, researchers successfully stored a movie file and 100 books using DNA sequences. Ongoing studies also show promise in using DNA to manufacture nanoscale electronics for faster, denser computing. Quantum computing is enabling models of complex chemical reactions and biological processes. As costs decline, we could see exponential growth in synthesizing custom DNA and practical quantum computers.

Despite promising strides, there are still obstacles to realizing commercially viable DNA and quantum computing. Stability of quantum bits remains limited to milliseconds, far too short for practical applications. And while DNA sequencing costs have dropped, synthesis and assembly costs remain prohibitively high. There are also ethical pitfalls without careful oversight, such as insurers obtaining genetic data or AI algorithms exhibiting biases. Job losses due to increasing automation present another societal challenge. Investments in retraining and social programs will be necessary to ensure shared prosperity.

Hybridized quantum-DNA computing could transform our relationship with information and usher in an era of highly personalized medicine and hyper-accurate simulations. It may even require overhauling information theory and rethinking how humans interact with advanced AI. But we must thoughtfully navigate disruptions to industries like finance and cryptography. Avoiding misuse will also require international cooperation to enact governance frameworks and design systems mindful of ethical dilemmas. With wise stewardship, hybrid computing could positively benefit humanity.

The convergence of quantum computing, DNA data storage, and AI represents an unprecedented phase change for processing power, memory, and information handling. To fully realize the potential, while mitigating risks, we must aggressively fund research and development at the intersection of these fields. The technical hurdles are surmountable through collaboration between the public and private sectors. But establishing governance and ethical frameworks ultimately requires a broad, multidisciplinary approach. If society rises to meet this challenge, we could enter an age of scientific wonders beyond our current imagination.


Excerpt from:
Beyond Binary: The Convergence of Quantum Computing, DNA Data Storage, and AI - Medium


A banner year for quantum computing: What to expect in 2024 – SiliconRepublic.com

Sectigo's Tim Callan makes his top predictions for what the year ahead will hold for quantum computing and encryption.

2023 was the year that quantum computing entered the average person's lexicon. From government-backed funding to advancements by major players like IBM, quantum has become the buzzword on everyone's mind, after artificial intelligence of course.

Quantum computers are no longer a science project, they are now an engineering project. By that, I mean that there is no question that it will work, and it will be commercially viable, and it will be practical. It's just about figuring out how to get them all tuned in the way we want them. It's important, therefore, to look at where quantum computing could be by the end of 2024.

RSA is one of the oldest cryptosystems out there and is widely in use, specifically for secure data transmission. 2024 will see the RSA encryption algorithm face unprecedented scrutiny as researchers around the world intensify their endeavors to break the security backbone of the internet. Although RSA is not expected to succumb, it will undoubtedly grapple with an immense amount of pressure.

Quantum computers stand to defeat RSA and elliptic-curve cryptography (ECC), and qualified researchers are putting energy into finding the strategies to most effectively do so. This research can piggyback on thinking about how to defeat RSA with a traditional computer as well. After all, any attacker with access to quantum computing will also have access to all the traditional architecture computing it needs.

We should expect continued revelations in the years to come, which will reduce the time to computation for RSA on a variety of fronts. While these attacks on their own are deeply unlikely to bring that time to computation down to the point where they represent viable attack vectors against the key sizes we commonly use today, they will fuel additional research and ultimately will contribute to the optimised quantum-based attack that we will one day understand.
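The computational core a quantum computer would accelerate here is order finding, the subroutine at the heart of Shor's algorithm. Below is a minimal, purely classical sketch for toy numbers; it is illustrative only and not drawn from the article, and real RSA moduli are ~2048 bits, where the brute-force search is classically hopeless.

```python
from math import gcd

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a**r % n == 1 -- the step Shor's algorithm speeds up exponentially."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_from_order(n: int, a: int):
    """Classical post-processing used in Shor's algorithm: turn the order of a mod n into factors of n."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g            # lucky guess: a already shares a factor with n
    r = find_order(a, n)
    if r % 2:
        return None                 # odd order: retry with a different base
    y = pow(a, r // 2, n)
    for candidate in (gcd(y - 1, n), gcd(y + 1, n)):
        if 1 < candidate < n:
            return candidate, n // candidate
    return None                     # unlucky base: retry with a different one

# Toy modulus n = 3233 = 61 * 53, the kind textbook RSA examples use.
print(factor_from_order(3233, 3))   # -> (61, 53)
```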

While the siege on RSA encryption is underway, it is crucial to acknowledge that the encryption method itself is robust and has withstood more than 40 years of technological innovation. This is not being challenged. We will see, however, continued scrutiny of this algorithm in preparation for the day when a quantum computer can execute such an attack.

We will see continued scrutiny of the idea of applying traditional methods to the quantum platform, and we will see full consideration of hybrid attacks using both architectures together. While we don't expect a computer to be able to perform these attacks in 2024, the trend toward that eventual day will continue with additional published papers and revelations about how to break this bulwark of digital security.

In 2024, transitioning to quantum-resistant cryptography will become a mainstream boardroom discussion. No longer a buzzword or a topic to be tabled, becoming crypto-agile to prepare for post-quantum encryption will be a key focus for the C-suite next year.

This shift has been massively supported by the US National Institute of Standards and Technology's (NIST) development of quantum-resistant encryption and its impactful educational campaign on the decryption threat that quantum poses. NIST has now transformed a once-theoretical discussion about decryption into a mainstream business focus.

Enterprises will sit up and take notice of the threat quantum computers pose to the cryptography that enables and secures nearly all our digital operations today. In 2024, large enterprises, particularly those in sensitive sectors such as finance, healthcare and military contracting, as well as businesses with high-value intellectual property, will begin building roadmaps for deployment of post-quantum cryptography (PQC) to keep their assets and operations safe from this new computing paradigm. This accompanies a general increase in focus on automation of cryptography and certificates, certificate lifecycle management and crypto-agility.
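In practice, crypto-agility usually means isolating the algorithm choice behind a single configuration point, so RSA or ECC can later be swapped for a NIST-selected PQC algorithm without touching application code. Here is a minimal illustrative sketch; the suite names and the toy signing functions are hypothetical stand-ins, not a real cryptographic library.

```python
import hashlib
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class SignatureSuite:
    name: str
    sign: Callable[[bytes, bytes], bytes]          # (private_key, message) -> signature
    verify: Callable[[bytes, bytes, bytes], bool]  # (public_key, message, signature) -> ok?

# The registry is the single point of change when migrating to post-quantum algorithms.
SUITES: Dict[str, SignatureSuite] = {}

def register(suite: SignatureSuite) -> None:
    SUITES[suite.name] = suite

# Toy "classical" suite: a keyed-hash stand-in, NOT real RSA or ECDSA.
def _toy_sign(key: bytes, msg: bytes) -> bytes:
    return hashlib.sha256(key + msg).digest()

def _toy_verify(key: bytes, msg: bytes, sig: bytes) -> bool:
    return hashlib.sha256(key + msg).digest() == sig

register(SignatureSuite("toy-classical", _toy_sign, _toy_verify))
# When a PQC implementation (e.g. an ML-DSA/Dilithium binding) is adopted, only a new
# register(...) call and a configuration value change; callers below stay untouched.

def sign_document(suite_name: str, key: bytes, doc: bytes) -> bytes:
    return SUITES[suite_name].sign(key, doc)

print(sign_document("toy-classical", b"secret", b"quarterly report").hex()[:16])
```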

The recent statement in the UK's autumn budget by the chancellor of the exchequer showcased the country's commitment to the quantum strategy they outlined earlier in the year.

While commendable, Jeremy Hunt's earnestness in the 10-year quantum plan falls short when it comes to a sustained commitment to safeguarding encryption security. The paradox is evident: while the remarkable processing power of quantum holds boundless potential, it simultaneously poses a significant threat to the foundation of all encryption.

We must not forget the security challenges associated with this advanced technology. If a country does develop a quantum computer capable of breaking current encryption methods, it would likely keep it a closely guarded state secret, as the UK did when it broke the Enigma code during World War II. For this reason, businesses must take proactive measures to prepare for this eventuality by transitioning to quantum-safe algorithms before it is too late.

Think about those industrial secrets that, let's say, another nation state might want to take away. Think about those military secrets. Think about the plans for the stealth fighter. Those are the things that are very, very valuable. And those are the things that we need to worry about most immediately.

Back in June of this year, IBM claimed that quantum computers were entering a period where they would become useful for businesses, calling it the utility phase. Over the next 12 months, businesses will have to prove to themselves and others that they are capable of handling the enormous opportunity that quantum computers will bring. Doing that requires them to be compliant and secure at every level.

By Tim Callan

Tim Callan is the chief experience officer at Sectigo, a company that provides web security and identity solutions.


Read the original here:
A banner year for quantum computing: What to expect in 2024 - SiliconRepublic.com

Read More..

Quantum’s threat to encryption is the new Y2K threat – FierceElectronics

The potential for quantum computing to break classical computer security is spurring efforts to reimagine it, but fortunately there's time before bad actors can cost-effectively exploit quantum to do much damage.

Cracking currently used cryptosystems was one of the first famous applications of quantum computing, and has been viewed as a good accident, Hari Krovi, lead scientist at Riverlane, told Fierce Electronics in an interview. It's great that we know that this can crack cryptography.

He said this discovery has spurred research into post-quantum cryptosystems, schemes that will remain secure in the quantum era, which the National Institute of Standards and Technology (NIST) in the US is tracking as it works to solidify standards.

Error correction capabilities key to quantum applications advances

Riverlane is focused on solving error correction, which Krovi said is quantum computing's defining challenge, by building a quantum error correction stack to enable fault-tolerant quantum computing.

Part of this work is estimating the resources needed for different quantum computing applications, such as simulations of physical systems, and how many fault-tolerant operations any given application will require, he said. This tells you how near or how far away those applications are.
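As a rough illustration of what such a resource estimate involves, the sketch below uses commonly quoted surface-code rules of thumb: roughly 2d^2 physical qubits per logical qubit at code distance d, and a logical error rate that shrinks exponentially as d grows. The constants and parameters are illustrative assumptions for a back-of-envelope calculation, not Riverlane's figures.

```python
def required_code_distance(p_physical: float, p_logical_target: float,
                           p_threshold: float = 1e-2, prefactor: float = 0.1) -> int:
    """Smallest odd code distance d with prefactor * (p/p_th)**((d+1)/2) <= target.
    Standard surface-code rule of thumb; the constants here are illustrative."""
    d = 3
    while prefactor * (p_physical / p_threshold) ** ((d + 1) / 2) > p_logical_target:
        d += 2
    return d

def physical_qubits(logical_qubits: int, d: int) -> int:
    """Roughly 2*d**2 physical qubits (data plus ancilla) per logical qubit in a surface code."""
    return logical_qubits * 2 * d * d

# Hypothetical application: 1,000 logical qubits running ~10**9 logical operations,
# so each logical qubit must fail less often than ~1e-10 per operation.
d = required_code_distance(p_physical=1e-3, p_logical_target=1e-10)
print(d, physical_qubits(1_000, d))   # e.g. distance 17, ~580,000 physical qubits
```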

These estimates include the ability to cost-effectively use quantum computing to hack classical computing security. Krovi said finding new quantum algorithms is not so easy, and you want those algorithms to be better than what is already available. We have many constraints. He said there's not much point in building a quantum computer to do the same thing as an existing supercomputer.

Knowing that quantum computers could break current cryptography has heightened the need for a clear understanding of what quantum computing is capable of.

In the meantime, developing security that can protect systems from quantum computing threats is not unlike the Y2K problem: we have advance warning that there's a problem that needs to be fixed.

Scott Best, technical director at Rambus, said the problem is that public key algorithms are based on math, and quantum computers love math. That means some common public-key cryptosystems like RSA (Rivest-Shamir-Adleman) are more vulnerable than the Advanced Encryption Standard (AES) or Secure Hash Algorithms (SHA). There's no like equation that describes what AES does, he explained. It doesn't have a lot to do with math.
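Best's point can be made concrete with textbook RSA at toy scale: the security of the public key rests on one math problem, factoring the modulus n, which is exactly what Shor's algorithm solves efficiently. This is an illustrative sketch with tiny numbers, never for real use; production RSA uses ~2048-bit moduli.

```python
# Textbook RSA with toy numbers -- illustration only.
p, q = 61, 53
n = p * q                      # public modulus; its security IS the hardness of factoring n
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent, easy to derive only if you know p and q

msg = 42
cipher = pow(msg, e, n)        # encrypt with the public key (e, n)
plain = pow(cipher, d, n)      # decrypt with the private key (d, n)
assert plain == msg

# A quantum computer running Shor's algorithm recovers p and q from n, and with them d.
# Symmetric primitives such as AES and SHA have no comparable algebraic trapdoor,
# which is why they are considered far less exposed to quantum attack.
```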

Rambus recently introduced its Quantum Safe Engine (QSE) for integration into hardware security elements in ASICs, SoCs and FPGAs in recognition that quantum computers will enable bad actors to break current asymmetric encryption. The QSE IP core uses NIST-selected quantum-resistant algorithms.

Best said cryptographers began seriously thinking about the quantum threat about six years ago, with the expectation that quantum computing systems capable of cracking classical cryptography will be coming online in 2030. We need to adopt new algorithms, he said.

It also means updating the entire Internet, Best said. Every browser session uses a key exchange mechanism, he said. We need to update all those firmware mechanisms. We need to revoke all the keys that are out there.

The good news is that standards work has been in progress for some time. NIST put out a call for proposals in 2016 to find the best quantum-safe schemes to become the new cryptographic standards. The two primary standards that were chosen are the CRYSTALS-Kyber public-key encryption and the CRYSTALS-Dilithium digital signature algorithms.

Quantum-safe cryptography standards substitute the math problems that are easy for quantum computers to solve with math problems that are difficult for both classical and quantum computers to solve.
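The replacement math problem behind CRYSTALS-Kyber is learning with errors (LWE): given a random matrix A and b = A·s + e mod q with a small noise vector e, recovering the secret s is believed hard for classical and quantum computers alike. The toy instance below only illustrates the problem statement; the parameters are far too small to be secure and this is not Kyber itself.

```python
import secrets

q, n_dim, m = 97, 8, 16          # toy parameters; Kyber uses q = 3329 and structured lattices

def rand_vec(length: int, bound: int) -> list[int]:
    return [secrets.randbelow(bound) for _ in range(length)]

def small_noise(length: int) -> list[int]:
    return [secrets.randbelow(3) - 1 for _ in range(length)]   # entries in {-1, 0, 1}

A = [rand_vec(n_dim, q) for _ in range(m)]   # public random m x n matrix
s = rand_vec(n_dim, q)                       # secret vector
e = small_noise(m)                           # small error vector
b = [(sum(A[i][j] * s[j] for j in range(n_dim)) + e[i]) % q for i in range(m)]

# The public data is (A, b). Without the error e, s could be recovered by Gaussian
# elimination; with it, finding s is the LWE problem -- conjectured hard even for
# quantum computers, unlike factoring or discrete logarithms.
print(b[:4])
```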

Hardware-level security is table stakes

Although these standards haven't quite yet been finalized, Rambus has incorporated them into hardware accelerators, Best said. We come from an environment where we do not trust software, he said. Software is always suspect.

Best said a critical cryptographic operation should be done with tamper resistant hardware.

It's also important that any added layer of security, whether it's at the hardware or software level, doesn't impede performance. If you're setting up a key exchange, you want the key exchange to complete in 100 milliseconds, Best said. You don't want the key exchange to take several hours that feel like dial-up.

These layers at the hardware level not only need to be done quickly, but also at low power, Codasip safety and security architect Carl Shaw told Fierce Electronics in an interview. Codasip is a processor technology company focused on enabling system-on-chip developers to differentiate their products, and that includes security features such as quantum resistant algorithms. There's many ways we can customize the CPU, he said.

One approach is to customize the actual core by adding instructions to provide an advantage on the numerical computing side because all these algorithms are very compute heavy, Shaw said. We could also inherently change the micro architecture of the processor itself to make it more efficient to handle these sorts of computations or integrate in specific bits of hardware to help.

Quantum could spot security vulnerabilities sooner

Codasip recently released its Capability Hardware Enhanced RISC Instructions (CHERI) security technology, which extends conventional hardware Instruction-Set Architectures (ISAs) with architectural features that enable fine-grained memory protection and highly scalable software compartmentalization. More simply, CHERI stops the unsafe memory accesses that cause roughly 70% of vulnerabilities.

Shaw said Codasip is keeping an eye on the specific bits of the quantum algorithms being used today, and so far, actual implementations are somewhat simple.

As much as quantum presents a threat to classical computing cryptography, Shaw thinks in the longer term that quantum computing will be valuable for identifying security vulnerabilities. We're going to have something that is a lot more robust than what we have today.

Rambus' Best said it's going to take time to replace the existing public key infrastructure and revoke everything, but the technical community is doing the right thing. Everybody recognizes the problem, he said. It's a bit of a race, but we think we can fix it.

See the original post:
Quantum's threat to encryption is the new Y2K threat - FierceElectronics

Read More..

2023 Aftermath: By How Much Did Bitcoin Outperform S&P 500, NASDAQ, Dow Jones, and Gold? – CryptoPotato

It was another challenging year as one war continued without an evident conclusion (Russia-Ukraine) and another broke out in the Middle East. Numerous countries are trying to fight off rising inflation, some with more success than others.

With just hours left of 2023, it's interesting to compare the performance of the world's largest stock market indexes, gold, which typically performs better during uncertain times, and Bitcoin, an asset that has been proclaimed dead numerous times in the past but still keeps coming back.

Starting with perhaps the most well-known US stock market index, the Standard and Poor's 500. It tracks the performance of the 500 largest companies listed on stock exchanges in the US and is typically regarded as the benchmark that shows the health of the country's financial state, at least in terms of large corporations.

It began the year at just over 3,820 points and quickly soared to 4,200 before returning to its starting position by March. The bulls stepped on the gas in the following months, and the index jumped to 4,600 at the end of July. After another retracement, the S&P 500 finished the year strong and ended the last trading day of 2023, on December 29, at 4,769, close to its all-time high.

In terms of percentages, the S&P 500 finished 2023 with a notable increase of roughly 25%. Although that seems quite impressive, one can easily see that most of the gains came from a few tech-related companies, such as Nvidia (245%).

Taking into consideration the aforementioned tech stocks, it's logical that the Nasdaq Composite, which tracks mainly such assets, has soared the most among the indexes. In fact, the Nasdaq has outperformed almost all of its competitors with a 44.5% yearly surge that drove it from 10,386 at the start of 2023 to 15,011 at the end of it.

The Dow Jones Industrial Average, on the other hand, leans away from tech-related stocks. The index, which follows just 30 large US behemoths, has increased by 13.74% in 2023, from 33,136 to 37,689.

The yellow metal is regarded as the most prominent safe-haven asset, one that tends to outperform riskier stocks during turbulent times. The past few years indeed fall into such a category, which has affected gold's performance and resulted in atypical volatility.

One ounce of gold cost $1,813 on the financial markets at the start of 2023. Similar to most assets, the bullion had a strong spring and soared past $2,000 in April and May. The trend reversed after the summer, and the precious metal found itself dumping hard to its 2023 starting price at the beginning of October.

After the Hamas-Israel war broke out, though, gold went on a tear. Its price against the dollar exploded by over $300 in less than two months and marked an all-time high of $2,150/oz on December 5.

Since then, the precious metal has lost some traction but still ended 2023 above $2,060, charting a yearly increase of 13.73%.

Bitcoin, alongside the rest of the crypto market, had a catastrophic 2022 due to industry collapses and adverse global events. As such, it entered 2023 at around $16,600. It didn't take long before the asset broke out of its late-2022 nosedive. By January 13, it had soared past $20,000 and hadn't looked back since, despite a few retracements along the way.

Then came reclaiming the $30,000 level, which was harder than anticipated. In fact, it took BTC several attempts to decisively overcome that level, which finally happened in late October.

Bitcoin kept climbing in the following weeks, which culminated in a price surge to nearly $45,000 in early December. It has lost some ground since then, and even though there are still some hours left in 2023, it's safe to assume that BTC will finish the year in a range between $42,000 and $43,000 unless something cataclysmic happens.

As such, Bitcoin's YTD gains will be somewhere between 150% and 160%. This means that the cryptocurrency will trump all other large traditional finance assets mentioned above by a massive margin.
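The percentages quoted throughout this comparison follow from straightforward percentage-return arithmetic on the opening and closing levels cited above. A quick check (the Bitcoin close uses the midpoint of the article's assumed $42,000 to $43,000 range):

```python
def ytd_return(open_level: float, close_level: float) -> float:
    """Simple year-to-date percentage return."""
    return (close_level - open_level) / open_level * 100

levels = {
    "S&P 500":     (3_820, 4_769),
    "Nasdaq":      (10_386, 15_011),
    "Dow Jones":   (33_136, 37_689),
    "Gold ($/oz)": (1_813, 2_060),
    "Bitcoin":     (16_600, 42_500),   # midpoint of the assumed year-end range
}

for name, (start, end) in levels.items():
    print(f"{name}: {ytd_return(start, end):+.1f}%")
# Roughly: S&P +24.8%, Nasdaq +44.5%, Dow +13.7%, gold +13.6%, Bitcoin +156%
```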

2024 has all the ingredients to be an even better year for Bitcoin, given the upcoming halving (which usually serves as a catalyst for a bull market) and the potential approval of a spot BTC ETF in the States. Nevertheless, it's worth noting that such an approval could serve as a sell-the-news moment, and history is no indication of future price performance.

View post:
2023 Aftermath: By How Much Did Bitcoin Outperform S&P 500, NASDAQ, Dow Jones, and Gold? - CryptoPotato

Read More..

Bitcoin could dip to $32,000 post spot ETF approval – Hindustan Times

Bitcoin (BTC) could face a downward correction to around $32,000 in the next month, if a spot ETF gets approved. This scenario is called a sell the news event, which is common in capital markets, according to CryptoQuant.

Sell the news refers to how the prices, leverage and sentiment of an asset increase before a positive event, but then drop soon after. This happens because smart traders take advantage of the crowded long trade, and force those who use leverage to exit or get liquidated as the price moves against them, CoinDesk reported.

A spot ETF approval is seen as a positive event for Bitcoin, because it would attract more institutional investors, and create more demand for the cryptocurrency. However, CryptoQuant warned that this could also trigger a price correction, based on historical patterns.


Short-term Bitcoin holders are experiencing high unrealized profit margins of 30%, which historically has preceded price corrections, CryptoQuant wrote in a note cited by CoinDesk.

Moreover, short-term holders are still spending Bitcoin at a profit, while rallies usually come after short-term losses are realized, the note states.

CryptoQuant estimated that the price of Bitcoin could fall to $32,000, which is the realized price of short-term holders. In other words, that is the average price at which those holders bought or sold their Bitcoin.
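Realized price is essentially a volume-weighted average of the prices at which a cohort's coins last changed hands on-chain. The sketch below shows the shape of that calculation with made-up data points; it is illustrative only and not CryptoQuant's methodology or dataset.

```python
# Each entry: (BTC amount, USD price at which those coins last moved on-chain). Hypothetical data.
short_term_utxos = [
    (1.5, 27_000),
    (0.8, 35_500),
    (2.1, 41_200),
]

total_btc = sum(amount for amount, _ in short_term_utxos)
realized_price = sum(amount * price for amount, price in short_term_utxos) / total_btc

spot = 42_450   # spot price quoted later in this article
unrealized_margin = (spot - realized_price) / realized_price * 100
print(f"realized price ~ ${realized_price:,.0f}, unrealized margin ~ {unrealized_margin:.1f}%")
```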


Another firm, Capriole Investments, recommended conservative portfolio management ahead of the possible approval of a spot ETF. Capriole wrote in a blog post that the risk of holding Bitcoin is higher now than a few weeks ago.

With Bitcoin up over 60% since ETF mania began a few months ago, and with every man and his dog on X.com expecting an approval on or around 10 January, we must start to anticipate much larger volatility events (up/down) in this region. Risk today is substantially higher for long Bitcoin positions than it was just a few weeks ago, Capriole wrote.


Bitcoin has experienced sell-the-news events before, in 2017 and 2021. In 2017, Bitcoin reached $20,000 after the CME launched BTC futures but then declined in the following months. In 2021, Bitcoin hit $65,000 after Coinbase went public but also lost value in the months that followed.

Bitcoin is trading at $42,450 at the moment, after starting the year at $16,000. The daily trading volume is stable at $80 billion, according to CoinMarketCap.

See the article here:
Bitcoin could dip to $32,000 post spot ETF approval - Hindustan Times

Read More..