Marathon Digital's stock rallies toward a record win streak, even as bitcoin falls – MarketWatch

Published: Dec. 22, 2023 at 12:53 p.m. ET

Shares of Marathon Digital Holdings Inc. MARA shot up 8.4% toward a 20-month high in midday trading Friday, putting them on track to extend their win streak to a record nine sessions. The cryptocurrency miner's stock was rallying again, even as bitcoin BTCUSD fell 0.2%, as Marathon's stock and bitcoin continue to decorrelate. Marathon's stock has rocketed 83.3% during its current win streak, the best nine-day performance since it soared 83.7% during the nine-day stretch that ended Jan. 23, 2023. Over the past nine days, bitcoin has gained in six sessions and has climbed 6.2%. The correlation coefficient between Marathon's stock and bitcoin has dropped to 0.44 during the month of December, from 0.66 year to date and from 0.90 in 2022. B. Riley analyst Lucas Pipes reiterated his neutral rating on Marathon's stock but raised his price target to $14 from $11, after the company's announcement on Thursday that it has added Anchorage Digital Bank National Association as a bitcoin custodian. Marathon's stock has run up 685.1% year to date, while bitcoin has surged 163.5% and the S&P 500 SPX has gained 24.1%.

Visit link:
Marathon Digital's stock rallies toward a record win streak, even as bitcoin falls - MarketWatch

Read More..

Spot Bitcoin ETF approval ‘still happening’ by Jan. 10, analysts say – Cointelegraph

As applicants for spot Bitcoin (BTC) exchange-traded funds (ETF) rush to incorporate new cash-only requirements into their proposals in the last month of 2023, some analysts still expect ETF approvals in the United States by early 2024.

Bloomberg ETF analysts James Seyffart and Eric Balchunas anticipate that the U.S. Securities and Exchange Commission (SEC) will approve a spot Bitcoin ETF in January 2024 despite multiple last-minute amendments that applicants are scrambling to add to their proposals.

Seyffart took to X (formerly Twitter) on Dec. 21 to share his observations about BlackRock's latest spot Bitcoin ETF update from Dec. 18, which accepted the SEC's cash redemption system instead of in-kind redemptions, or those implying non-monetary payments like BTC.

The analyst noticed that BlackRock's latest iShares Bitcoin Trust ETF S-1 registration statement replaced the terms "prime broker" and "trade credit lender" with "prime execution agent," noting that the SEC might not be comfortable with the change.

"Will be interesting to see who updates their documents after this," Seyffart wrote, adding that the SEC might not accept a condition where a third party would buy and sell Bitcoin on behalf of the ETF in the Bitcoin "cash" model.

The analyst noted that multiple applicants like ARK, Bitwise and Valkyrie have already set up for a cash-only model, while some, including Grayscale and WisdomTree, still have in-kind or cash in their filings.

Related: Spot Bitcoin ETF will be 'bloodbath' for crypto exchanges, analyst says

"All this is to say we still think this is happening by Jan. 10," Seyffart stated. He added that "some issuers may be left behind," referring to some filers failing to accept the SEC's cash-only model.

Seyffart's colleague Balchunas agreed that the ongoing meetings and calls between the SEC and multiple spot BTC ETF filers are an interesting and good sign for January.

"We are hearing it wasn't one giant conf call b/t SEC and every issuer but rather many calls to exchanges/issuers to reiterate that it's Cash Creates or You Will Wait," Balchunas wrote.

Magazine: Lawmakers' fear and doubt drive proposed crypto regulations in US

Excerpt from:
Spot Bitcoin ETF approval 'still happening' by Jan. 10, analysts say - Cointelegraph

Read More..

Bitcoin traders see $48K BTC price before ETF ‘sell the news’ event – Cointelegraph

Bitcoin (BTC) circled $44,000 into the Dec. 21 Wall Street open amid analysis suggesting that a BTC price correction was necessary.

Data from Cointelegraph Markets Pro and TradingView confirmed Bitcoin trading beyond its previous one-week range.

A breakout had occurred the day prior, with BTC/USD reaching highs of $44,300 before reversing.

Still up over 6% week-to-date, the largest cryptocurrency nonetheless gave some market participants pause for thought.

"Although a correction seems necessary, BTC chart continues to look very strong on all timeframes," trading team Stockmoney Lizards wrote in part of its latest market update on X (formerly Twitter).

Like many others, Stockmoney Lizards focused its attention on the upcoming decision on the United States' first spot Bitcoin exchange-traded fund (ETF), due by Jan. 10.

"It is likely that BTC will continue to pump and break the upper trendline until an ETF decision is made," it continued, giving a near-term upside target of $48,000.

The announcement itself, even if positive, could nonetheless turn out to be a "buy the rumor, sell the news" event, the analysis warned. In this, Cointelegraph reported, Stockmoney Lizards is far from alone.

"As we finally approach the launch, we need to point out that it is likely that the actual demand for the BTC Spot ETF at the start will fall short of market expectations," trading firm QCP Capital agreed in its own market update on Dec. 21.

The mid-to-late $30,000s remain a popular area in terms of where a potential retracement could take the market.

Related: 2 risks around Bitcoin ETF launch that no one's talking about

"The chart looks heated up and a correction would be good. A drop below $40k could liquidate some leveraged long positions and lead to retracement towards $38k," Stockmoney Lizards concluded.

While the update called such a scenario less likely than others, market data showed traders poorly positioned even for the latest push above $44,000.

According to statistics resource CoinGlass, Dec. 20 saw over $100 million in crypto short positions liquidated, the most in two weeks. BTC short liquidations totaled $38.5 million.

This article does not contain investment advice or recommendations. Every investment and trading move involves risk, and readers should conduct their own research when making a decision.

See the article here:
Bitcoin traders see $48K BTC price before ETF 'sell the news' event - Cointelegraph

Read More..

Researchers take a different approach with measurement-based quantum computing – Phys.org

The race to develop quantum computers has really heated up over the past few years. State-of-the-art systems can now run simple algorithms using dozens of qubits, or quantum bits, which are the building blocks of quantum computers.

Much of this success has been achieved in so-called gate-based quantum computers. These computers use physical components, most notably superconducting circuits, to host and control the qubits. This approach is quite similar to conventional, device-based classical computers. The two computing architectures are thus relatively compatible and could be used together. Furthermore, future quantum computers could be fabricated by harnessing the technologies used to fabricate conventional computers.

But the Optical Quantum Computing Research Team at the RIKEN Center for Quantum Computing has been taking a very different approach. Instead of optimizing gate-based quantum computers, Atsushi Sakaguchi, Jun-ichi Yoshikawa and Team Leader Akira Furusawa have been developing measurement-based quantum computing.

Measurement-based quantum computers process information in a complex quantum state known as a cluster state, which consists of three (or more) qubits linked together by a non-classical phenomenon called entanglement. Entanglement is when the properties of two or more quantum particles remain linked, even when separated by vast distances.

Measurement-based quantum computers work by making a measurement on the first qubit in the cluster state. The outcome of this measurement determines what measurement to perform on the second entangled qubit, a process called feedforward. This then determines how to measure the third. In this way, any quantum gate or circuit can be implemented through the appropriate choice of the series of measurements.

Measurement-based schemes are very efficient when used on optical quantum computers, since it's easy to entangle a large number of quantum states in an optical system. This makes a measurement-based quantum computer potentially more scalable than a gate-based quantum computer. For the latter, qubits need to be precisely fabricated and tuned for uniformity and physically connected to each other. These issues are automatically solved by using a measurement-based optical quantum computer.

Importantly, measurement-based quantum computation offers programmability in optical systems. "We can change the operation by just changing the measurement," says Sakaguchi. "This is much easier than changing the hardware, as gate-based systems require in optical systems."

But feedforward is essential. "Feedforward is a control methodology in which we feed the measurement results to a different part of the system as a form of control," explains Sakaguchi. "In measurement-based quantum computation, feedforward is used to compensate for the inherent randomness in quantum measurements. Without feedforward operations, measurement-based quantum computation becomes probabilistic, while practical quantum computing will need to be deterministic."
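
To make that concrete, below is a minimal numerical sketch of a single measurement-based step on a two-qubit cluster, written against an idealized qubit state-vector model rather than the team's optical continuous-variable hardware; the function name and toy parameters are ours, purely for illustration. The random outcome m of the first measurement fixes the Pauli-X correction fed forward to the second qubit, which is what turns an otherwise probabilistic process into a deterministic gate.

```python
import numpy as np

# Toy model of one measurement-based step on a two-qubit cluster state.
# Qubit 1 carries the input; qubit 2 starts in |+>; a CZ gate entangles them.
# Measuring qubit 1 in a rotated basis teleports the input onto qubit 2,
# up to a Pauli-X correction selected by feedforward from the outcome m.

X = np.array([[0, 1], [1, 0]], dtype=complex)
CZ = np.diag([1, 1, 1, -1]).astype(complex)

def mbqc_step(psi_in, theta, rng):
    """Push `psi_in` through one cluster link, enacting H . Rz(theta)."""
    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
    state = (CZ @ np.kron(psi_in, plus)).reshape(2, 2)  # axis 0: qubit 1
    b = np.array([1, np.exp(1j * theta)]) / np.sqrt(2)  # rotated basis vector
    out0 = b.conj() @ state                             # branch for m = 0
    m = 0 if rng.random() < np.linalg.norm(out0) ** 2 else 1
    out = out0 if m == 0 else (b * np.array([1, -1])).conj() @ state
    out /= np.linalg.norm(out)
    # Feedforward: the random outcome m decides the correction on qubit 2,
    # making the overall gate deterministic.
    return np.linalg.matrix_power(X, m) @ out, m

rng = np.random.default_rng(7)
out, m = mbqc_step(np.array([1, 0], dtype=complex), np.pi / 4, rng)
print("outcome:", m, "output state:", np.round(out, 3))
```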

The Optical Quantum Computing Research Team and their co-workers, from The University of Tokyo, Palacký University in the Czech Republic, the Australian National University and the University of New South Wales, Australia, have now demonstrated a more advanced form of feedforward: nonlinear feedforward. Nonlinear feedforward is required to implement the full range of potential gates in optics-based quantum computers. The findings are published in the journal Nature Communications.

"We've now experimentally demonstrated nonlinear quadrature measurement using a new nonlinear feedforward technology," explains Sakaguchi. "This type of measurement had previously been a barrier to realizing universal quantum operations in optical measurement-based quantum computation."

Optical computers

Optical quantum computers use qubits made of wave packets of light. At other institutions, some of the current RIKEN team had previously constructed the large optical cluster states needed for measurement-based quantum computation. Linear feedforward has also been achieved to construct simple gate operations, but more advanced gates need nonlinear feedforward.

A theory for practical implementation of nonlinear quadrature measurement was proposed in 2016. But this approach presented two major practical difficulties: generating a special ancillary state (which the team achieved in 2021) and performing a nonlinear feedforward operation.

The team overcame the latter challenge with complex optics, special electro-optic materials and ultrafast electronics. To do this they exploited digital memories, in which the desired nonlinear functions were precomputed and recorded in the memory. "After the measurement, we transformed the optical signal into an electrical one," explains Sakaguchi. "In linear feedforward, we just amplify or attenuate that signal, but we needed to do much more complex processing for nonlinear feedforward."
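
As a rough sketch of that digital-memory trick (our own toy model, not the team's electronics), the nonlinear function can be tabulated offline so that the time-critical live path reduces to quantizing the measured voltage and performing a single memory read:

```python
import numpy as np

# Hypothetical lookup-table version of nonlinear feedforward: precompute the
# nonlinear function over every quantized ADC code so the live path is just
# "quantize, read memory", with no real-time nonlinear arithmetic.

ADC_BITS = 8
LEVELS = 2 ** ADC_BITS
V_RANGE = 1.0  # assume measured voltages live in [-1.0, 1.0) volts

def build_table(f):
    """Offline step: tabulate f over all quantized input codes."""
    volts = (np.arange(LEVELS) / LEVELS) * 2 * V_RANGE - V_RANGE
    return f(volts)

def feedforward(table, measured_volts):
    """Live step: quantize the measurement, then one memory read."""
    codes = ((measured_volts + V_RANGE) / (2 * V_RANGE) * LEVELS).astype(int)
    return table[np.clip(codes, 0, LEVELS - 1)]

# An arbitrary cubic stands in for the real nonlinear correction:
table = build_table(lambda v: 0.5 * v + 0.2 * v ** 3)
print(feedforward(table, np.array([-0.4, 0.0, 0.7])))
```

The table size, voltage range, and cubic correction here are illustrative placeholders; the point is that no nonlinear arithmetic happens between measurement and feedforward.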

The key advantages of this nonlinear feedforward technique are its speed and flexibility. The process needs to be fast enough that the output can be synchronized with the optical quantum state.

"Now that we have shown that we can perform nonlinear feedforward, we want to apply it to actual measurement-based quantum computation and quantum error correction using our previously developed system," says Sakaguchi. "And we hope to be able to increase the higher speed of our nonlinear feedforward for high-speed optical quantum computation."

"But the key message is that, although superconducting circuit-based approaches may be more popular, optical systems are a promising candidate for quantum-computer hardware," he adds.

More information: Atsushi Sakaguchi et al, Nonlinear feedforward enabling quantum computation, Nature Communications (2023). DOI: 10.1038/s41467-023-39195-w

Journal information: Nature Communications

Here is the original post:
Researchers take a different approach with measurement-based quantum computing - Phys.org

Read More..

IBM demonstrates useful Quantum computing within 133-qubit Heron, announces entry into Quantum-centric … – Tom’s Hardware

At its Quantum Summit 2023, IBM took the stage with an interesting spirit: one of almost awe at having things go their way. But the quantum of today, the one that's changing IBM's roadmap so deeply on the back of breakthrough upon breakthrough, was hard enough to consolidate. As IBM sees it, the future of quantum computing will hardly be more permissive. IBM announced cutting-edge devices at the event, including the 133-qubit Heron Quantum Processing Unit (QPU), which is the company's first utility-scale quantum processor, and the self-contained Quantum System Two, a quantum-specific supercomputing architecture. And further improvements to these cutting-edge devices are ultimately required.

Each breakthrough that afterward becomes obsolete is another accelerating bump against what we might call quantum's "plateau of understanding." We've already crested this plateau with semiconductors, so much so that the latest CPUs and GPUs are reaching practical, fundamental design limits where quantum effects start ruining our math. Conquering the plateau means that utility and understanding are now enough for research and development to be somewhat self-sustainable, at least for a Moore's-law-esque while.

IBM's Quantum Summit serves as a bookend of sorts for the company's cultural and operational execution, and its 2023 edition showcased an energized company that feels like it's opening the doors toward a "quantum-centric supercomputing era." That vision is built on the company's new Quantum Processing Unit, Heron, which showcases scalable quantum utility at a 133-qubit count and already offers things beyond what any feasible classical system could ever do. Breakthroughs and a revised understanding of its own roadmap have led IBM to present its quantum vision in two different roadmaps, prioritizing scalability in tandem with useful, minimum-quality products rather than monolithic, hard-to-validate, high-complexity ones.

IBM's announced new plateau for quantum computing packs in two particular breakthroughs that occurred in 2023. One breakthrough relates to a groundbreaking noise-reduction algorithm (Zero Noise Extrapolation, or ZNE), which we covered back in July: basically, a system through which you can compensate for noise. For instance, if you know a pitcher tends to throw more to the left, you can compensate for that up to a point. There will always be a moment where you correct too much or cede ground to other disruptions (such as the opponent exploring the overexposed right side of the court). This is where the concept of qubit quality comes into play: the higher the quality of your qubits, the more predictable both their results and their disruptions, and the better you know their operational constraints, the more useful work you can extract from them.
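
The core of ZNE can be sketched in a few lines: run the same circuit at several deliberately amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. In the toy model below, an exponential decay stands in for real hardware runs; it illustrates the technique generally, not IBM's specific implementation.

```python
import numpy as np

# Toy zero-noise extrapolation: sample the same "circuit" at amplified noise
# levels, fit a curve, and extrapolate the expectation value to zero noise.

def noisy_expectation(scale, ideal=1.0, decay=0.15):
    """Stand-in for executing a circuit with noise amplified by `scale`."""
    return ideal * np.exp(-decay * scale)

scales = np.array([1.0, 1.5, 2.0, 3.0])   # noise amplification factors
values = np.array([noisy_expectation(s) for s in scales])

# Richardson-style extrapolation via a quadratic fit evaluated at scale 0:
coeffs = np.polyfit(scales, values, deg=2)
mitigated = np.polyval(coeffs, 0.0)
print(f"raw (scale 1): {values[0]:.4f}  mitigated: {mitigated:.4f}  ideal: 1.0")
```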

The other breakthrough relates to an algorithmic improvement of epic proportions and was first pushed to Arxiv on August 15th, 2023. Titled "High-threshold and low-overhead fault-tolerant quantum memory," the paper showcases algorithmic ways to reduce qubit needs for certain quantum calculations by a factor of ten. When what used to cost 1,000 qubits and a complex logic gate architecture sees a tenfold cost reduction, it's likely you'd prefer to end up with 133-qubit-sized chips, chips that crush problems previously meant for 1,000-qubit machines.

Enter IBM's Heron Quantum Processing Unit (QPU) and the era of useful, quantum-centric supercomputing.

The two-part breakthroughs of error correction (through the ZNE technique) and algorithmic performance (alongside qubit gate architecture improvements) allow IBM to now consider reaching 1 billion operationally useful quantum gates by 2033. It just so happens that it's an amazing coincidence (one born of research effort and human ingenuity) that we only need to keep 133 qubits relatively happy within their own environment for us to extract useful quantum computing from them, computing that we wouldn't classically be able to get anywhere else.

The Development and Innovation roadmaps showcase how IBM is thinking about its superconducting qubits: as we've learned to do with semiconductors already, mapping out the hardware-level improvements alongside the scalability-level ones. Because as we've seen through our supercomputing efforts, there's no such thing as a truly monolithic approach: every piece of supercomputing is (necessarily) efficiently distributed across thousands of individual accelerators. Your CPU performs better by knitting and orchestrating several different cores, registers, and execution units. Even Cerebras' Wafer Scale Engine scales further outside its wafer-level computing unit. No accelerator so far, no unit of computation, has proven powerful enough that we don't need to unlock more of its power by increasing its area or computing density. Our brains and learning ability seem to provide the only known exception.

IBM's modular approach, and its focus on introducing more robust intra-QPU and inter-QPU communication for this year's Heron, shows it's aware of the rope it's walking between quality and scalability. The thousands of hardware and scientist hours spent developing the tunable couplers, one of the signature Heron design elements and what allows parallel execution across different QPUs, are another sign. Pushing one lever harder means other systems have to be able to keep up; IBM also plans on steadily improving its internal and external coupling technology (already developed with scalability in mind for Heron) throughout further iterations, such as Flamingo's planned four versions, which still only end up scaling to 156 qubits per QPU.

Considering how you're solving scalability problems and the qubit quality x density x ease-of-testing equation, the ticks (the density increases that don't sacrifice quality and are feasible from a testing and productization standpoint) may be harder to unlock. But if one side of development is scalability, the other relates to the quality of whatever you're actually scaling: in this case, IBM's superconducting qubits themselves. Heron itself saw a substantial rearrangement of its internal qubit architecture to improve gate design, accessibility, and quantum processing volumes, not unlike an Intel tock. The planned iterative improvements to Flamingo's design seem to confirm this.

There's a sweet spot for the quantum computing algorithms of today: it seems that algorithms that fit roughly within a 60-gate depth are complex enough to allow for useful quantum computing. Perhaps thinking about Intel's NetBurst architecture with its Pentium 4 CPUs is appropriate here: too deep an instruction pipeline is counterproductive, after a point. Branch mispredictions are terrible across computing, be it classical or quantum. And quantum computing as we still have it in our Noisy Intermediate-Scale Quantum (NISQ) era is vulnerable to a more varied disturbance field than semiconductors are (there are world overclocking records where we chill our processors to sub-zero temperatures and pump them with above-standard volts, after all). But perhaps that comparable quantum vulnerability is understandable, given how we're essentially manipulating the essential units of existence, atoms and even subatomic particles, into becoming useful to us.

Useful quantum computing doesn't simply correlate with an increasing number of available in-package qubits (witness announcements of 1,000-qubit products based on neutral-atom technology, for instance). But useful quantum computing is always stretched thin throughout its limits, and if it isn't bumping against one fundamental limit (qubit count), it's bumping against another (instability at higher qubit counts), or contending with issues of entanglement coherence and longevity; entanglement distance and capability; correctness of the results; and still other elements. Some of these scalability issues can be visualized within the same framework of efficient data transit between different distributed computing units, such as cores in a given CPU architecture, and can themselves be solved in a number of ways, such as hardware-based information processing and routing techniques (AMD's Infinity Fabric comes to mind, as does Nvidia's NVLink).

This feature of quantum computing already being useful at the 133-qubit scale is also part of the reason why IBM keeps prioritizing quantum computing challenges around useful algorithms occupying a 100-by-100 gate grid. That quantum is already useful beyond classical, even in gate grids comparably small next to what we can achieve with transistors, points to how different these two computational worlds are.

Then there are also the matters of error mitigation and error correction, of extracting ground-truth-level answers to the questions we want our quantum computer to solve. There are also limitations in our way of utilizing quantum interference in order to collapse a quantum computation at just the right moment, so that we know we will obtain from it the result we want, or at least something close enough to correct that we can then offset any noise (non-useful computational results, or the difference of values ranging between the correct answer and the not-yet-culled wrong ones) through a clever, groundbreaking algorithm.

The above are just some of the elements currently limiting how useful qubits can truly be and how those qubits can be manipulated into useful, algorithm-running computation units. This is usually referred to as a qubit's quality, and we can see how it both does and doesn't relate to the sheer number of qubits available. But since many useful computations can already be achieved with 133-qubit-wide Quantum Processing Units (there's a reason IBM settled on a mere 6-qubit increase from Eagle to Heron, and only scales up to 156 qubits with Flamingo), the company is setting out to keep this optimal qubit width for a number of years of continuous redesigns. IBM will focus on making correct results easier to extract from Heron-sized QPUs by increasing the coherence, stability, and accuracy of these 133 qubits while surmounting the arguably harder challenge of distributed, highly parallel quantum computing. It's a one-two punch again, and one that comes from the bump in speed at climbing ever-higher stretches of the quantum computing plateau.

But there is an admission that it's a barrier IBM still wants to punch through: it's much better to pair 200 units of a 156-qubit QPU (that of Flamingo) than of a 127-qubit one such as Eagle, so long as efficiency and accuracy remain high. Oliver Dial says that Condor, "the 1,000-qubit product," is locally running, up to a point. It was meant to be the thousand-qubit processor, and was as much a part of the roadmap for this year's Quantum Summit as the actual focus, Heron, but it's ultimately not really a direction the company thinks is currently feasible.

IBM did manage to yield all 1,000 Josephson junctions within their experimental Condor chip, the thousand-qubit halo product that will never see the light of day as a product. It's running within the labs, and IBM can show that Condor yielded computationally useful qubits. One issue is that at that qubit depth, testing such a device becomes immensely expensive and time-consuming. At a basic level, it's harder and more costly to guarantee the quality of a thousand qubits and their increasingly complex possibility field of interactions and interconnections than to assure the same requirements in a 133-qubit Heron. Even IBM only means to test around a quarter of the in-lab Condor QPU's area, confirming that the qubit connections are working.

But Heron? Heron is made for quick verification that it's working to spec, that it's providing accurate results, or at least computationally useful results that can then be corrected through ZNE and other techniques. That means you can get useful work out of it already, while it is also a much better time-to-market product in virtually all areas that matter. Heron is what IBM considers the basic unit of quantum computation: good enough and stable enough to outpace classical systems in specific workloads. But that is quantum computing, and that is its niche.

Heron is IBM's entrance into the mass-access era of Quantum Processing Units. Next year's Flamingo builds further on the inter-QPU coupling architecture so that further parallelization can be achieved. The idea is to scale at a base, post-classical utility level and maintain that as a minimum quality baseline. Only at that point will IBM maybe scale density and unlock the appropriate jump in computing capability, when that can be achieved in a similarly productive way, and scalability is almost perfect for maintaining quantum usefulness.

There's simply never been the need to churn out hundreds of QPUs yet; the utility wasn't there. The Canaries, Falcons, and Eagles of IBM's past roadmap were never meant to usher in an age of scaled manufacturing. They were prototypes, scientific instruments, explorations; proofs of concept on the road toward useful quantum computing. We didn't know where usefulness would start to appear. But now we do, because we've reached it.

Heron is the design IBM feels best answers that newly created need for a quantum computing chip that actually is at the forefront of human computing capability, one that can offer what no classical computing system can (in some specific areas). One that can slice through specific-but-deeper layers of our Universe. That's what IBM means when it calls this new stage the quantum-centric supercomputing one.

Classical systems will never cease to be necessary: both of themselves and for the way they structure our current reality, systems, and society. They also function as a layer that allows quantum computing itself to happen, be it by carrying and storing its intermediate results or by knitting together the final informational state that maps out the correct answer quantum computing provides, one quality step at a time. The "quantum-centric" bit merely refers to how quantum computing will be the core contributor to developments in fields such as materials science, more advanced physics, chemistry, superconduction, and basically every domain where our classical systems were already presenting a duller and duller edge with which to improve upon our understanding of their limits.

However, through IBM's approach and its choice of transmon superconducting qubits, a certain difficulty lies in commercializing local installations. Quantum System Two, as the company is naming its new, almost wholesale quantum computing system, has been shown working with different QPU installations (both Heron and Eagle). When asked whether scaling Quantum System Two and similar self-contained products would be a bottleneck to technological adoption, IBM's CTO Oliver Dial said that it was definitely a difficult problem to solve, but that he was confident in the company's ability to reduce costs and complexity further in time, considering how successful IBM had already proven in that regard. For now, it's easier for IBM's quantum usefulness to be unlocked at a distance, through the cloud and its quantum computing framework, Qiskit, than it is to achieve it by running local installations.

Qiskit is the preferred medium through which users can actually deploy IBM's quantum computing products in research efforts, just like you could rent X Nvidia A100s of processing power through Amazon Web Services or even a simple Xbox Series X console through Microsoft's xCloud service. On the day of IBM's Quantum Summit, that freedom also meant access to the useful quantum circuits within IBM-deployed Heron QPUs. And it's much easier to scale access at home, serving them through the cloud, than delivering a box of supercooled transmon qubits ready to be plugged in and played with.
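
In practice, that cloud access looks roughly like the sketch below, assuming a saved IBM Quantum account and the qiskit-ibm-runtime package of that era; the primitives API has shifted between releases, so treat the exact calls as illustrative rather than definitive.

```python
# Requires: pip install qiskit qiskit-ibm-runtime, plus a saved IBM Quantum token.
from qiskit import QuantumCircuit, transpile
from qiskit_ibm_runtime import QiskitRuntimeService, Sampler

service = QiskitRuntimeService(channel="ibm_quantum")
backend = service.least_busy(operational=True, simulator=False)

# A Bell-state circuit as the smallest possible workload:
bell = QuantumCircuit(2)
bell.h(0)
bell.cx(0, 1)
bell.measure_all()

sampler = Sampler(backend=backend)            # V1-style primitive of that era
job = sampler.run(transpile(bell, backend))   # queued and executed in the cloud
print(job.result().quasi_dists[0])            # ~50/50 weight on '00' and '11'
```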

That's one devil of IBM's superconducting-qubit approach: not many players have the will, funding, or expertise to put a supercooled chamber into local operation and build the required infrastructure around it. These are complex mechanisms housing kilometers of wiring, another focus of IBM's development and tinkering, culminating in last year's flexible ribbon solution, which drastically simplified connections to and from QPUs.

Quantum computing is a uniquely complex problem, and democratized access to hundreds or thousands of mass-produced Herons in IBM's refrigerator-laden fields will ultimately only require, well, a stable internet connection. Logistics are what they are, and IBM's Quantum Summit also took the necessary steps to address some needs within its Qiskit Runtime platform by introducing its official 1.0 version. Food for thought is realizing that the era of useful quantum computing seems to coincide with the beginning of the era of quantum computing as a service as well. That was fast.

The era of useful, mass-producible, mass-access quantum computing is what IBM is promising. But now there's the matter of scale. And there's the matter of how cost-effective it is to install a Quantum System Two, or Five, or Ten, compared to another qubit approach, be it topological approaches to quantum computing, or oxygen-vacancy-based ones, ion traps, or others that are an entire architecture away from IBM's approach, such as fluxonium qubits. It's likely that a number of qubit technologies will still make it into the mass-production stage, and even then, we can rest assured that everywhere along the road of human ingenuity lie failed experiments, like Intel's recently decapitated Itanium or AMD's out-of-time approach to x86 computing in Bulldozer.

It's hard to see where the future of quantum takes us, and it's hard to say whether it looks exactly like IBM's roadmap, the same roadmap whose running changes we also discussed here. Yet all roadmaps are a permanently drying painting, both for IBM itself and for the technology space at large. Breakthroughs seem to be happening daily on each side of the fence, and it's a fact of science that the earlier the questions we ask, the more potential exists. The promising qubit technologies of today will have to answer actual interrogations on performance, usefulness, ease and cost of manipulation, quality, and scalability in ways that now need to be at least as good as what IBM is proposing with its transmon-based superconducting qubits, its Herons and scalable Flamingos, and its (still unproven, but hinted at) ability to eventually mass-produce useful numbers of useful Quantum Processing Units such as Heron. All of that, even as we remain in this noisy, intermediate-scale quantum (NISQ) era.

It's no wonder that Oliver Dial looked and talked so energetically during our interview: IBM has already achieved quantum usefulness and has started to answer the two most important questions, quality and scalability, through its Development and Innovation roadmaps. And it did so through the collaboration of an incredible team of scientists, delivering results years before expected, Dial happily conceded. In 2023, IBM unlocked useful quantum computing within a 127-qubit Quantum Processing Unit, Eagle, and walked the process of perfecting it toward the revamped Heron chip. That's an incredible feat in and of itself, and it is what allows us to even discuss issues of scalability at this point. It's the reason why a roadmap has to shift to accommodate it, and in this quantum computing world, it's a great follow-up question to have.

Perhaps the best question now is: how many things can we improve with a useful Heron QPU? How many locked doors have sprung ajar?

See more here:
IBM demonstrates useful Quantum computing within 133-qubit Heron, announces entry into Quantum-centric ... - Tom's Hardware

Read More..

Research group launches Japan’s third quantum computer at Osaka University – Fujitsu

Center for Quantum Information and Quantum Biology at Osaka University, RIKEN, National Institute of Information and Communications Technology, e-trees.Japan, Inc., Fujitsu Limited, NTT Corporation, QuEL, Inc., QunaSys Inc., Systems Engineering Consultants Co.,LTD.

Tokyo and Osaka, December 20, 2023

A consortium of joint research partners including the Center for Quantum Information and Quantum Biology at Osaka University, RIKEN, the Advanced Semiconductor Research Center at the National Institute of Advanced Industrial Science and Technology (AIST), the Superconducting ICT Laboratory at the National Institute of Information and Communications Technology (NICT), Amazon Web Services, e-trees.Japan, Inc., Fujitsu Limited, NTT Corporation (NTT), QuEL, Inc., QunaSys Inc., and Systems Engineering Consultants Co.,LTD. (SEC) today announced the successful development of Japan's third superconducting quantum computer (1), installed at Osaka University. Starting December 22, 2023, the partners will provide users in Japan access to the newly developed computer via the cloud, enabling researchers to execute quantum algorithms (2), improve and verify the operation of software, and explore use cases remotely.

The newly developed superconducting quantum computer uses a 64-qubit chip provided by RIKEN, which leverages the same design as the chip in RIKEN's first superconducting quantum computer, which was unveiled to users in Japan as a cloud service for non-commercial use on March 27, 2023 (3).

For the new quantum computer, the research team sourced more domestically manufactured components (excluding the refrigerator). The research team confirmed that the new quantum computer, including its components, provides sufficient performance and will utilize the computer as a test bed for components made in Japan.

Moving forward, the research group will operate the new computer while improving its software and other systems for usage including the processing of heavy workloads on the cloud. The research team anticipates that the computer will drive further progress in the fields of machine learning and the development of practical quantum algorithms, enable the exploration of new use cases in material development and drug discovery, and contribute to the solution of optimization problems to mitigate environmental impact.

The joint research group is comprised of: Dr. Masahiro Kitagawa (Professor, Graduate School of Engineering Science, Director of the Center for Quantum Information and Quantum Biology at Osaka University), Dr. Makoto Negoro (Associate Professor, Vice Director of the Center for Quantum Information and Quantum Biology at Osaka University), Dr. Yasunobu Nakamura (Director of the RIKEN Center for Quantum Computing (RQC)), Dr. Katsuya Kikuchi (Group Leader of the 3D Integration System Group of the Device Technology Research Institute at AIST), Dr. Hirotaka Terai (Executive Researcher at the Superconductive ICT Device Laboratory at the Kobe Frontier Research Center of the Advanced ICT Research Institute of NICT), Dr. Yoshitaka Haribara (Senior Startup Machine Learning and Quantum Solutions Architect, Amazon Web Services), Dr. Takefumi Miyoshi (Director of e-trees.Japan, Inc., Specially Appointed Associate Professor, Center for Quantum Information and Quantum Biology at Osaka University, CTO of QuEL, Inc.), Dr. Shintaro Sato (Head of Quantum Laboratory, Fujitsu Research, Fujitsu Limited), Dr. Yuuki Tokunaga (Distinguished Researcher at NTT Computer & Data Science Laboratories), Yosuke Ito (CEO of QuEL, Inc.), Keita Kanno (CTO of QunaSys Inc.), and Ryo Uchida (Chief Technologist of Systems Engineering Consultants Co.,LTD. (SEC)).

This research was supported by grants from:

The Center for Quantum Information and Quantum Biology consists of six research groups: Quantum Computing, Transdisciplinary Quantum Science, Quantum Information Devices, Quantum Communication and Security, Quantum Sensing, and Quantum Biology. QIQB promotes transdisciplinary research between each of these research groups and with other academic fields. The Center is an international research hub for quantum innovation by actively promoting international academic exchange and collaboration across borders. QIQB seeks to play a key role in nurturing future quantum leaders and specialists through education and training. For more information: https://qiqb.osaka-u.ac.jp/en/

RIKEN is Japan's largest research institute for basic and applied research. Over 2,500 papers by RIKEN researchers are published every year in leading scientific and technology journals, covering a broad spectrum of disciplines including physics, chemistry, biology, engineering, and medical science. RIKEN's research environment and strong emphasis on interdisciplinary collaboration and globalization have earned it a worldwide reputation for scientific excellence. Website: http://www.riken.jp/en/ | Facebook: http://www.facebook.com/RIKENHQ | X (formerly Twitter): @riken_en

As the only public research institution of Japan that specializes in the field of information and communications technology, the National Institute of Information and Communications Technology (NICT) promotes ICT R&D, from the foundational to the implementation, while collaborating with universities, industry, and domestic and overseas research institutions. NICT will be advancing R&D in the fields of advanced electromagnetic wave technology, innovative networks, cybersecurity, universal communication, and frontier science. Furthermore, NICT will also actively be promoting R&D in four strategic research fields (Beyond 5G, AI, quantum ICT, and cybersecurity), which are essential cutting-edge technologies for next-generation ICT infrastructure for the early realization of Society 5.0. For more information, visit https://www.nict.go.jp/en/.

e-trees.Japan, Inc. provides solutions and implements applications with powerful hardware and flexible software combinations. Our key concept is Keep It Simple and Smart. We primarily address R&D items, such as FPGA and network (with network protocol stack for FPGAs implemented by ourselves), embedded systems, and systems with low power consumption or renewable energy. Find out more: https://e-trees.jp/en/.

Fujitsu's purpose is to make the world more sustainable by building trust in society through innovation. As the digital transformation partner of choice for customers in over 100 countries, our 124,000 employees work to resolve some of the greatest challenges facing humanity. Our range of services and solutions draw on five key technologies: Computing, Networks, AI, Data & Security, and Converging Technologies, which we bring together to deliver sustainability transformation. Fujitsu Limited (TSE:6702) reported consolidated revenues of 3.7 trillion yen (US$28 billion) for the fiscal year ended March 31, 2023 and remains the top digital services company in Japan by market share. Find out more: http://www.fujitsu.com.

NTT contributes to a sustainable society through the power of innovation. We are a leading global technology company providing services to consumers and business as a mobile operator, infrastructure, networks, applications, and consulting provider. Our offerings include digital business consulting, managed application services, workplace and cloud solutions, data center and edge computing, all supported by our deep global industry expertise. We are over $97B in revenue and 330,000 employees, with $3.6B in annual R&D investments. Our operations span across 80+ countries and regions, allowing us to serve clients in over 190 of them. We serve over 75% of Fortune Global 100 companies, thousands of other enterprise and government clients and millions of consumers.

NTT Service Innovation Laboratory Group Public Relations: nttrd-pr@ml.ntt.com

QuEL, Inc. is an Osaka University-affiliated startup established in 2021. We are a team of experienced researchers and engineers with various backgrounds, strongly supporting quantum computing researchers by supplying novel qubit controllers. Find out more: https://quel-inc.com/

QunaSys is a Japanese startup engaged in advancing algorithms in chemistry to drive real-world applications of quantum technology. Our primary focus is on leveraging quantum computing potential by collaborating on research with industry leaders and fostering a community-driven approach within the QPARC industry network. Our flagship innovation, QURI, represents a user-friendly quantum computational web software. This platform allows users without specialized quantum algorithm expertise to engage in quantum calculations seamlessly. Additionally, our QURI Parts act as essential building blocks, aiding in the assembly of quantum algorithms into efficient Python code. For more information, visit us at https://qunasys.com/en

Systems Engineering Consultants (SEC) is a software development company specialized in real-time technology, contributing to the safety and development of society. We offer real-time software in four different business fields: mobile networking, internet technology, public infrastructure, and space, robotics and advanced technologies. Find out more: https://www.sec.co.jp/en/.

RIKEN
RIKEN Global Communications
X (formerly Twitter): @riken_en
Tel: +81-(0)48-462-1225
E-mail: pr@riken.jp

Fujitsu Limited
Public and Investor Relations Division
Inquiries

All company or product names mentioned herein are trademarks or registered trademarks of their respective owners. Information provided in this press release is accurate at time of publication and is subject to change without advance notice.

See original here:
Research group launches Japan's third quantum computer at Osaka University - Fujitsu

Read More..

Japan’s Third Superconducting Quantum Computer Installed at Osaka University – The Quantum Insider

Insider Brief

PRESS RELEASE A consortium of joint research partners announced the successful development of Japan's third superconducting quantum computer, installed at Osaka University.

The partnership includes the Center for Quantum Information and Quantum Biology at Osaka University, RIKEN, the Advanced Semiconductor Research Center at the National Institute of Advanced Industrial Science and Technology (AIST), the Superconducting ICT Laboratory at the National Institute of Information and Communications Technology (NICT), Amazon Web Services, e-trees.Japan, Inc., Fujitsu Limited, NTT Corporation (NTT), QuEL, Inc., QunaSys Inc., and Systems Engineering Consultants Co.,LTD. (SEC).

Starting December 22, 2023, the partners will provide users in Japan access to the newly developed computer via the cloud, enabling researchers to execute quantum algorithms, improve and verify the operation of software, and explore use cases remotely.

The newly developed superconducting quantum computer uses a 64-qubit chip provided by RIKEN, which leverages the same design as the chip in RIKEN's first superconducting quantum computer, which was unveiled to users in Japan as a cloud service for non-commercial use on March 27, 2023.

For the new quantum computer, the research team sourced more domestically manufactured components (excluding the refrigerator). The research team confirmed that the new quantum computer, including its components, provides sufficient performance and will utilize the computer as a test bed for components made in Japan.

Moving forward, the research group will operate the new computer while improving its software and other systems for usage including the processing of heavy workloads on the cloud. The research team anticipates that the computer will drive further progress in the fields of machine learning and the development of practical quantum algorithms, enable the exploration of new use cases in material development and drug discovery, and contribute to the solution of optimization problems to mitigate environmental impact.

The joint research group is comprised of: Dr. Masahiro Kitagawa (Professor, Graduate School of Engineering Science, Director of the Center for Quantum Information and Quantum Biology at Osaka University), Dr. Makoto Negoro (Associate Professor, Vice Director of the Center for Quantum Information and Quantum Biology at Osaka University), Dr. Yasunobu Nakamura (Director of the RIKEN Center for Quantum Computing (RQC)), Dr. Katsuya Kikuchi (Group Leader of the 3D Integration System Group of the Device Technology Research Institute at AIST), Dr. Hirotaka Terai (Executive Researcher at the Superconductive ICT Device Laboratory at the Kobe Frontier Research Center of the Advanced ICT Research Institute of NICT), Dr. Yoshitaka Haribara (Senior Startup Machine Learning and Quantum Solutions Architect, Amazon Web Services), Dr. Takefumi Miyoshi (Director of e-trees.Japan, Inc., Specially Appointed Associate Professor, Center for Quantum Information and Quantum Biology at Osaka University, CTO of QuEL, Inc.), Dr. Shintaro Sato (Head of Quantum Laboratory, Fujitsu Research, Fujitsu Limited), Dr. Yuuki Tokunaga (Distinguished Researcher at NTT Computer & Data Science Laboratories), Yosuke Ito (CEO of QuEL, Inc.), Keita Kanno (CTO of QunaSys Inc.), and Ryo Uchida (Chief Technologist of Systems Engineering Consultants Co.,LTD. (SEC)).

For more information: Center for Quantum Information and Quantum Biology, Osaka University

This research was supported by grants from:

Read more from the original source:
Japan's Third Superconducting Quantum Computer Installed at Osaka University - The Quantum Insider

Read More..

COO of Quantum Computing Inc., Bill McGann, Presents a New Quantum Intelligence Platform Leveraging Reservoir … – The Quantum Insider

Quantum Computing Inc. (QCi), a company based out of Leesburg, Virginia that specializes in quantum hardware and software, offers practical quantum computing solutions for business applications.

Just recently, in a video, the company's COO, Bill McGann, introduced a unique quantum intelligence platform using reservoir computers. This initiative aims to augment current artificial intelligence (AI) systems rather than replace them, focusing on a hardware-centric approach for edge deployment. This distinguishes it from the common software-centric strategies, highlighting its potential to enhance AI with advanced hardware innovations.

In the short presentation, McGann discusses QCi's platform, mentioning that reservoir computers are just the beginning. He emphasized that the plan is to gradually integrate quantum information into these platforms. The eventual goal is to develop a reservoir quantum computer, representing the next generation of their technology. McGann noted that this development is currently underway.

"Our contribution is not going to replace what artificial intelligence does today," said McGann. "But it's really good to enhance it."

McGann clarified that their focus isn't on replicating existing technologies like GPT. Instead, they aim to enhance current systems. He highlighted the uniqueness of their approach to quantum intelligence, machine learning (ML) and AI, and noted their commitment to hardware-based solutions.

"So everybody that does reservoir computing today, for the most part, we haven't found anybody that does it our way. Everyone's using, like, you know, software," he said.

To conclude, McGann explained their process as cloud-based, involving data input into a reservoir to generate output. This output is then used in training and applied to tests. Unlike other approaches, they implement this process in hardware, allowing for intelligence deployment at the edge, which is crucial for some applications. Their strategy for the quantum intelligence platform is therefore centred around a hardware solution.

This strategy, which integrates quantum information for edge deployment, may set QCi apart in the future, one where AI and ML will be all-important.

Featured image: Credit: QCi

Read more:
COO of Quantum Computing Inc., Bill McGann, Presents a New Quantum Intelligence Platform Leveraging Reservoir ... - The Quantum Insider

Read More..

Year of covers: Tech and sport, quantum advances and Gen AI – Technology Magazine

From groundbreaking breakthroughs in AI and quantum computing to the continued evolution of augmented and virtual reality, 2023 has witnessed a surge of innovation that is poised to revolutionise our world.

AI continues to evolve at an astonishing pace, with advancements in natural language processing (NLP) enabling more natural and intuitive human-computer interactions. Computer vision, another key AI domain, has made strides in image and video analysis, leading to improved object detection, facial recognition, and medical imaging capabilities. AI is also making significant contributions in drug discovery, medical diagnosis, and self-driving car development, further demonstrating its transformative potential.

The immersive worlds of augmented reality (AR) and virtual reality (VR) have taken significant steps forward, blurring the lines between the physical and digital realms. AR applications are becoming increasingly prevalent in gaming, education, and training, enhancing real-world experiences with digital overlays. VR, meanwhile, is gaining momentum in entertainment, healthcare, and remote collaboration, offering users immersive and interactive experiences.

Quantum computing, still in its early stages, holds immense promise for solving problems that are intractable for classical computers. Researchers are making progress in building and optimizing quantum computers, paving the way for breakthroughs in fields like materials science, drug discovery and AI.

All of these topics and more have featured in our magazine over the past 12 months, and the trends we have witnessed are likely to accelerate in the years to come. As 2023 comes to a close, join us for a review of Technology Magazine's covers from 2023.

Read the original post:
Year of covers: Tech and sport, quantum advances and Gen AI - Technology Magazine

Read More..

Quantum AI Brings the Power of Quantum Computing to the Public – GlobeNewswire

Luton, Dec. 20, 2023 (GLOBE NEWSWIRE) -- Quantum AI is set to bring the power of quantum computing to the public and has already reached a stunning quantum volume (QV) score of 14,082 in a year since its inception.

Quantum AI Ltd. was conceived by Finlay and Qaiser Sajjad during their time as students at MIT. They were inspired by the exclusive use of new-age technology by the elites on Wall Street. Recognising the transformative power of this technology, they were determined to make its potential accessible to all. Thus, the platform was born, and it has evolved and flourished in just a short time.

Quantum AI

Often, everyday traders have limited access to such advanced tools.

"We are fueled by the belief that the power of quantum computing should not be confined to the financial giants but should be available to empower amateur traders as well," asserted the founders of the platform. Since its launch in 2022, they have worked to achieve this vision and have become a significant force in the industry.

The platform combines the power of the technology with the strength of artificial intelligence. By using these latest technologies, including machine learning, algorithms that are more than just lines of code have been created. They harness the potential of quantum mechanics and deep learning to analyse live data in unique ways.

"Our quantum system leverages quantum superposition and coherence, providing a quantum advantage through sophisticated simulation and annealing techniques," added the founders.

Quantum AI has shown exceptional results in a brief period. It has received overwhelmingly positive reviews from customers, highlighting the enhanced speed and accuracy of trading. The transformative and groundbreaking impact the platform has had on trading is evident in its growth to 330,000 active members. Notably, it has nearly 898 million lines of code and an impressive quantum volume score of 14,082. The performance on this benchmark, which IBM established, is a massive testament to the impact Quantum AI has had in a short span of time.

According to the founders, they have bigger plans on the horizon to take the power of the technology to the public. Quantum AI is growing its team of experts and expanding its operations in Australia and Canada. Its goal of democratising the power of technology is well on its way to being realised. With trading being the first thing they cracked to pay the bills, the main focus has turned to aviation, haulage and even e-commerce.

To learn more about the platform and understand the transformative power of the technology for traders, one can visit https://quantumai.co/.

About Quantum AI

With the aim of democratising the power and potential of quantum computing, the company was founded by Finlay and Qaiser Sajjad during their time at MIT. Since its establishment, it has grown to over 330,000 active members and 18 full-time employees, alongside winning the trust of its customers.

###

Media Contact

Quantum AI
PR Manager: Nadia El-Masri
Email: nadia.el.masri@quantumai.co
Address: Quantum AI Ltd, 35 John Street, Luton, United Kingdom, LU1 2JE
Phone: +442035970878
URL: https://quantumai.co/

More:
Quantum AI Brings the Power of Quantum Computing to the Public - GlobeNewswire

Read More..