Category Archives: Quantum Computer
Alice & Bob-led Research Shows Novel Approach to Error Correction Could Reduce Number of Qubits For Useful … – The Quantum Insider
Insider Brief
PRESS RELEASE Alice & Bob, a leading hardware developer in the race to fault-tolerant quantum computers, in collaboration with the research institute Inria, today announced a new quantum error correction architecture using low-density parity-check (LDPC) codes on cat qubits to reduce the hardware requirements for useful quantum computers.
The theoretical work, available on arXiv, advances previous research on LDPC codes by enabling the implementation of gates as well as the use of short-range connectivity on quantum chips.
The resulting reduction in the overhead required for quantum error correction would allow the operation of 100 high-fidelity logical qubits (with an error rate of 10^-8) with as few as 1,500 physical cat qubits.
"Over 90% of quantum computing value depends on strong error correction, which is currently many years away from meaningful computations," said Jean-François Bobier, Partner and Director at the Boston Consulting Group. "By improving correction by an order of magnitude, Alice & Bob's combined innovations could deliver industry-relevant logical qubits on hardware technology that is mature today."
"This new architecture using LDPC codes and cat qubits could run Shor's algorithm with less than 100,000 physical qubits, a 200-fold improvement over competing approaches' 20-million-qubit requirement," said Théau Peronnin, CEO of Alice & Bob. "Our approach makes quantum computers more realistic in terms of time, cost and energy consumption, demonstrating our continued commitment to advancing the path to impactful quantum computing with error-corrected, logical qubits."
Cat qubits alone already enable logical qubit designs that require significantly fewer qubits, thanks to their inherent protection from bit-flip errors. In a previous paper by Alice & Bob and CEA, researchers demonstrated how it would be possible to run Shor's algorithm with 350,000 cat qubits, a 60-fold improvement over the state of the art.
LDPC codes are a class of efficient error correction codes that reduce the hardware required to correct errors occurring in information transfer and storage. By using LDPC codes on a cat-qubit architecture, this latest work not only shows how the qubit footprint of a fault-tolerant quantum computer could be further reduced but also overcomes two key challenges for the implementation of quantum LDPC (qLDPC) codes.
Alice & Bob recently announced the tape-out of a chip that would encode their first logical qubit prototype, known as Helium 1. Once logical qubits with a sufficiently low error rate are implemented, the cat-qubit LDPC code technique would let Alice & Bob harness the computing power of 100 logical qubits with as few as 1,500 physical qubits to run fault-tolerant algorithms.
With leading superconducting quantum computing manufacturers like IBM offering up to 1,121 physical qubits, outperforming classical computers in the simulation of quantum systems (quantum supremacy) is a milestone that would become attainable within current hardware capabilities using Alice & Bob's new architecture.
IonQ, Rescale team to put quantum, cloud-based HPC to work on AI – FierceElectronics
IonQ, one of the few companies that can lay claim to having sold a quantum computer, has partnered with Rescale, maker of accelerated computing software for cloud-based high-performance computing, in a match that could have big implications for users tackling large and complex AI projects.
The new partners said in a statement that they aim to merge the raw processing power of accelerated cloud computing with the unique problem-solving potential of quantum computing to tackle problems and applications in the realms of product development, healthcare, life sciences, financial services, materials research, logistics optimization, and national research labs.
IonQ has already sold two of its quantum computers to a European research lab, and Rescale's platform would serve as a springboard for users of IonQ's machines, such as its 29-algorithmic-qubit Forte, to be used on AI projects in a hybrid quantum computing/HPC environment, the companies said.
IonQ also just this week unveiled a more advanced 35-algorithmic-qubit machine, and as quantum computers continue to advance, they already are making a case for their ability to handle large AI and machine learning tasks.
At the recent Needham Growth Conference, IonQ officials spoke to quantum's advantage, with IonQ CFO Thomas Kramer saying, "AI runs on top of very large machine learning models, and what we've seen when they run on our quantum computers is that they're able to capture predicted outcomes [as well as] more outlying possible events (Black Swans, if you will) and on many fewer iterations."
He added that running some problems on classical computing infrastructure may require running the same query thousands or even millions of times. "We have seen that when you do that on a quantum computer (our quantum computers) you can get to the same predictive output or even better output by going through 1/1000th the number of iterations. We've also seen that in very large models we can get to the same predictive output with 1/1000th the number of input variables, which translates to not only a time savings, but also reduced cost and greater power efficiency."
"Because of that, we will be able to run machine learning problem sets [on quantum computers] that we can't do today [on classical computers] and will not be able to, because the training set is too large to do it in an economic fashion," Kramer said.
Jordan Shapiro, vice president of financial planning and analysis and head of investor relations, also in attendance at the Needham conference, concluded, "So imagine if you go to an OpenAI or any one of the thousands of companies now using these large language models and tell them that you can run and train their model cheaper, in less time, using less power, and with better accuracy and fewer input variables: that is a compelling use case."
The new partnership may demonstrate a new channel for Rescale, as it has not previously partnered with any quantum computing companies. To date, it has worked with many cloud providers and semiconductor firms, including Nvidia and AMD, on combining its software with their hardware on AI applications.
"Through seamlessly blending the largest full-stack integrated R&D capabilities and AI-powered computational workflows on Rescale with IonQ's cutting-edge quantum technology, we are embarking on a journey to accelerate engineering innovations and discover new science," said Joris Poort, CEO of Rescale. "This partnership not only accelerates R&D in fields such as engineering product development and life sciences exploration, but it creates a collaborative ecosystem where the boundaries of innovation are productively explored by the world's leading scientists."
"In our partnership with Rescale, we are exploring new ground in the realm of hybrid quantum computing," said Rima Alameddine, Chief Revenue Officer at IonQ. "IonQ's cutting-edge quantum computers, coupled with Rescale's R&D platform, form a dynamic duo poised to revolutionize how we approach healthcare, life sciences, financial services, and address national security challenges. This collaboration is about more than accelerating computer power; it's also about bringing the best of high performance computing, AI, and quantum computing to solve complex intractable problems and unlock unprecedented possibilities."
The companies also said the new partnership goes beyond hardware and software integration, fostering a collaborative environment where scientific expertise, computational power, and quantum know-how converge.
Quantum computing’s ChatGPT moment could be right around the corner – Cointelegraph
Tech experts across government, academia, and the private sector are methodically working to ensure the world's data is safe from the impending threat of quantum decryption. While this may represent the greatest technological threat this side of AI-wrought extinction, there may be some silver linings along the way.
At some point, possibly in the near future, researchers believe a quantum computing system capable of breaking RSA encryption (the standard that protects banks, military bases, and countless other institutions from hackers and spies) will emerge.
Related: WEF identifies AI and quantum computing as emerging global threats
Before that happens, however, several other quantum technology solutions will likely need to come into focus. Chief among them may very well be quantum sensing.
Jack Hidary, the CEO of SandboxAQ, a Google sibling company focused on quantum technologies, is certain we'll see scaled, fault-tolerant quantum computers by the end of the decade.
In a talk entitled "Quantum's Black Swan," given at the World Economic Forum, the CEO discussed the threat of quantum decryption as well as some of the potential breakthroughs we could see ahead of it.
Hidary predicts that "certainly by 2029-30, we're going to see scaled, fault-tolerant quantum computers," which could be capable of breaking encryption.
He's not the only one making predictions that would have seemed bold just a few years ago. IBM, currently considered the industry leader by many, says it'll hit an inflection point in quantum computing by 2029. And MIT/Harvard spinout QuEra claims it'll have a 10,000-qubit error-corrected quantum computer by 2026.
Theoretically speaking, any quantum computer capable of quantum advantage (outperforming classical binary computers at useful tasks) could break RSA encryption.
Luckily, as Hidary pointed out, groups around the world, including the U.S. government and IBM, have identified algorithms and policies that should be able to protect our data if they can be implemented in time.
It's likely we'll see a swell of related quantum technologies before the threat of quantum-based encryption breaking is realized. This could manifest in less powerful quantum computing systems capable of pushing beyond the limitations of today's binary supercomputers.
However, a more immediate quantum technology might be quantum sensing. According to Hidary, quantum sensors could fill in the gaps in our GPS system, perhaps even thwarting active attempts at obfuscating satellite signals.
There could be myriad uses for quantum sensors, ranging from medical applications involving deeper, more accurate, real-time body and brain scanning to potential implications for robotics capable of full, untethered autonomy.
Much like most AI experts and pundits couldn't have predicted the impact ChatGPT would have less than a decade after the seminal Generative Adversarial Networks paper was published, it might be difficult to determine just how quantum computing will break through from the lab to the mainstream.
Quantum Algorithm Can be Used to Optimize Telecommunication Networks – The Quantum Insider
Insider Brief
PRESS RELEASE The project, promoted by Cinfo and Kipu Quantum on the R infrastructure, applies the computing capacity offered today by quantum computing to the Galician operator's optical fiber backbone network, examining its robustness and resilience to potential outages and/or critical situations.
The newly designed quantum algorithm identifies the most sensitive nodes, the ones that could have the greatest impact on the service in the event of a disconnection or breakdown. This information makes it possible to focus on the points detected with current quantum technology and anticipate counteractions, achieving a maximum index of network availability and service excellence.
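As a loose classical analogue of that node-sensitivity analysis (this is not the quantum algorithm the project used, and the toy topology below is invented for illustration), a network's most critical points can be found as articulation points, i.e. nodes whose failure disconnects the graph:

```python
def critical_nodes(adj):
    # Classical articulation-point search (Tarjan-style DFS): finds nodes
    # whose removal disconnects the network. Illustrative analogue only,
    # not the project's quantum approach.
    disc, low, crit, timer = {}, {}, set(), [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in adj[u]:
            if v not in disc:
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                # non-root u is critical if subtree v cannot bypass it
                if parent is not None and low[v] >= disc[u]:
                    crit.add(u)
            elif v != parent:
                low[u] = min(low[u], disc[v])
        # root is critical only if it has more than one DFS child
        if parent is None and children > 1:
            crit.add(u)

    for node in adj:
        if node not in disc:
            dfs(node, None)
    return crit

# toy backbone: node "B" joins three spurs, so losing it splits the network
net = {"A": ["B"], "B": ["A", "C", "D"], "C": ["B"], "D": ["B"]}
print(critical_nodes(net))  # → {'B'}
```

A real backbone study would weight nodes by traffic and consider multiple simultaneous failures, which is where the combinatorial blow-up that motivates quantum optimisation appears.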
Cinfo, which has prepared the network model adapted to the capabilities of accessible quantum computers, has been supported by Kipu Quantum, which was in charge of preparing the model of the quantum algorithm to analyze the R backbone network.
Norberto Ojinaga, Director of Technology Solutions at R and the MASMOVIL Group, believes that the considered use case "is a realistic instance that allows a significant improvement in the quality and guarantee of the service that, as a telecommunications operator, we want to offer our customers."
In addition, Isidro Fernández de la Calle, Director of Business at R and the MASMOVIL Group, explains that "we manifest our strong engagement to provide our customers with a robust and resilient network in case of the most adverse circumstances; therefore, we cannot neglect the advances quantum computing offers now to simulate and prepare our environments to be managed with a maximal guarantee."
Analysis in two phases: quantum annealers and neutral atoms
Unlike classical computing, and thanks to the large number of qubits of neutral-atom quantum computers (256 today and about 1,000 expected in about a year), the piloted quantum algorithm takes the same amount of time regardless of the number of network nodes.
In the first phase of the R-Cinfo-Kipu project, an analysis was carried out for each node with currently available quantum computers. Specifically, an initial classification of the network topology was performed with a quantum annealer on 180 of the 5,627 qubits available in the quantum computers of the D-Wave company, allowing a sensible network segmentation.
In the second phase, concerning the examination of sublattices, QuEra's quantum computer based on neutral-atom technology was used, with 20 qubits for the solution of the main lattice structure and 46 qubits for the combined structure of sublattices.
This pioneering hybrid solution architecture employing various quantum computing technologies has been made possible by the commercial access providers offer through cloud services. Access to the QuEra platform is enabled by the AWS Braket service, while D-Wave provides its own services. Cloud quantum computing capabilities made these combined solutions possible by extracting the best from each of them.
The CEO of Cinfo, Antonio Rodríguez del Corral, points out that "at Cinfo, we have accepted the challenge of creating valuable use cases in quantum computing for industry. To this end, we are developing a team of professionals (graduates in Physics from the University of Santiago de Compostela) and selecting technology partners, such as Kipu Quantum, that will introduce us to the design of quantum algorithms and to the understanding of the different capabilities of existing quantum computers."
For Rodríguez del Corral, technology companies are needed to identify business problems where quantum computing can help, because "the customer does not have to know the details of a complex field that evolves at great speed. We try to find unsolved business problems in large companies and see whether quantum computing can provide better solutions. Once the challenge has been identified, the most appropriate algorithm and quantum hardware will have to be chosen, which may change in time as quantum computers develop."
Regarding the analysis done of the R network, the CEO of Cinfo explains that "the optimization of network traffic is always a key issue and, in complex networks such as the R network, it may require quantum computing. We proposed it, confirmed the fit, and this project was born."
On another note, Kipu Quantum's Chief Visionary Officer, Enrique Solano, comments that "quantum computers with digital, analog, and digital-analog encoding will move closer to quantum advantage for industrial use cases this year. Projects such as the one developed with R and Cinfo are a step forward towards the practical use of quantum processors with hundreds of qubits. Kipu Quantum is proud to contribute to the leaders and pioneers of the Galician quantum pole in the use of quantum technologies."
The CEO of Kipu Quantum, Daniel Volz, believes that this project should "lay the foundations for a long and fruitful collaboration with Cinfo, with the Galician industry and business community, as well as with the Spanish technological ecosystem in our joint path towards the usefulness of quantum computers in Europe."
Both quantum hardware companies and algorithm vendors have embraced the challenge of achieving quantum advantage in the short term, possibly within a couple of years. The startup Kipu Quantum aims to achieve this as soon as possible with its application- and hardware-specific algorithms, which are adapted to existing hardware. In addition, Kipu Quantum has the highest compression (i.e., reduction) of algorithms in quantum devices with digital, analog, and digital-analog encoding in optimization, logistics, finance, and artificial intelligence, as well as in the design of chemical molecules and materials. In this way, the best solutions can be extracted from quantum processors with qubits encoded in trapped ions, neutral atoms, or superconducting circuits.
Points for improvement on a realistic case
The topology of an advanced fiber network infrastructure such as R's requires a detailed study of the connectivity of its nodes and of the strategic points that facilitate an evaluation of its strength. For this reason, the R network has become a use case for Cinfo. By running algorithms or models on quantum computers, it is possible to detect points of improvement in the network and draw conclusions on which to act.
As quantum computers become more powerful, a larger number of more complex variables can be incorporated into the algorithmic study. In this way, the path established by this Cinfo project, in collaboration with its expert partner Kipu Quantum, will go ahead and exploit the more than predictable quantum hardware enhancements. In fact, it is expected that by 2025 these quantum computers may already be able to process the algorithm for a complex network such as the backbone of R and the entire MASMOVIL Group; basically, as the infrastructure grows, it is optimized and perfected.
All this shows that the use of quantum computing for the analytical and complete resolution of common problems in the industry is imminent. This will allow us to discard current approaches based on brute force and experience, which are not always effective in anticipating all scenarios.
Quantum technology: the black swans are gathering, claims start-up CEO – diginomica
Much has been written about whether there will be a tipping point for quantum technologies, a critical moment at which they suddenly become adopted at scale. The reality is that quantum and classical computing will co-exist long into the future, joining forces in a hybrid environment that plays to both of their strengths.
There is more good news. Quantum computers will probably become sufficiently powerful, fault-tolerant, and reliable to run some enterprise tasks this decade. But industry consensus suggests that use cases that are ideally suited to such devices will emerge more slowly. These might include applications that model the natural world and chaotic processes, crunch huge numbers, reveal hidden correlations in specialized data, or help researchers develop new materials and drugs alongside AI on classical devices.
But in the absence of a tipping point, might there be a so-called black swan moment for quantum instead? A sudden event that has unforeseen, perhaps negative, consequences? The answer is that such a crisis is approaching. We don't know precisely when it will hit, but we do know what it is and what will happen if we fail to prepare for it.
It is the threat to the global economy (to banking, ecommerce, supply chains, government systems, and everyday communications) that will arrive when quantum computing, or an emulation of it, can reliably and swiftly break the public-key encryption that underpins our secure transactions and communications.
In a forceful session entitled "Quantum's Black Swan" at the World Economic Forum in Davos last week, Jack Hidary, CEO of quantum sensor provider SandboxAQ, urged much greater urgency in building quantum-safe systems and post-quantum security than is normal at industry events.
He said:
Let's say we want to build a tunnel under a river. We don't just start from one side and keep going; we start from both sides and meet in the middle. So, by analogy, the hardware folks (IBM is doing an incredible job of advancing the superconducting quantum computing methodology, for example) are digging from one side. But the algo [algorithm] people are digging from the other.
What has happened recently is that, in paper after paper, we've seen that the number of qubits we need to crack RSA is coming down. So, the two sides will meet faster and faster under the river to make this tunnel that breaks the banking system, that breaks the telco system, that breaks the energy system, and breaks government secrets.
So, is there any good news? Yes and no, he said. On the one hand, the US National Institute of Standards and Technology (NIST) and others have come together to create newpost-quantum cryptography protocols. These are still being sought, tested, and finalized.
But on the other, he explained:
[The bad news is] it takes seven or eight years for a bank or government to transition to a new protocol. So, what's very important right now is that we understand that this [the development of quantum technologies] does not work in a linear fashion.
On its own page on quantum-resistant cryptography, NIST says:
Historically, it has taken almost two decades to deploy our modern public key cryptography infrastructure. Therefore, regardless of whether we can estimate the exact time of the arrival of the quantum computing era, we must begin now to prepare our information security systems to be able to resist quantum computing.
In Davos, Hidary added:
People got surprised by Gen-AI, and what's going to happen here is the same thing. At some point, people are going to say, "Wow, what a surprise, what a shock, that our cryptography is broken!"
Certainly by 2029-30, we're going to see scaled, fault-tolerant quantum computers. But you might say, "I need a certain number of qubits [to break encryption] today." But my prediction is that the number of qubits is going to come down.
Think about the brass prize of being able to decrypt everything in the world! This is a major issue. [...] We have to act now.
So, there may be a global crisis conceivably as early as this decade unless organizations treat this foreseeable event as an urgent, real-world problem, and not as a long-term theoretical one. Less of a serene black swan, in fact, and more of a rampaging bear.
But was Hidary just trying to raise his own profile and using the World Economic Forum to do it? Perhaps, and he would not be the first. But it is equally possible that he has detected troubling signals amidst the industry noise.
The key issue (in every sense) is this: it is not as if the method for cracking RSA encryption, for example, is a secret. It just comes down to maths.
Peter Shor, a Professor of Applied Mathematics at MIT, proposed what became known as Shor's Algorithm 30 years ago. This is a method for factoring semi-prime numbers on a quantum computer (theoretical when he proposed it) exponentially faster than on a classical device. (This is due to a qubit's ability to superimpose multiple states, compared with the binary on or off of a classical bit.)
In this way, such an algorithm would, if run successfully, negate the security assumptions that underpin asymmetric cryptography: namely, that the timescales for running the required calculations on a classical computer (billions of years to crack the minimal standard for secure encryption, RSA-2048) make it practically impossible. (The computation required grows exponentially larger with each digit in a sequence.)
By contrast, the only obstacles to using a quantum computer to run Shor's algorithm, or some evolution of it, are the number of qubits (estimates range from one per bit all the way up to 20 million for cracking 2048-bit encryption) and the fact that their subatomic nature makes them noisy and prone to error.
So, a quantum computer simply needs to be both powerful enough and fault tolerant, or self-correcting.
Most researchers believe that such an algorithm can't run at present, and certainly not for keys that have hundreds or thousands of digits. But it is purely a matter of time, though opinions differ as to whether that might be within a decade, a lifetime, or something closer to geological time.
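For intuition, the number theory behind Shor's method can be sketched for a toy modulus. Only the order-finding step (brute-forced below) is what a quantum computer accelerates exponentially; everything else is classical post-processing. The function names are illustrative, and the example factors 15, not anything of cryptographic size:

```python
from math import gcd

def find_order(a, n):
    # smallest r > 0 with a**r ≡ 1 (mod n); this is the step a quantum
    # computer speeds up exponentially, brute-forced here classically
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    # classical sketch of Shor's post-processing for a chosen base a
    if gcd(a, n) != 1:
        return gcd(a, n)   # the guess already shares a factor with n
    r = find_order(a, n)
    if r % 2 != 0:
        return None        # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None        # trivial square root: retry with another a
    return gcd(y - 1, n)   # a nontrivial factor of n

print(shor_classical(15, 7))  # → 3 (7 has order 4 mod 15; gcd(7**2 - 1, 15) = 3)
```

The brute-force loop in `find_order` is exactly what scales exponentially with key length on classical hardware; a quantum computer replaces it with period-finding via the quantum Fourier transform.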
There is a problem, however. And that is: what if encryption is much closer to being cracked than most researchers believe?
Unsurprisingly, this is a matter of claim and counterclaim for anyone keen to make a name for themselves, or to spook rival governments. For example, a year ago, a group of Chinese researchers claimed that a 2048-bit RSA key could, theoretically, be broken by running the similarly named Schnorr's Algorithm on a quantum device of only a few hundred qubits.
That is troubling, given that the latest quantum hardware is up to the 1,000-qubit mark already, while smaller, more fault-tolerant devices exist too. However, others have claimed that this algorithm works well enough to crack, say, a 48-bit key, but cannot scale to much larger numbers. As a result, the computation would fail.
Meanwhile, in November 2023, veteran researcher and Physics PhD Ed Gerck made a truly astonishing claim: that he had broken RSA-2048 encryption in seconds using quantum emulation on a cellphone, using an "all states at once" technique called simultaneous multifactor logic.
In the absence of formal publication of his research, or any peer-reviewed data, the security community remains deeply sceptical. Even so, the problem facing the industry is that even the most sensational or unlikely claims can't just be dismissed, despite astronomer Carl Sagan's aphorism that extraordinary claims require extraordinary evidence. The stakes are simply too high.
One reason is the possibility, however remote, that a researcher might have made a giant leap forward, or spotted a flaw in orthodox thinking; consider how Einstein's thought experiments a century ago transformed our picture of time, space, and gravity, for example.
Another is the phenomenon known as Store Now, Decrypt Later (SNDL): the awareness that any number of organizations, hackers, or hostile states will have been hoarding others' encrypted data for decades, and are just waiting for the breakthrough that enables them to read it.
For this reason, Gerck urged authorities to retire RSA and implement quantum-safe standards as soon as possible. Even if his own claim proves to be bunkum, that sounds like sensible advice.
But the risk of a quantum computer breaking strong encryption is not the only black swan that might arise from the technology, or demand its urgent adoption. According to Hidary (who was on a panel with Ana Paula Assis, EMEA Chair of the IBM Corporation, and Joël Mesot, President of ETH Zurich), another black swan is already with us.
Quantum sensors are essential today, he explained, because of problems with the satellite-based GPS systems that we all use to navigate, plus the inaccuracy of others in the medical profession.
He said:
What if GPS is not available? Over huge swathes of the ocean right now in the Pacific Rim area, particularly near Taiwan, there is no GPS. And over huge swathes of the Middle East, GPS is not only being jammed, but being spoofed. Four planes went into Iranian airspace in the last four months, unintentionally. So, this is a major issue.
But we can use quantum sensors to detect the unique magnetic footprint of every square meter on earth, in the same way that birds and whales navigate.
Boeing and Airbus are among the aerospace companies that have been investing in quantum navigation and timing research in Boeings case, as far back as 2018.
Quantum sensors are also 24 months away from being approved for use in hospitals to monitor patients' hearts more accurately, claimed Hidary, thus avoiding the problem of traditional sensors missing a defect. However, such devices demand AI running on a classical computer to pull the signal from the noise of the many other sources of electromagnetic radiation.
He explained:
The [magnetic] signal from a heart is very, very faint; so faint that you need a quantum sensor to pick it up. But there is so much other noise, so much other information. If you have an iPhone, if you have a smartwatch, or any of the other magnetic signals in this room, it can confound that sensor. So, we have to pass it through a GPU into an AI model, trained on the data of what a heartbeat looks like.
This convergence of AI and quantum is what's happening now. We need to move into the quantum realm to understand our own bodies. First the heart. And then, of course, the brain.
While little of what Hidary said is new (these issues have been known about, conceptually, for decades), the force of his argument, and its delivery, was unusual. As a result, the possibility that these challenges might be more urgent than most researchers believe cannot be ignored.
Watch out for those black swans!
The secret tech investor: Prepare for the quantum leap – Citywire
The rapid advance of artificial intelligence (AI) has led to the proposition that we are now in the era of AI. This raises the question of what comes next.
According to Michio Kaku's illuminating book Quantum Supremacy, the next era after AI will be quantum computing (QC). Quantum physics, once considered an abstract concept, now permeates numerous everyday activities, from photosynthesis to MRI scans and the behaviour of electrons on the nanoscale within semiconductors.
Is it too early to invest in this forthcoming wave now? Almost certainly yes, but it is time to start preparing. Here's a briefing note to help you know more about QC, with one ETF investment opportunity you can consider now.
The key differentiator in QC is the use of qubits instead of bits. Qubits can exist in multiple states simultaneously, unlike classic computer bits, which can only be in one of two exclusive states: 1 or 0. This fundamental distinction means that while a classic computer's power increases linearly with the number of transistors, a quantum computer's power grows exponentially in relation to the number of qubits linked together.
Furthermore, quantum computers can leverage quantum algorithms that make use of quantum superposition or entanglement, reducing the time complexity of the algorithm and fundamentally accelerating problem-solving capabilities. Given that much of the interaction of real-world particles is quantum by nature, it is intuitive that using quantum technologies to simulate and predict their behaviour will offer more authentic results.
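As a rough, purely classical illustration of that exponential gap: describing the state of an n-qubit register takes 2^n complex amplitudes, so even simulating a modest register on a classic computer means tracking an exponentially large vector. (The function name below is illustrative, not from any quantum library.)

```python
from math import sqrt

def uniform_superposition(n):
    # State vector of n qubits after a Hadamard gate on each: 2**n equal
    # amplitudes. A classical simulator must store and update every one,
    # which is why the cost explodes as n grows.
    dim = 2 ** n
    amp = 1 / sqrt(dim)
    return [amp] * dim

state = uniform_superposition(3)
print(len(state))                               # → 8 amplitudes for 3 qubits
print(round(sum(a * a for a in state), 10))     # → 1.0 (probabilities sum to one)
print(len(uniform_superposition(20)))           # → 1048576: 20 qubits, a million amplitudes
```

At 50 qubits the vector would have about 10^15 entries, beyond any practical classical memory, which is the intuition behind the exponential-power claim above.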
Advanced QC has the potential to revolutionise and solve complex real-life problems that are currently intractable for classic computers, even when using AI. It is worth noting that some of the great advances we are likely to see during this era may be the product of a collaboration between AI and QC.
Many computer scientists have proposed that artificial general intelligence (AGI) will only be reached once AI is working in full collaboration with QC. AGI is when computer programs exhibit the ability to understand, learn and apply knowledge across a wide range of tasks, essentially mirroring generalised human cognitive abilities.
The potential of QC for AI is immense. Quantum machine learning could classify larger datasets in less time, and quantum neural networks could process information in methodologies that classic neural networks cannot. While existing AI tools are powerful and practical for many applications today, QC represents a new frontier with the potential to significantly advance the field.
Some of the other areas where QC could have a significant real-world and equity market impact include:
Cybersecurity: Quantum computers could break widely used encryption methods that rely on the difficulty of factoring large numbers. Conversely, they could enable the development of quantum-resistant encryption algorithms to ensure data security in a post-quantum era. There is no reason why the current crop of cybersecurity players could not be at the forefront of quantum encryption and cybersecurity, but it is quite possible that new players will emerge, changing the cast of major sector players. Hence, I would suggest there is medium-level QC disruption risk in this sector.
Drug discovery and materials science: Quantum computers can simulate the behaviour of molecules and materials at the quantum level with high precision. Because molecular interaction is itself a quantum process, classical computing is highly limited in its ability to process or simulate complex molecular problems.
QC will improve the discovery of new drugs and materials, and potentially revolutionise the pharmaceutical and materials science industries. Drug discovery is likely to become less expensive, more predictable and quicker to market through better simulation.
In turn, this should reduce costs throughout the industry and may lead to efficacy cliffs for existing blockbuster drugs. I see the QC disruption risk to the pharmaceutical sector as high.
Materials: We could see a scenario in which raw elemental materials prove inferior for their respective end uses compared with QC-discovered synthetic composites, which may be formulated more efficiently from more common elements or compounds.
Supply chain optimisation: The current Red Sea flashpoints and the Covid-era supply chain issues show how vital logistics and supply chains are to a functioning global economy. QC can optimise supply chains by efficiently solving large-scale logistical and transportation problems, reducing costs and improving overall efficiency. Improved supply chains should reduce working capital levels and hence increase the return on invested capital for companies, which should theoretically drive general equities higher.
Energy production and storage: QC may succeed in finding efficient routes to nuclear fusion (a quantum phenomenon), resulting in a near-boundless energy supply, discovering new materials for energy production and storage, and potentially accelerating the development of renewable energy technologies while improving energy efficiency. The oil & gas sector will certainly go the way of the stagecoach, and I would guess the electricity utility companies will suffer demand destruction as customers shift to at-home generation.
Climate modelling: Weather emerges from physical processes that are quantum at the molecular level, which is one reason classical prediction models are of limited efficacy. Simulating the behaviour of molecules, particles and climate systems at a quantum level could improve the accuracy and speed of climate models. This can aid in understanding climate change and developing strategies to mitigate its effects. It would also allow insurance companies to price premiums with more certainty, lowering this cost for many businesses.
As ever with advances in technology, QC will have a transformative effect on the technology and software sectors and, as with AI, I suspect the software sector will again bear the brunt of disruption risk. One company that will have to respond and adapt to QC is simulation software specialist Ansys, which we discussed in our last article.
It's important to note that while QC holds immense promise, it is still in the early stages of development, and practical, large-scale quantum computers are not yet widely available. Additionally, the full extent of their capabilities and limitations is still being explored by a very limited number of companies.
In the rapidly evolving landscape of quantum computing, several companies are at the forefront of driving the development and application of QC technologies. These companies are making strides in advancing the capabilities of quantum computing, with each taking a unique approach to this transformative technology.
Microsoft is actively working on delivering quantum at scale by engineering a unique, stable qubit and bringing a full-stack, fault-tolerant quantum machine to its Azure cloud platform. CEO Satya Nadella has emphasised the importance of QC in the company's cloud computing strategy, positioning it as the next-generation cloud. Microsoft has also launched a programming language called Q#, which can be used for simulating quantum algorithms and developing quantum software.
Google Quantum AI achieved a significant milestone in 2019 by demonstrating quantum supremacy, with its 53-qubit Sycamore quantum computer performing a calculation in 200 seconds that, Google estimated, would have taken the world's fastest supercomputer 10,000 years. The company's Quantum AI research team has continued to push the boundaries of QC by reducing error rates as qubit counts increase, boasting a 3-5x performance enhancement over previous models. Google Quantum AI has also been expediting the prototyping of hybrid quantum-classical machine learning models.
Intel is actively involved in the development of QC chips, such as the 49-qubit Tangle Lake quantum chip. The company's focus on advancing the state of the art in QC reflects its commitment to driving innovation in this transformative technology.
IBM is a leader in QC and has launched advanced quantum computer systems. Last month it introduced the Heron processor, featuring 133 fixed-frequency qubits, which marks a significant leap in QC performance and error reduction. IBM believes that practical, effective QC will be available by 2033 and has set a development roadmap to that date.
Alpine Quantum Technologies (AQT) has taken a distinctive approach to QC by focusing on trapped-ion quantum technology. The company has developed a 20-qubit ion trap quantum computer using complex laser systems to control the trapped ions. This approach sets AQT apart from companies pursuing other types of QC technology, such as superconducting qubits or silicon-based approaches.
Rigetti Computing's approach to QC is unique in its focus on a multichip strategy for building quantum processors. This involves connecting several identical processors into a large-scale quantum processor, which, the company claims, reduces manufacturing complexity and allows for accelerated, predictable scaling.
While it is still early to predict which companies will emerge as the winners in the QC space, investing in a sector or thematic-based ETF or equity basket may be a prudent option in such uncertainty.
The Defiance Quantum ETF (QTUM) is a multi-cap global fund using the BlueStar Quantum Computing and Machine Learning index as a benchmark.
The ETF primarily includes semiconductor and software companies (AI star Nvidia is the fifth-largest holding) involved in the research, development, and commercialisation of QC systems and materials. The fund has assets of about $200m and rose almost 40% in 2023. I would be surprised if the larger US investment banks don't start to launch similar quantum-tracking equity baskets.
In conclusion, the era of QC holds great promise for reshaping our technological landscape and solving complex problems that are currently beyond the capabilities of classical computing. It should have far-reaching implications for the valuation of various sectors of the equity market.
The economic value unlocked by QC combined with its potential to reshape industries underscores the importance of understanding and preparing for the implications of this transformative technology on the equity market. The integration of AI and QC is expected to drive significant advancements, potentially leading to the realisation of artificial general intelligence.
So keep an eye on things QC-related and be prepared to invest should events dictate.
What might trigger such a change? Well, Elon Musk has been remarkably silent on the subject, so maybe we should use any change in that state as a signal.
Be warned, though: from the bits of classical computing we have suffered Bitcoin, so it will not be long before the perma-perplexed fellow down the pub is boring you about QuBitcoins.
The Secret Tech Investor is an experienced professional who has been running tech assets for more than 20 years. The author may have long or short positions in the stocks mentioned.
See the original post here:
The secret tech investor: Prepare for the quantum leap - Citywire
Preparing for Post-Quantum Cryptography: Trust is the Key – Embedded Computing Design
January 23, 2024
Blog
The era of quantum computing is on its way, and governments and the private sector have been taking steps to standardize post-quantum cryptography. With the advent of this new era, we are faced with new opportunities and challenges. This article will outline the potential impact of quantum computing and discuss strategies for preparing ourselves amid these anticipated changes.
In 1980, Paul Benioff first introduced quantum computing (QC) by describing a quantum-mechanical model of computation. In classical computing, data is processed using binary bits, which can be either 0 or 1, whereas quantum computing uses quantum objects called qubits. Qubits can occupy superpositions of states beyond a plain 0 or 1, enabling certain calculations to run far faster than on classical hardware. To be more specific, for some problems a quantum computer can finish in hundreds of seconds a series of operations that would take a classical computer thousands of years. In fact, IBM launched the first quantum computer with more than 1,000 qubits in 2023.
Nevertheless, the speed boost of quantum computing can have double-edged consequences. Modern cryptographers have been concerned about the potential impact on the security of public-key crypto algorithms. Schemes regarded as unbreakable are now at risk, as a cryptographically relevant quantum computer (CRQC) can make short work of decryption. For instance, the most popular public-key cryptosystem, Rivest-Shamir-Adleman (RSA), was previously considered secure because of the difficulty of its inverse computation. Under Shor's algorithm, however, where the quantum speedup is particularly pronounced, that once-reliable hardness assumption becomes CRQC-vulnerable. As such, the US National Institute of Standards and Technology (NIST) has been promoting the standardization of post-quantum cryptography (PQC). In addition, the National Security Memorandum (NSM-10) was issued in 2022 in response to the threat posed by CRQCs.
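The threat to RSA comes down to order finding: Shor's algorithm finds the multiplicative order r of a base a modulo N, and the quantum speedup lies entirely in that step. The sketch below (plain Python; the order is found by brute force rather than by a quantum Fourier transform, so this is only a classical skeleton of the idea) shows how a known order yields the factors.

```python
from math import gcd

def find_order(a, n):
    """Multiplicative order of a mod n, by brute force.
    (This is the step a quantum computer would do exponentially faster.)"""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical skeleton of Shor's algorithm for a chosen base a."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g              # lucky guess already factors n
    r = find_order(a, n)
    if r % 2 == 1:
        return None                   # odd order: retry with another base
    half = pow(a, r // 2, n)
    if half == n - 1:
        return None                   # trivial square root: retry
    p = gcd(half - 1, n)
    return p, n // p

print(shor_factor(15, 7))             # order of 7 mod 15 is 4 -> (3, 5)
```

For a 2048-bit RSA modulus the order-finding loop above is hopeless classically, which is exactly the gap a CRQC closes.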
In fact, when it comes to quantum computing, there are still many issues on which researchers cannot agree. In the current noisy intermediate-scale quantum (NISQ) era, it is still unclear what the ideal architecture of a quantum computer is, when we can expect the first CRQC, and how many qubits such a machine will need. Take, for example, the minimum number of qubits that would qualify a quantum computer as a CRQC. Google estimated that it may be 20 million qubits, but with a different quantum algorithm, Chinese researchers in 2022 proposed their own integer-factoring approach, claiming that only 372 qubits are needed to break a 2048-bit RSA key.
Despite the various quantum computing issues, researchers have a consensus on the necessity and urgency of the PQC transition. Based on the guidelines proposed by both public and private sectors, we have concluded the following key points for a smooth PQC transition:
The above suggestions are, in fact, not dependent on the PQC standards, and the preparations can start now. It is important to keep in mind that overall system security remains the top priority in both the classical computing and the PQC era. The transition will not affect all of the classical cryptographic algorithms we are familiar with. That is, the currently NIST-recommended AES-256 cipher and SHA-384 hash algorithm remain acceptable (though not ideal) in the post-quantum world.
The full transition to PQC may span many years, giving us time to examine PQC readiness and stay crypto-agile. According to the National Security Memorandum (NSM-10), the winners of the final round of NIST's PQC standardization are expected to be announced in 2024, and organizations should start the clock then. Table 1 compares the algorithms already selected for NIST standards with their classical counterparts in terms of public key and ciphertext/signature size (in bytes). More importantly, any system built today should remain flexible enough to accommodate future modifications, on the understanding that what appears quantum-safe today may not remain so for long.
Table 1: Candidates in NIST's PQC standardization
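The size comparison the table draws can be sketched with approximate figures from the published parameter sets (a hedged illustration only; the exact values, and the scheme names, should be checked against the final FIPS 203/204 documents and the original table):

```python
# Approximate sizes in bytes from the published parameter sets;
# treat these as illustrative, not normative.
sizes = {
    #  scheme                    (public key, ciphertext or signature)
    "RSA-2048 (classical)":      (256, 256),
    "ECDSA P-256 (classical)":   (64, 64),
    "ML-KEM-768 / Kyber":        (1184, 1088),
    "ML-DSA-44 / Dilithium2":    (1312, 2420),
    "Falcon-512":                (897, 666),
}

for scheme, (pk, blob) in sizes.items():
    growth = (pk + blob) / (256 + 256)   # rough ratio vs. RSA-2048
    print(f"{scheme:<26} pk={pk:>5} B  ct/sig={blob:>5} B  (~{growth:.1f}x RSA-2048)")
```

The practical point is the one the article makes: post-quantum keys and signatures are several times larger than their classical counterparts, so protocols, certificates and embedded storage budgets all need headroom, and systems must stay crypto-agile.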
Security concerns and levels will continue to evolve as quantum computing advances, making a more robust secure key-storage system, such as NeoPUF, necessary. When all is said and done, security is all about trust: without a foundation of trust, neither the classical RSA public-key algorithm nor a lattice-based PQC algorithm is effective. Since important system keys should be highly random and impossible to guess, secure methods for establishing trust in a system will become increasingly important in the post-quantum world. An even stronger base of trust, a hardware root of trust (HRoT), must be implemented in the hardware, as a software root of trust alone is no longer considered sufficient. The most robust form of such internal provisioning is PUF-based. Having delivered trust on multiple foundry platforms, eMemory and its subsidiary PUFsecurity are highly credible, and experienced solution providers such as these will remain a strong choice now and into the post-quantum world.
To learn more about post-quantum cryptography, please read the full article on the PUFsecurity website.
Lawrence Liu is a leading member of the R&D team at PUFsecurity, bringing over 25 years of experience working with NAND Flash, NOR Flash, and DRAM. Before specializing in memory design, he graduated from the mid-peninsula university affectionately known as The Farm with BSEE/MSEE degrees, specializing in computer architecture.
View original post here:
Preparing for Post-Quantum Cryptography: Trust is the Key - Embedded Computing Design
Quantum Reinforcement Learning's Impact on AI Evolution | by The Tech Robot | Jan, 2024 – Medium
Quantum Reinforcement Learning's Impact on AI Evolution
The quickly developing field of quantum computing holds great promise for enhancing machine learning beyond what conventional computers allow. Quantum computers can be more effective than conventional computers because they can manage complex correlations between inputs, and they promise data processing and storage capacity far beyond modern supercomputers.
Quantum computing is an area of study that integrates computer science, physics, and mathematics to tackle complicated problems more quickly than ordinary computers. To solve specific problems, it employs quantum mechanical processes such as coherence and quantum interference. Future uses include machine learning, optimization, the modeling of physical systems, portfolio optimization and chemical simulation.
Qubits store data using what is known as the superposition principle in quantum computing. This enables qubits to be in many states at the same time. Quantum machine learning (QML) augments regular machine learning software with quantum devices. Quantum computers offer substantially more storage and processing power than ordinary computers, enabling them to analyze massive volumes of data that older technologies would take much longer to handle. With this extraordinary processing power, QML can speed up and improve the development of machine learning models, neural networks, and other kinds of quantum artificial intelligence (AI).
The blend of quantum and machine learning yields four major combinations, based on whether the data is quantum (Q) or classical (C) and whether it is processed on a quantum or classical computer.
1. CC: a classical dataset analyzed on classical computers. This is classical machine learning (ML), which has no direct quantum basis but can draw principles from quantum machine learning theory.
2. QC: a quantum dataset analyzed on classical computers, applying classical machine learning to data produced by quantum systems. This technique would address, for example, the problem of classifying quantum states produced by physical experiments.
3. CQ: classical datasets processed by quantum computers. In a nutshell, quantum computers are employed to find faster solutions to problems that have previously been solved using ML. Traditional tasks, like image categorization, are fed into quantum machines to discover the best algorithm parameters.
4. QQ: the purest form, using quantum computers that operate solely on quantum states. The outcome of a quantum simulation is fed directly into a quantum machine learning system.
Potential benefits of QML include:
1. Quantum neural network training that exploits positive and negative interference.
2. Multi-state exploration and convergence are accelerated by quantum reinforcement learning.
3. Run-time optimization, providing speedier outcomes, and enhancements to learning capacity, such as increased connection or content-addressable memory capacity.
4. Advances in learning efficiency: Depending on the degree of training knowledge required, the same data may be used to learn more complicated relations or simpler models.
Key challenges remain:
1. Limited quantum hardware: current noisy intermediate-scale quantum (NISQ) systems offer only a small number of usable qubits for modeling. Millions of qubits are expected to be required for practical usefulness.
2. Creating quantum-ready data: it is difficult to encode standard data in quantum state representations, and the bulk of today's data lacks underlying quantum structure.
3. Algorithm design: To reap the benefits of QML, new quantum-optimized machine learning frameworks and approaches, such as deep learning, are required.
4. Software infrastructure: Because quantum development frameworks are presently in their infancy, integrating them with regular Machine Learning technologies and workflows is difficult.
5. Limited training datasets: there is insufficient labeled quantum data available. Although artificial dataset generation helps, it has limitations.
6. Inadequate skills: only a handful of researchers currently work on QML at the intersection of quantum research and AI.
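The data-encoding challenge above can be illustrated with two textbook encodings, sketched in plain Python (no quantum SDK assumed; the function names are illustrative): angle encoding maps each feature to a qubit rotation, while amplitude encoding normalises a length-2^n vector into the amplitudes of n qubits.

```python
from math import cos, sin, sqrt

def angle_encode(features):
    """Angle encoding: one qubit per feature x -> cos(x)|0> + sin(x)|1>."""
    return [(cos(x), sin(x)) for x in features]

def amplitude_encode(vector):
    """Amplitude encoding: a length-2**n vector becomes the state of n
    qubits, so 8 features need only 3 qubits -- but preparing such a
    state on real hardware is costly."""
    norm = sqrt(sum(v * v for v in vector))
    return [v / norm for v in vector]

state = amplitude_encode([3.0, 4.0])      # a valid 1-qubit state
print(state, sum(a * a for a in state))   # squared amplitudes sum to 1
```

Angle encoding is cheap to prepare but uses one qubit per feature; amplitude encoding is qubit-efficient but expensive to load, which is one concrete reason encoding classical data remains a bottleneck for QML.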
Quantum machine learning (QML) is a new field at the intersection of AI and quantum computing, with the potential for spectacular outcomes thanks to developments in quantum hardware, algorithms, and collaboration between academics and engineers. Take classes, join communities, or experiment with cloud-based tools to participate in this exciting future.
The rest is here:
Quantum Reinforcement Learnings Impact on AI Evolution | by The Tech Robot | Jan, 2024 - Medium
India’s Emergence Makes It a Critical Partner For The Western Quantum Ecosystem – The Quantum Insider
Insider Brief
Recently, China and Russia announced that they successfully tested quantum communication over a distance of 3,800 kilometers, using secure keys transmitted by China's quantum satellite. Although no peer-reviewed research is available or expected, the satellite could give the two nations a fully secure, unhackable communications link.
The Eurasian Times is now reporting that India, one of the BRICS nations (Brazil, Russia, India, China and South Africa), was invited to join this quantum communication project.
At last year's Future Technologies Forum, Russian President Vladimir Putin said he expected to work with Indian partners, particularly on cutting-edge computing technology and data processing, storage and transmission technologies.
The partnership would benefit the China-Russia project greatly: India's quantum communication research is advanced, and its capabilities are growing.
While India appears to have rejected the invitation because of suspicions about China, according to the Eurasian Times, the news should still be a cause for concern among the rapidly emerging quantum ecosystems of Western states. In the rapidly evolving world of quantum computing, more and deeper partnerships and collaborations with India would not just be beneficial; they would be imperative.
As nations across the globe race to harness the transformative power of quantum technology, India stands out as a critical partner for the Western quantum ecosystem. The reasons for this are many, ranging from geopolitical significance to an abundant talent pool, technological prowess and more.
India's strategic position in the global quantum community is as obvious as it is seemingly ignored. With its growing economy, political stability and strategic location, India is a pivotal player in global affairs. For Western countries, partnering with India on quantum computing would be not just a technological collaboration but a geopolitical strategy.
India would offer a counterbalance to other dominant forces in the quantum realm, particularly China, which is aggressively advancing its quantum capabilities. The West could serve as a counterbalance to China for India, as well.
An Indo-Western quantum alliance would foster a more diversified and balanced global quantum landscape, reducing the risk of a single-nation monopoly in this critical field.
India's greatest asset in the quantum journey is undoubtedly its talent. The country has a vast reservoir of young, talented and highly skilled professionals in STEM fields. The Quantum Insider's Intelligence Platform lists more than 100 universities, government entities, research institutions, companies and investors centered around India.
Indian universities and research institutions are churning out world-class scientists, engineers and programmers, many of whom are already contributing significantly to global tech giants and research labs. In the jigsaw puzzle of developing quantum supply chains, one of the biggest missing pieces for growing quantum startups is the lack of trained, skilled professionals.
By integrating this talent pool into the Western quantum ecosystem, there is an opportunity to accelerate innovation and development.
Additionally, such integration would provide Indian professionals with international exposure and opportunities, further enhancing their skills and contributions.
India has been making strides in quantum computing and related technologies.
The Indian government's significant investments in quantum research under initiatives like the National Mission on Quantum Technologies and Applications (NM-QTA) reflect a commitment to this field. By partnering with Western nations, India can leverage their advanced research facilities, funding, and expertise, leading to a synergistic relationship.
This collaboration would not only aid in the development of cutting-edge quantum technologies but also ensure that these advancements are grounded in diverse perspectives, leading to more robust and versatile solutions.
The economic benefits of a robust Indo-Western partnership in quantum computing are immense.
Quantum technology is poised to revolutionize industries from cybersecurity to healthcare, finance, and logistics. By collaborating, both India and Western countries can tap into new markets, foster innovation, and drive economic growth. For India, it would mean an influx of foreign investment, job creation, and an opportunity to position itself as a leader in the quantum sector.
It's likely that India would have access to commercial opportunities in the West that it would not have in a Russia-China partnership.
Such partnerships often extend beyond mere technological collaboration, and they could benefit quantum science in general.
Increased Indo-Western collaborations could pave the way for cultural and educational exchanges, fostering a deeper understanding and appreciation of diverse viewpoints and practices. Joint research programs, academic exchanges, and collaborative projects between Indian and Western universities would enrich the educational experiences of students and researchers alike.
This cross-pollination of ideas is invaluable in a field as dynamic and interdisciplinary as quantum computing.
Perhaps, as a side note, Western universities, too, are facing cratering demographic issues that will affect enrollments. Indian talent could help shore up some of the damage of these shifting trends in student enrollment.
In a world increasingly defined by digital threats, quantum computing holds the key to unparalleled advancements in cybersecurity. An Indo-Western partnership in this domain would enhance collective security capabilities, particularly in encryption and data protection.
In addition, for India, such collaboration offers a path to strategic autonomy. By developing its quantum capabilities, India can ensure its national security, protect its data sovereignty, and reduce dependence on other nations for critical technologies.
When it comes to quantum strengths, India offers a buffet table of solid and innovative research output and entrepreneurship in many areas of the quantum industry and is leveraging quantum computers to explore key use cases. Just a few examples:
General impact and applications: Quantum computing is seen as a key driver of advances in fields such as medicine, machine learning, cybersecurity and climate change, with significant investment and initiatives like India's quantum mission aiming to develop a 50-qubit computer (Khan, Dalawai, & Nagachandan, 2020).
Enhancement of Artificial Intelligence: The integration of quantum computing with AI is expected to significantly boost the capabilities of research activities globally (Mehta, Paharia, Singh, & Salman, 2019).
Renewable Energy Applications: Research explores the utilization of quantum computing for industrial applications, particularly in the renewable energy sector (Rajawat et al., 2022).
Sustainable Development in IT Industry: Quantum computing is recognized for its potential to revolutionize the Indian IT industry, contributing to sustainable development and employment generation (Chatterjee, 2018).
Quantum Information Theory and Quantum Computation: Papers explore various aspects of quantum information theory and its applications in computation and communication (Pati & Agrawal, 2012).
Obviously, the benefits of such a partnership are multifaceted, from geopolitical advantages to technological advancement, economic growth and beyond. This may, however, seem a simplistic rundown of the advantages; there are, of course, concerns. Adding more pieces to an already complex ecosystem adds one more point of failure, and security risks for the broader Western quantum community could be exposed as well.
Understanding those concerns, there is a growing acknowledgement that the world stands on the brink of the development of quantum technology, for good and ill. The collaboration between India and Western nations could very well be the catalyst that propels us into a new era of technological prowess and innovation.
Davos and the global state of quantum – POLITICO
With help from Christine Mui and Steven Overly
Participants waiting for a session at the 2024 meeting of the World Economic Forum. | Fabrice Coffrini/AFP via Getty Images
Davos wants you to plan to have a plan on quantum technology.
The World Economic Forum, which sponsors the annual Switzerland confab, released a Quantum Economy Blueprint today, its first major paper on how a global economy centered around quantum technology might develop, even as many skeptics say the technology isn't yet ready for prime time.
Its authors, a trio of researchers from the WEF, the AI-and-quantum startup SandboxAQ, and IBM, lay out a set of recommendations and examples for how countries can find their fit in the global development of quantum computing, sensing and communications technologies, especially as China invests billions into the technology largely in isolation from the West.
If you've been reading DFD's past coverage of quantum developments, you might be wondering: isn't it a bit early for this? That's what the report's authors are counting on, writing that seizing an early-adopter advantage will allow governments to get infrastructure in place to ensure all countries are able to benefit from the gradual replacement of zeroes and ones by superposition and entanglement.
Notably, the report, with the full backing of the WEF, makes assumptions about quantum that are decidedly up for debate in the wider research community. Those include: that it will be possible to build a useful, fully programmable, universal fault-tolerant quantum computer; that quantum computing will make the computation of specific problems more efficient or precise; and that quantum utility, the ability of existing quantum computers to solve problems beyond classical computing's reach, has been demonstrated.
(Some in the commercial sector even say the WEF isn't bullish enough: Allison Schwartz, government relations lead for commercial quantum company D-Wave, told DFD in a statement that the report narrowly focuses on a single approach that is far from market readiness, in a manner that skews the timelines for adoption and near-term application development.)
With that in mind, I pinged Sergio Gago Huerta, head of quantum at Moody's and someone who, as author of the Quantum Pirates Substack newsletter, does not hesitate to call out quantum hype. Huerta was all in favor of the blueprint, saying that by focusing on governance and infrastructure it provides helpful pointers to pretty much anyone hoping to compete in, or even participate in, the quantum economy.
"Every country should have their own quantum program, either as part of a coalition or by themselves," Huerta wrote in an email. He noted that while many countries tend to focus on quantum as a cyber threat (the ability of quantum computers to bust traditional cryptography is one of the most well-established findings in the field), the report provides welcome focus on other technologies like quantum sensing and metrology, something governments will need to provide enough "support, governance and training [for]... in order to stay relevant and keep a competitive advantage."
Celia Merzbacher, executive director of the Quantum Economic Development Consortium that aims to grow quantum in the U.S., was a consultant on the report. She praised its analysis of the complex landscape facing nations on quantum and said it would be useful to anyone working right now in the quantum technology stack.
The report takes a granular dive into nations' quantum building blocks, from national research funding to politics to worker training, and finds, not surprisingly, that the most successful innovation efforts come from deeply interconnected and collaborative ecosystems.
One example the authors cite is the United Kingdom's National Quantum Strategy: in that case, pumping a billion-plus British pounds into the U.K.'s research infrastructure led to a successful effort to develop commercial applications for quantum in fields like the automotive industry, telecom and defense.
At a smaller scale, that virtuous-cycle collaboration tends to cross national boundaries, like in the case of the Quantum Leap Africa program that saw five nations team up to gather top students from across the continent and educate them about quantum.
The U.S. National Quantum Initiative, authorized by a $1.2 billion bill passed in 2018, has placed Washington at the center of this global conversation even as its re-authorization lingers in Congressional limbo. The report contains plenty of detail about the U.S. quantum push and its ripples throughout the global economy, as well as the importance of maintaining a quantum advantage for defense and national security. Where it's decidedly more circumspect, however, is on exactly what those geopolitical threats are: State Department official Rick Switzer is quoted saying it's critical that the United States and our allies retain access to key components in the quantum supply chain, requiring policymakers to account for the geopolitics surrounding this access.
By geopolitics, he means China and the repercussions for quantum in what the New York Times called America's "silicon blockade" against Beijing. The WEF report notes that China has spent $15 billion on quantum, more than the U.S., U.K., France and Germany combined.
The number of times China itself is referenced in the report? Exactly two, a far cry from the in-depth treatment other nations' quantum strategies get. Anyone who tracks quantum development (or any other technology, for that matter) in the West knows that the potential threat from Beijing is a huge political motivator for quantum policy, especially when it comes to cybersecurity. Davos' plan might be a globally collaborative one, but as with so many other tech policy issues, there's a large elephant in the room that's central to its analysis yet goes oddly unmentioned.
Also in Davos, India made its case as a democratic alternative source for electronics manufacturing to China.
The world's fifth-largest economy for years lagged on making microchips, lacking the specialized hardware and skilled talent needed to grow the industry. Then in late 2021, the Modi government offered up $10 billion in incentives, luring companies like Micron and Tata Group to invest in new fabs. With nine semiconductor manufacturing proposals on the table, India is eager for more.
Speaking at the WEF, Chip War author Chris Miller drew parallels between India and past success stories: "Taiwan, South Korea started a half century ago developing their chip industries, and today they're the world leaders. And so I think there's no doubt that India is beginning to follow that path."
But the country won't dive into the competition around cutting-edge chips that's captured governments around the world just yet, choosing to first focus on legacy chips for telecommunications and cars, Indian Cabinet Minister Ashwini Vaishnaw told the panel. Asked about future pressure to take sides between China and the West, Vaishnaw dodged, saying "we don't think that it's a battle" and that circumstances are too complex and too dynamic to imagine what could happen in a decade.
India's semiconductor moves piqued the interest of the Netherlands, a chipmaking equipment powerhouse that's all too familiar with getting caught in the U.S.-China faceoff. Dutch Minister of Economic Affairs and Climate Micky Adriaansens called India "a different story" from China and reiterated its plan to join forces with like-minded countries. Christine Mui
A fundamental question hangs over the global debate over how to regulate artificial intelligence: open or closed?
That is, should the most powerful AI systems be widely available to any interested developer or under the tight control of just a few players? Top politicos and tech minds grappled with that topic in Davos, including at POLITICO Live's own AI debate on Tuesday.
In the U.S., Assistant Secretary of Commerce Alan Davidson said the answer may not be so binary.
"We've learned that there's a real gradient of openness, and that we may be able to find ways, we have to be able to find ways, to support innovation and competition, but also protect safety and security as we open up these systems," Davidson told the POLITICO Tech podcast.
Davidson heads the National Telecommunications and Information Administration, which has been tasked by the White House with studying the open vs. closed question. He made the case that while closed systems are seemingly easier to close off to bad actors, they also concentrate power in the hands of a small number of tech companies, many of which already exert significant influence over our daily lives.
"We know that it's very powerful if you can democratize access to these technologies," Davidson said. "It's good for innovation. It actually can be good for safety and security." Steven Overly
Listen to the full interview with Davidson on today's POLITICO Tech.
Stay in touch with the whole team: Ben Schreckinger ([emailprotected]); Derek Robertson ([emailprotected]); Mohar Chatterjee ([emailprotected]); Steve Heuser ([emailprotected]); Nate Robson ([emailprotected]); Daniella Cheslow ([emailprotected]); and Christine Mui ([emailprotected]).
If you've had this newsletter forwarded to you, you can sign up and read our mission statement at the links provided.