Can AI Outperform Humans at Creative Thinking Task? This Study Provides Insights into the Relationship Between Human and Machine Learning Creativity

While AI has made tremendous progress and has become a valuable tool in many domains, it is not a replacement for humans' unique qualities and capabilities. The most effective approach, in many cases, involves humans working alongside AI, leveraging each other's strengths to achieve the best outcomes. There are fundamental differences between human and artificial intelligence, and there are tasks and domains where human intelligence remains superior.

Humans can think creatively, imagine new concepts, and innovate. AI systems are limited by the data and patterns they've been trained on and often struggle with truly novel and creative tasks. However, the question is, can an average human outperform an AI model?

Researchers compared the creativity of humans (n = 256) with that of three current AI chatbots, ChatGPT-3.5, ChatGPT-4, and Copy.AI, using the alternate uses task (AUT), a divergent thinking task. The AUT is a cognitive method used in psychology and creativity research to assess an individual's ability to generate creative and novel ideas in response to a specific stimulus. Such tasks measure a person's capacity for divergent thinking: the ability to think broadly and generate multiple solutions or ideas from a single problem.

Participants were asked to generate uncommon and creative uses for everyday objects. The AUT consisted of four tasks, one for each object: rope, box, pencil, and candle. The human participants were instructed to focus on the quality of their ideas rather than the quantity. The chatbots were tested in 11 separate sessions with the four object prompts; each object was tested only once within a session.

To evaluate the results, the researchers collected subjective creativity (originality) ratings from six professionally trained human raters. The order in which the responses within object categories were presented was randomized separately for each rater. Each rater's scores were averaged across all the responses a participant, or a chatbot in a session, gave to an object, and the final subjective score for each object was formed by averaging the six raters' scores.
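The averaging scheme described above can be sketched in a few lines. This is an illustrative reconstruction with made-up scores and only two raters, not the study's actual data or code:

```python
# Illustrative reconstruction of the scoring scheme with made-up numbers:
# each rater scores every response; a rater's scores are averaged across
# one participant's responses to an object; the final score is the mean
# of the rater means (the study used six raters, two are shown here).
ratings = {  # rater -> scores for one participant's "rope" responses
    "rater_1": [2.0, 3.5, 4.0],
    "rater_2": [2.5, 3.0, 4.5],
}

# Mean score per rater across all of this participant's responses
per_rater_mean = {rater: sum(s) / len(s) for rater, s in ratings.items()}

# Final subjective score for the object: mean of the rater means
final_score = sum(per_rater_mean.values()) / len(per_rater_mean)
print(round(final_score, 2))  # 3.25
```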

On average, the AI chatbots outperformed the human participants. While human responses included poor-quality ideas, the chatbots generally produced more creative responses. However, the best human ideas still matched or exceeded those of the chatbots. While this study highlights the potential of AI as a tool to enhance creativity, it also underscores the unique and complex nature of human creativity, which may be difficult for AI technology to fully replicate or surpass.

However, AI technology is rapidly developing, and the results may look different in six months. Based on the present study, the clearest weakness in human performance lies in the relatively high proportion of poor-quality ideas, which were absent in chatbot responses. This weakness may be due to normal variation in human performance, including failures in associative and executive processes, as well as motivational factors.

Arshad is an intern at MarktechPost. He is currently pursuing his Int. MSc Physics at the Indian Institute of Technology Kharagpur. He believes that understanding things at a fundamental level leads to new discoveries, which in turn lead to advances in technology. He is passionate about understanding nature fundamentally with the help of tools such as mathematical models, ML models, and AI.


Progress in using deep learning to treat cancer – Nature.com

Deep learning approaches have the potential to substantially reduce the astronomical costs and long timescales involved in drug discovery. KarmaDock is a deep learning workflow for ligand docking that shows improved performance on benchmark cases and in a real-world virtual screening experiment.

Drug discovery is a long and arduous process that is staggeringly expensive: the average estimated time needed to take a new drug from discovery to launch is 10 to 12 years [1], at a high cost of ~US$2.2 billion per drug [2], which is a major problem considering that this process is also plagued by low hit rates. Computer-aided drug discovery (CADD) can substantially aid this process [3], both by predicting how a range of drug-like ligands would bind to a given drug target (virtual screening) using docking algorithms, and by predicting the corresponding binding free energies of the docking-predicted poses, which are a measure of the strength with which the ligand binds to its target. However, despite significant progress in this area, challenges remain, including (1) the quality of the predicted binding poses, which is crucial for rational drug discovery and is complicated by the presence of error, non-linearity, and randomness [4]; (2) the precision and accuracy of the predicted binding free energies for those poses (there can be, for instance, significant variation in the pose ranking for the same ligand/target combination between docking approaches); and (3) the speed of the approach, which is a particular issue in the face of increasing library sizes. That is, computational approaches need to be efficient enough to perform ultra-large docking on libraries that can reach billions of compounds [5] without significantly compromising the quality of the binding pose and free-energy predictions. Such huge libraries are out of the scope of conventional CADD approaches, but are an ideal target for deep-learning (DL) approaches [5], which typically perform better than traditional shallow machine learning techniques (or even deep learning approaches with expert descriptors) when processing large data sets [6].

However, even DL approaches face challenges in optimizing both accuracy and computational speed, owing to the inherent complexity of the problem and the degree of seeming randomness involved [4]. Writing in Nature Computational Science, Xujun Zhang and colleagues [7] propose KarmaDock, a DL approach for ligand docking that shows improved speed and accuracy on benchmark data sets and performs well in a real-world virtual screening project, where it was used to discover experimentally validated active inhibitors of LTK, a target for the treatment of non-small-cell lung cancer [8].


Revolutionary AI Set To Predict Your Future Health With a Single Click – SciTechDaily

Researchers from Edith Cowan University developed software that rapidly analyzes bone density scans to detect abdominal aortic calcification (AAC), a predictor of cardiovascular events and other health risks. The software agreed with expert readings 80% of the time and could revolutionize early disease detection during routine clinical practice.

Thanks to artificial intelligence, we'll soon have the ability to predict our risk of developing serious health conditions in the future, at the press of a button.

Abdominal aortic calcification (AAC) refers to the buildup of calcium deposits in the walls of the abdominal aorta. It can indicate an increased risk of cardiovascular events, including heart attacks and strokes.

It also predicts your risk of falls, fractures, and late-life dementia. Conveniently, the common bone density machine scans used to detect osteoporosis can also detect AAC.

However, highly trained expert readers are needed to analyze the images, a process that can take 5-15 minutes per image.

But researchers from Edith Cowan University's (ECU) School of Science and School of Medical and Health Sciences have collaborated to develop software that can analyze scans much, much faster: roughly 60,000 images in a single day.

Researcher and Heart Foundation Future Leader Fellow Associate Professor Joshua Lewis said this significant boost in efficiency will be crucial for the widespread use of AAC in research and in helping people avoid developing health problems later in life.

"Since these images and automated scores can be rapidly and easily acquired at the time of bone density testing, this may lead to new approaches in the future for early cardiovascular disease detection and disease monitoring during routine clinical practice," he said.

The results were from an international collaboration between ECU, the University of WA, the University of Minnesota, Southampton, the University of Manitoba, the Marcus Institute for Aging Research, and Hebrew SeniorLife Harvard Medical School. Truly a multidisciplinary global effort.

Though it's not the first algorithm developed to assess AAC from these images, the study is the biggest of its kind, was based on the most commonly used bone density machine models, and is the first to be tested in a real-world setting using images taken as part of routine bone density testing.

The study saw more than 5,000 images analyzed by both the experts and the team's software.

After comparing the results, the expert and the software arrived at the same conclusion regarding the extent of AAC (low, moderate, or high) 80 percent of the time, an impressive figure given it was the first version of the software.

Importantly, only 3 percent of people deemed to have high AAC levels were incorrectly diagnosed to have low levels by the software.
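The two figures reported here, overall agreement and the rate at which high-AAC patients were scored low, are both simple reads off a confusion matrix. A sketch with entirely hypothetical counts (the study's raw matrix is not given in this article):

```python
# Hypothetical confusion matrix: rows are the expert's AAC category,
# columns the software's. Counts are illustrative, not the study's data.
labels = ["low", "moderate", "high"]
confusion = {
    "low":      {"low": 1500, "moderate": 200,  "high": 20},
    "moderate": {"low": 150,  "moderate": 1300, "high": 180},
    "high":     {"low": 40,   "moderate": 210,  "high": 1400},
}

total = sum(sum(row.values()) for row in confusion.values())

# Overall agreement: the diagonal (expert and software concur) over all cases
agreement = sum(confusion[c][c] for c in labels) / total  # 0.84 with these counts

# Fraction of expert-rated "high" cases the software scored "low"
high_total = sum(confusion["high"].values())
high_as_low = confusion["high"]["low"] / high_total  # ~0.024 with these counts
```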

"This is notable as these are the individuals with the greatest extent of disease and the highest risk of fatal and nonfatal cardiovascular events and all-cause mortality," Professor Lewis said.

"Whilst there is still work to do to improve the software's accuracy compared to human readings, these results are from our version 1.0 algorithm, and we have already improved the results substantially with our more recent versions."

"Automated assessment of the presence and extent of AAC, with accuracies similar to imaging specialists, provides the possibility of large-scale screening for cardiovascular disease and other conditions, even before someone has any symptoms.

"This will allow people at risk to make the necessary lifestyle changes far earlier and put them in a better place to be healthier in their later years."

Reference: "Machine learning for abdominal aortic calcification assessment from bone density machine-derived lateral spine images" by Naeha Sharif, Syed Zulqarnain Gilani, David Suter, Siobhan Reid, Pawel Szulc, Douglas Kimelman, Barret A. Monchka, Mohammad Jafari Jozani, Jonathan M. Hodgson, Marc Sim, Kun Zhu, Nicholas C. Harvey, Douglas P. Kiel, Richard L. Prince, John T. Schousboe, William D. Leslie and Joshua R. Lewis, eBioMedicine. DOI: 10.1016/j.ebiom.2023.104676

The Heart Foundation provided funding for the project, with Professor Lewis's 2019 Future Leadership Fellowship supporting the research over a three-year period.


An artificial intelligence model for the radiographic diagnosis of … – Nature.com


Where are we at with quantum computing? – Cosmos

Aberdeen, Maryland in the late 1940s was an exciting place to be. It housed a computer so powerful and so energy-intensive that rumour had it the lights in Philadelphia dimmed when the machine switched on.

The computer, called ENIAC, took up an area almost the size of a tennis court. It used 18,000 vacuum tubes, and cords thicker than fists crisscrossed the room, connecting one section to another.

Despite its size, today it's less impressive: its computing power would be dwarfed by a desk calculator.

Professor Tom Stace, the Deputy Director of the ARC Centre of Excellence in Engineered Quantum Systems (EQUS), believes that quantum computers are best thought of not as the computers we know today, but as big lumbering systems like ENIAC.

"ENIAC was the first digital computer," said Stace.

"You see engineers programming, but that meant literally unplugging cables and plugging them into these gigantic room-sized things. That's sort of what a quantum computer looks like now. It's literally cables that people have to wire up and solder together."

To understand where we're at with quantum computing, you first have to understand its potential.

Right now, quantum computing is still in the very earliest stages of its development, despite the huge hype around quantum suggesting otherwise.

The ENIAC was useful despite its bulk, allowing programmers to do thousands of mathematical problems a second, and computations for the hydrogen bomb.

On the other hand, quantum computers are not yet suitable even for the niche roles that scientists hope they will one day fill. The idea that quantum computers might one day replace your laptop is still basically in the realm of science fiction.

But that doesn't mean they can't one day be useful.

"We know that quantum computers can solve a few sets of problems in a way that ordinary computers just can't," says Stace.

"The famous one is factoring numbers. Finding the prime factors of a large number is genuinely a very difficult mathematical problem."

Because banks, governments, and anyone who wants to keep something secret rely on the difficulty of factoring large numbers for their digital security, our security systems would fall apart as soon as someone created a quantum computer that could outpace ordinary computers at the task. Groups like the Australian Cyber Security Centre have already started drawing up plans for when this eventually occurs.

Quantum computers could also fundamentally change chemistry, providing the processing power to simulate better catalysts, fertilisers, and other industrial chemicals.

But this can only happen if quantum computers move beyond the realm they are in now, what scientists call Noisy Intermediate-Scale Quantum (NISQ).

Computers are simply devices that can store and process data. Even the earliest computers used bits, a basic unit of information that can either be on or off.

Quantum computers are also devices that can store and process information, but instead of using bits, they use quantum bits, or qubits, which don't just turn on and off but can also sit at any point in between.
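That "any point in between" can be made concrete: a qubit's state is a pair of amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. A minimal illustrative simulation, not real quantum hardware:

```python
import math
import random

# A classical bit is 0 or 1. A qubit's state is a pair of amplitudes
# (a, b) with |a|^2 + |b|^2 = 1; measuring it yields 0 with probability
# |a|^2 and 1 with probability |b|^2.
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)  # an equal superposition

p0, p1 = abs(a) ** 2, abs(b) ** 2
assert abs((p0 + p1) - 1.0) < 1e-12  # amplitudes must be normalised

def measure() -> int:
    """Simulate one measurement, which collapses the state to 0 or 1."""
    return 0 if random.random() < p0 else 1

samples = [measure() for _ in range(1000)]  # roughly half 0s, half 1s
```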

The key to quantum computers' huge potential, and also to their problems, is these qubits.

Groups like IBM and Google have spent millions of dollars on creating quantum computers, no doubt buoyed by the riches awaiting the company that gets there first.

Their efforts so far have been relatively lacklustre.

The machines are clunky: each wire and qubit needs to be individually placed or set up manually, and the whole thing has to sit inside a refrigerator cooled to almost absolute zero.

Despite all these safeguards, the machines still produce enough errors that it's almost impossible to tell whether they worked, or whether these million-dollar systems are just producing random noise.

And even that is impressive to scientists like Stace.

"Twenty years ago, if you had one qubit you got a Nature paper. Fifteen years ago, two or three qubits got you a Nature paper. Ten years ago, five qubits got you a Nature paper. Now, 70 qubits might get you a Nature paper," says Stace.

"That's telling you what the frontier looks like."

Those on the frontier are aiming for supremacy: quantum supremacy, to be exact.

Quantum supremacy is the term for a quantum computer solving a problem that no classical computer could solve in a reasonable time frame. It's important to note, though, that this problem doesn't have to be useful. There's been debate in quantum circles about how useful and practical these sorts of problems, or simulations, actually are as evidence that quantum is better.

Google's machine, called the Sycamore processor, currently has 70 qubits all lined up and connected. In 2019, its researchers claimed they had reached quantum supremacy. More recently, they got more specific, suggesting that a top-level supercomputer would take 47 years to do the calculations Sycamore managed in seconds.

IBM says its 433-qubit quantum computer, called Osprey, could soon start having real-world applications. However, while IBM is further ahead in the number of qubits, it is still struggling with the same error issues as other quantum systems.

To get to a quantum computer that could rival supercomputers at actual tasks, you need hundreds of thousands, or millions, of qubits rather than a few hundred. But the more qubits you have, the more errors end up in the system.

"Quantum systems are typically single atoms or single particles of light. Naturally, these are very fragile and very prone to disturbance or noise," says UNSW quantum researcher and entrepreneur Professor Andrew Dzurak.

"That noise causes errors in the qubit information.

"Heat also causes errors; vibration causes errors. Even just simply looking at or measuring the qubit stops it altogether."

Both Dzurak and Stace stress the importance of fixing these errors. Without error correction, you have a very expensive, fragile machine that can't tell you anything accurately.

How to fix these errors isn't yet certain. While IBM, Google and other big companies are using superconducting qubits, smaller groups around the world are using everything from silicon to imperfections in diamond.

Dzurak has formed a start-up called Diraq which is aiming to use traditional computer chip technology to mount the qubits, allowing easier design and the ability to pack millions of qubits on one chip.

"We have a mountain to climb, and you have to go through the stages to get up that mountain," he says.

"The work that is being done by [IBM and Google], often in collaboration with university groups, is important research and is moving the field forward."

Entanglement is another important aspect of quantum computers, and one that makes them even harder to build and operate. A quirk of quantum mechanics is that particles can become intrinsically linked, regardless of the distance between them. If you measure one particle, you learn something about the other, even if it is halfway across the Universe. This is entanglement, and the more particles you can entangle, the more powerful your quantum computer can be.
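The perfect correlation of an entangled pair can be mimicked in a few lines for illustration. This toy sketch reproduces only the correlated outcomes of a Bell pair, not the genuinely quantum statistics that make entanglement useful:

```python
import random

# A Bell pair is the entangled state (|00> + |11>)/sqrt(2): measuring
# either qubit gives 0 or 1 with equal probability, but the two outcomes
# always agree, no matter how far apart the qubits are.
def measure_bell_pair() -> tuple:
    outcome = random.choice([0, 1])  # 50/50 between the |00> and |11> branches
    return outcome, outcome          # both qubits report the same value

results = [measure_bell_pair() for _ in range(1000)]
assert all(first == second for first, second in results)  # perfectly correlated
```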

But the more particles you entangle, the more complicated the system becomes, and the more likely it will break down.

Here the history of computers seems to be repeating.

While ENIAC in Maryland was an undisputed success, it wasn't the first design for a computer, not by a long shot. The first design for a computer, called the Difference Engine, came from the mathematician Charles Babbage in the 1820s.

But it wouldn't be built in Babbage's lifetime.

With only the technology then available, it was impossible to machine the metal precisely enough to build the machine. It was doomed to fail from the start.

It wasn't until the invention of something seemingly unrelated, vacuum tubes (or valves), that ENIAC and other types of computers could begin to be built in earnest.

It's a hard thing to admit, but when it comes to quantum computers, we don't yet know whether we're building ENIAC or struggling with Babbage's Difference Engine.

"It might be the case that the components that we're pursuing now just aren't precise enough, in the same way that the machining tools they had in the 19th century weren't precise enough to make a mechanical computer," says Stace.

So where are we at with quantum computing? Not very far at all.

"It could be that we're somewhere between Charles Babbage and the valve. We've got the idea; we know in principle we can make this thing. We just don't know if we have the engineering chops to do it."


Mastercard preps for the post-quantum cybersecurity threat – CIO

The ecosystem of digital payments is a sitting duck.

The billions of transactions we conduct online today are protected by what are called public-key encryption technologies. But as quantum computers become more powerful, they will be able to break these cryptographic algorithms. Such a cryptographically relevant quantum computer (CRQC) could deliver a devastating impact to global cybersecurity protocols.

To prepare for this worst-case scenario, Mastercard launched its Quantum Security and Communications project, which earned the company a 2023 US CIO 100 Award for IT innovation and leadership.

"We're working proactively to mitigate the future risks related to quantum computing that could impact the security of the billions of digital transactions we process globally," says George Maddaloni, chief technology officer of operations at Mastercard, explaining the impetus for the project.

As it stands today, the online transactions that you and I conduct swear allegiance to public-key cryptography. In this technique, the person (or entity) sending the message secures (locks) it with a publicly available key and the entity at the receiving end decrypts it with a private key. The premise is that since only the receiver has the private key, the transaction is secure.

Secure private keys derive from mathematical algorithms (the Rivest-Shamir-Adleman, or RSA, algorithm is a common one) that are practically impossible to reverse-engineer and hack. At least until a CRQC gets here and does so through the sheer brute force of quantum computing.
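A toy example of RSA makes the stakes concrete. The primes here are tiny textbook values; real keys use primes hundreds of digits long, and the point is only that whoever can factor the public modulus n recovers the private key:

```python
# Toy RSA with tiny textbook primes. Real keys use primes hundreds of
# digits long; these values are for illustration only.
p, q = 61, 53
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # 3120, kept secret
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: 2753

message = 65
cipher = pow(message, e, n)    # encrypt with the public key (e, n)
assert pow(cipher, d, n) == message  # decrypt with the private key d

# An attacker who can factor n back into p and q can recompute phi and d,
# which is exactly the step a large quantum computer running Shor's
# algorithm is expected to make feasible.
```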

Entities in the private and public sectors are preparing by following one of two tracks: developing a whole new set of quantum-resistant algorithms on which to base the private keys (post-quantum cryptography, or PQC), or using quantum physics to do the same (quantum key distribution, or QKD). Mastercard's project focuses on the latter method. Other enterprises in the financial sector are also exploring QKD.

On a parallel track, public institutions such as the National Institute of Standards and Technology (NIST) are following the harden-the-algorithms PQC approach. NIST has selected four quantum-resistant algorithms and is in the process of standardizing them. The final versions are expected to be available in the first half of 2024, and NIST has established a quantum-readiness roadmap for enterprises to follow.

Given that Mastercard has embraced the quantum key distribution method, its pilot project determined the architectural requirements and limitations of QKD and the operational readiness of the QKD systems.

Mastercard's Maddaloni reports that the team tested the quantum key distribution solution over a dark fiber network. Equipment from Toshiba and ID Quantique was used to produce the keys. Two networking vendors that Mastercard has worked with in the past were also brought in. "Their input from an IP Ethernet networking perspective helped," Maddaloni says. The goal was to conduct an inventory of the types of networking capabilities within Mastercard's network, which has thousands of endpoints connected by a few different telecommunications capabilities. "We wanted to look at whether the quantum key distribution capabilities work in that environment," Maddaloni says.

"The availability of QKD-enabled services and equipment is very specialized and currently quite limited," Maddaloni says. Not many hardware vendors have features available that can integrate with the QKD systems. Designing the test was also challenging: QKD requires individual photons to arrive at precise times, and the quantum states used for encryption can be easily disturbed by external factors such as noise, temperature changes, and vibration.

"The project was designed to meet these challenges and deliver provable results and validation of the technology's potential," Maddaloni adds. And it was successful.

Questions of cybersecurity like the ones Mastercard is addressing are critical because they go to the very foundation of the system that financial institutions have built.

"Transaction security and the trust of our customers are the backbone of our business," Maddaloni points out. "The impact of current PKI encryption methods being compromised could quite literally threaten our ability to operate securely," he adds. "We believe being ready for a post-quantum landscape is part of our job and sends the right message to our partners, our customers, and our regulators."

Jeff Miller, CIO and senior vice president of IT and Security at Quantinuum, a full-stack quantum services company, agrees that protecting data is vital because it's a conversation of trust with the consumer. Being crypto-agile means recognizing that bad actors get more creative in the ways they break into environments. As a result, enterprises must continue to build an iterative process and develop protocols to address these vulnerabilities.

While financial companies such as Mastercard are gearing up using their own pilot projects, the industry standards committee X9 is also working on guidance for enterprises in the financial sector, points out Dr. Dustin Moody, a mathematician who leads the post-quantum cryptography project at the National Institute of Standards and Technology (NIST).

The road ahead is not easy, the experts admit. The availability of quantum key distribution services and equipment is still very limited. "Some of the hardware vendors we worked with have features that are just announced and very new in the market, and some haven't even been generally made available," Maddaloni points out. "I do think that the industry understands that financial services will need this capability in the future."

Moody advises companies to hone their post-quantum readiness despite what might look like a daunting landscape. The first order of business? "You need to find all instances of public-key cryptography, which is tricky, and it will take time to do that inventory," Moody says. "It's gonna be a complex migration that will take time," he adds, "so we encourage organizations to get ahead of it as soon as they can."

Miller agrees. He likens the process to preparing for Y2K, when enterprises were worried about formatting and storage of information beyond the year 2000. The migration to post-quantum preparedness even has a similar catchy acronym: Y2Q. A key difference, Miller says, is that there was a fixed countdown clock to Y2K. The cryptographically relevant quantum computer is not here today but it could be five years from now. Or ten.

"Knowing that we don't have a firm date for when our current encryption methodologies are no longer useful," Miller says, "that's what keeps me awake at night."

See the original post:
Mastercard preps for the post-quantum cybersecurity threat - CIO

Read More..

ParTec AG becomes a Complete Integrator of Quantum Computers – HPCwire

MUNICH, Sept. 22, 2023 ParTec AG, the leading company in the field of modular supercomputing, announced today that as a result of its years of work in the field of quantum computing, it is positioning itself as a complete integrator of quantum computers. ParTec offers a comprehensive qubit-agnostic solution based on a component-based design. Similar to developments that took place in classical computing, a supply chain ecosystem with companies focussing on individual component technologies is emerging in the quantum space. This development allows ParTec AG to leverage its best-of-breed approach from supercomputing and collaborate with leading technology providers to offer comprehensive quantum complete solutions.

Bernhard Frohwitter, CEO of ParTec AG: "Today's solutions for quantum computers are monolithic designs, mostly developed by qubit technology developers. This approach carries substantial risks for customers, in particular in terms of being tied to a specific provider and technology in a market that is still very volatile with respect to players and technologies. ParTec adopts a different, fresh and innovative approach that will lead to a strong market position."

The company aims to launch its first quantum computer in 2024. Dominik Ulmer, Chief Customer Solutions Officer at ParTec: "Therefore, we have decided to start a project to establish a production facility for quantum computers in the Greater Munich area. The ParTec Quantum Factory is expected to start operations in the second half of 2024."

The company will initially invest five million euros in the construction of a production facility for assembly and testing of cryogenic and non-cryogenic systems.

Among ParTec's achievements in the field of quantum computing is the development of QBridge, a software solution that enables seamless integration of high-performance and quantum computers, created in collaboration with Quantum Machines, an Israeli developer of quantum control and orchestration products. In addition, ParTec is actively working on expanding its Parastation Modulo software, used in modular supercomputers. This expansion, Parastation Modulo 2.0, aims to bridge the gap to embed quantum computers into modular supercomputers. Furthermore, ParTec will deliver a superconducting complete solution and a cloud-based user access and management software infrastructure for the Israeli National Quantum Initiative (INQI), as well as establish a new laboratory for exploring hybrid quantum computing in collaboration with NVIDIA and the Jülich Supercomputing Centre (JSC).

The second worldwide quantum computer study by the International Data Corporation (IDC) predicts that potential customers' spending on quantum computers will increase from 1.1 billion dollars in 2022 to 7.6 billion dollars in 2027, with a compound annual growth rate (CAGR) of 48.1% (2023-2027). The study further states: "Quantum computing will revolutionize companies' ability to solve some of the most complex challenges."

About ParTec AG:

ParTec AG specialises in the development and manufacture of modular supercomputers and quantum computers as well as accompanying system software. Its services include the distribution of future-oriented High-Performance Computers (HPC) and Quantum Computers (QC) as well as consulting and support services in all areas of development, construction and operation of these advanced systems. The approach of modular supercomputing represents a unique selling point and success feature of ParTec AG. Further information on the company as well as on ParTec AG's innovative solutions in the field of high-performance computing and quantum computing can be found at http://www.par-tec.com.

Source: ParTec AG

Link:
ParTec AG becomes a Complete Integrator of Quantum Computers - HPCwire

Read More..

The Platform for Digital and Quantum Innovation of Quebec (PINQ … – IBM Newsroom

Bromont, Quebec, September 22, 2023 – The Platform for Digital and Quantum Innovation of Quebec (PINQ), a non-profit organization (NPO) founded by the Ministry of Economy, Innovation and Energy of Quebec (MEIE, ministère de l'Économie, de l'Innovation et de l'Énergie du Québec) and the Université de Sherbrooke, along with IBM, are proud to announce the historic inauguration of an IBM Quantum System One at IBM Bromont. This event marks a major turning point in the field of information technology and all sectors of innovation in Quebec, making PINQ the sole administrator to inaugurate and operate an IBM Quantum System One in Canada. To date, this is one of the most advanced quantum computers in IBM's global fleet of quantum computers.

This new quantum computer in Quebec reinforces Quebec's and Canada's position as a force in the rapidly advancing field of quantum computing, opening new prospects for the technological future of the province and the country. Access to this technology is a considerable asset not only for the ecosystem of DistriQ, the quantum innovation zone for Quebec, but also for the Technum Québec innovation zone, the new "Energy Transition Valley" innovation zone and other strategic sectors for Quebec.

The Platform for Digital and Quantum Innovation of Quebec (PINQ) announces the historic inauguration of an IBM Quantum System One Quantum Computer in Bromont, Quebec. (Credit: Ryan Lavine for IBM.)

"The installation of this IBM quantum computer is a giant leap that will promote the growth of Quebec's quantum sciences ecosystem and the development of our innovation zones, DistriQ in Sherbrooke and Technum Québec in Bromont. This is a showcase for Quebec, which will be recognized as a force in quantum sciences, but also in international sustainable development," said Pierre Fitzgibbon, Minister of Economy, Innovation and Energy, Minister responsible for Regional Economic Development and Minister responsible for Greater Montreal and the Montreal Region.

"The objective of DistriQ is to create the world's largest commercial quantum research infrastructure," explained Richard St-Pierre, General Manager, DistriQ, Sherbrooke's Quantum Innovation Zone. "PINQ's hybrid quantum computer is a unique and powerful asset that will allow the Innovation Zone's companies to reach their objectives; we are very proud of this partnership."

In addition to having access to an IBM Quantum System One, the high-performance computing (HPC) centre set up at the Humano District in Sherbrooke will enable PINQ to offer a hybrid computing approach. This technological capability will provide businesses with a unique opportunity to access a full range of hybrid quantum computing services. PINQ offers businesses an easy and seamless experience to assess the potential of digital and quantum technologies and innovations within their existing processes, with an emphasis on specific sectors such as healthcare, energy, manufacturing, the environment and sustainable development.

As part of the partnership between PINQ and IBM announced in July 2023, the two organizations will lead a world-class quantum working group dedicated to exploring quantum computing to develop solutions to sustainability challenges. This working group will be supported by the valuable contributions of founding members: Hydro-Québec and the Université de Sherbrooke through its Institut Quantique.

"For the energy sector, the ongoing energy and digital transitions impose the need for increasingly efficient calculations in terms of R&D and application development, a need that will grow significantly in the coming years," said Christian Bélanger, Senior Director Research & Innovation at Hydro-Québec. "At our research center, we are already working hard to tackle the challenges of the energy transition. We believe that the quantum technologies that PINQ gives access to offer promising prospects and rich opportunities for value creation in terms of energy and technological solutions for Hydro-Québec. We will most certainly be exploring and harnessing the potential of these technologies as they evolve."

Discovery Accelerator

PINQ is currently the only entity to offer access to an IBM Quantum System One situated in Canada, positioning Quebec as the only place in the world outside the United States to be engaged in an IBM Discovery Accelerator associated with its own high-performance computing infrastructure and a quantum computer entirely dedicated to research and industrial innovation.

"Quantum computing is accelerating at a rapid pace. This is in large part due to a growing global ecosystem that continues to push the boundaries of what is possible," said Jay Gambetta, Vice President, IBM Quantum. "Our partnership with PINQ to deploy an IBM Quantum System One in Quebec, Canada marks a significant milestone in quantum technological and scientific progress, and enables the region's strong culture of innovation and talent to help extend the frontiers of quantum computing's potential."

A first Centre of Excellence in quantum software development

PINQ is also proud to announce the establishment of its Center of Excellence, aimed at accelerating the adoption of quantum technologies by giving businesses and researchers access to PINQ's infrastructure. The Center of Excellence will support a community dedicated to quantum software, making it easier to use and create, fostering dynamic collaboration, and setting industry benchmarks in software engineering.

With the goal of making quantum technologies accessible to all, the Center of Excellence will evolve into a platform offering training opportunities, collaborative projects with universities and industry partners, and the development of open-source algorithms. As the inaugural partner in this initiative, the École de Technologie Supérieure (ÉTS) is contributing a team of researchers dedicated to democratizing best practices in quantum software.

A historic turning point for the province and the country

"At PINQ, our passion for digital and quantum innovation is our driving force," said Éric Capelle, General Manager of PINQ. "The inauguration of an IBM Quantum System One quantum computer marks a historic turning point for Quebec and Canada. We are proud to play a key role in this technological revolution."

In addition to this news, PINQ is accelerating its services for businesses. We are working with a network of Canadian academic partners such as IVADO, Université de Sherbrooke, University of Saskatchewan, Quantum Algorithms Institute and Concordia University to collaborate with this industry and train quantum talent.

We are also proud to announce the creation of a multidisciplinary team to accelerate the development of quantum business solutions through the Centre of Excellence in Quantum Hybrid Software Engineering, as well as the deployment on our platform of a first curriculum dedicated to professionals and available to PINQ customers.

About PINQ

The Platform for Digital and Quantum Innovation of Quebec is a non-profit organization created by the Université de Sherbrooke and the ministère de l'Économie, de l'Innovation et de l'Énergie du Québec (Ministry of Economy, Innovation and Energy of Quebec) in 2020. Its mission is to support organizations in accelerating their digital transformation, to enhance collaboration, and to simplify technology transfers between industries and research, in addition to training the talents of tomorrow.

About IBM

IBM is a leading provider of global hybrid cloud and AI, and consulting expertise. We help clients in more than 175 countries capitalize on insights from their data, streamline business processes, reduce costs and gain the competitive edge in their industries. More than 4,000 government and corporate entities in critical infrastructure areas such as financial services, telecommunications and healthcare rely on IBM's hybrid cloud platform and Red Hat OpenShift to effect their digital transformations quickly, efficiently and securely. IBM's breakthrough innovations in AI, quantum computing, industry-specific cloud solutions and consulting deliver open and flexible options to our clients. All of this is backed by IBM's long-standing commitment to trust, transparency, responsibility, inclusivity and service.

For more information, visit http://www.ibm.com

About IBM Quantum System One

Quantum computing is an emerging technology that exploits the laws of quantum mechanics to solve certain problems that today's most powerful supercomputers cannot practically solve. IBM Quantum System One is the first integrated quantum system with a compact design optimized for stability, reliability and continuous use. It has been deployed in a number of sites around the world, in Germany, Japan, the United States and now Canada. Its 127-qubit utility processor will offer improved coherence times and lower error rates than IBM's previous quantum systems.

*Pictured in image, L-R: Alessandro Curioni, IBM Fellow and Vice President Europe and Africa and Director IBM Research Zurich, IBM; Jamie Thomas, General Manager, Technology Lifecycle Services and IBM Enterprise Security Executive, IBM; Stéphane Tremblay, Chief Director, Bromont, Site Location Executive, IBM Canada; Nathalie Le Prohon, Director, IBM Technologies, Québec, IBM; Dave McCann, President, IBM Canada and Associate Director, IBM Consulting Canada, IBM; Isabelle Charest, Minister Responsible for Sport, Recreation and the Outdoors, Government of Québec; Jay Gambetta, IBM Fellow and Vice President, IBM Quantum, IBM; Pierre Fitzgibbon, Minister of Economy, Innovation and Energy, Government of Québec; Éric Capelle, CEO, PINQ; Marie-Eve Boulanger, Program Manager Quantum, PINQ; Richard St-Pierre, Executive Director of DistriQ, Quantum Innovation Zone of Québec.

Press contacts:

Simon Faucher, sfaucher@zonefrancherp.com, 514-402-3873

Marie Foucherot, mfoucherot@zonefrancherp.com, 579-372-6015

Lorraine Baldwin, IBM Canada Communications, lorraine@ca.ibm.com

Katia Moskvitch, kam@zurich.ibm.com, +41 78 208 9666

See the rest here:
The Platform for Digital and Quantum Innovation of Quebec (PINQ ... - IBM Newsroom

Read More..

How can quantum computers be better than classical computers? – The Hindu

Quantum computing is becoming more popular both as a field of study and in the public imagination. The technology promises more speed and more efficient problem-solving abilities, challenging the boundaries set by classical, conventional computing.

The hype has led to inflated expectations. But whether or not it can meet them, the raison d'être of a quantum computer is taken to be synonymous with the ability to solve some problems much faster than a classical computer can. This achievement, called quantum supremacy, will establish quantum computers as superior machines.

Scientists have been exploring both experimental and theoretical ways to prove quantum supremacy.

Ramis Movassagh, a researcher at Google Quantum AI, recently had a study published in the journal Nature Physics. Here, he has reportedly demonstrated in theory that simulating random quantum circuits and determining their output will be extremely difficult for classical computers. In other words, if a quantum computer solves this problem, it can achieve quantum supremacy.

But why do such problems exist?

Quantum computers use quantum bits, or qubits, whereas classical computers use binary bits (0 and 1). Qubits are fundamentally different from classical bits: they can have the value 0 or 1, as a classical bit can, or a value that's a combination of 0 and 1, called a superposition.

Superposition states allow qubits to carry more information. This capacity for parallelism gives quantum computers their archetypal advantage over classical computers, allowing them to perform a disproportionately greater number of operations.

Qubits also exhibit entanglement, meaning that two qubits can be intrinsically linked regardless of their physical separation. This property allows quantum computers to tackle complex problems that may be out of reach of classical devices.

All this said, the real breakthrough in quantum computing is scalability. In classical computers, the processing power grows linearly with the number of bits. Add 50 bits and the processing power will increase by 50 units. So the more operations you want to perform, the more bits you add.

Quantum computers defy this linearity, however. When you add more qubits to a quantum computer, its computational power for certain tasks grows exponentially as 2^n, where n is the number of qubits. For example, whereas a one-qubit quantum computer can perform 2^1 = 2 computations, a two-qubit quantum computer can perform 2^2 = 4 computations, and so forth.
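The 2^n growth shows up directly in the size of the state vector a simulator must track. The following NumPy sketch (an illustration, not code from the article) tensors a single-qubit state with itself repeatedly; each added qubit doubles the number of amplitudes:

```python
import numpy as np

# Each added qubit tensors the state vector with another 2-dimensional space,
# so the number of amplitudes a quantum state carries doubles per qubit.
zero = np.array([1.0, 0.0])   # single-qubit basis state |0>

state = zero
dims = [state.size]
for _ in range(4):            # grow from 1 qubit to 5 qubits
    state = np.kron(state, zero)
    dims.append(state.size)

print(dims)   # [2, 4, 8, 16, 32] -- 2^n amplitudes for n qubits
```

This doubling is also why classical simulation of quantum circuits becomes impractical: at 50 qubits the state vector already holds 2^50 (about 10^15) amplitudes.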

Quantum circuits are at the heart of quantum computing. These circuits consist of qubits and quantum gates, analogous to the logic gates of classical computers. For example, an AND gate in a classical setup outputs 1 only if both its inputs are 1, i.e. (1,1). Similarly, a quantum circuit can have qubits and quantum gates wired to combine input values in a certain way.

In such a circuit, a quantum gate could manipulate the qubits to perform specific functions, leading to an output. These outputs can be combined to solve complex mathematical problems.
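As an illustrative sketch of gates manipulating qubits (assumed for exposition, not taken from the study), the following NumPy example applies a Hadamard gate and then a CNOT to the two-qubit state |00>, producing a superposed and entangled Bell state:

```python
import numpy as np

# Single-qubit Hadamard gate: creates an equal superposition from |0>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
# Two-qubit CNOT gate: flips the second qubit when the first is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
I = np.eye(2)

state = np.array([1, 0, 0, 0], dtype=float)   # start in |00>
state = np.kron(H, I) @ state                  # Hadamard on qubit 1: superposition
state = CNOT @ state                           # CNOT entangles the pair

# Amplitudes are real here, so squaring gives measurement probabilities
# over the basis states |00>, |01>, |10>, |11>.
probs = state ** 2
print(probs)   # [0.5, 0, 0, 0.5] -- the Bell state
```

The outcome is all-or-nothing: measurement yields 00 or 11 with equal probability, never 01 or 10, which is the entanglement described above.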

Classical computers struggle with #P-hard problems: a set of problems that includes estimating the probability that random quantum circuits will yield a certain output.

#P-hard problems are at least as hard as every problem in #P, the class of counting problems. To understand what this means, let's consider another set of problems called NP problems. These are decision-making problems, meaning that the output is always either yes or no.

A famous example of an NP problem is the travelling salesman problem. Given a set of cities, is there a route passing through all of them and returning to the first one, without visiting any city twice, whose total distance is less than a certain value? As the number of cities increases, the problem becomes vastly more difficult to solve.

To turn this NP problem into a #P problem, we must count all the different possible routes that are shorter than the specified limit. #P problems are at least as hard as NP problems because they require not just a yes or no answer but the number of possible solutions. That is, when the answer is no, the count will be zero; but when the answer is yes, the count will have to be computed.
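The decision-versus-counting distinction can be made concrete by brute force on a toy instance. In this Python sketch the four cities and their pairwise distances are made up for illustration; enumeration works only because the instance is tiny:

```python
from itertools import permutations

# Toy instance: 4 cities with symmetric pairwise distances (illustrative numbers).
dist = {
    ('A', 'B'): 10, ('A', 'C'): 15, ('A', 'D'): 20,
    ('B', 'C'): 35, ('B', 'D'): 25, ('C', 'D'): 30,
}

def d(a, b):
    return dist[(a, b)] if (a, b) in dist else dist[(b, a)]

def tour_length(order):
    route = ('A',) + order + ('A',)   # fix city A as start and end
    return sum(d(route[i], route[i + 1]) for i in range(len(route) - 1))

limit = 85
# Note: a tour and its reverse are counted as separate routes here.
count = sum(1 for p in permutations('BCD') if tour_length(p) <= limit)

print(count > 0)   # NP (decision) version: does any short-enough tour exist? -> True
print(count)       # #P (counting) version: how many such tours are there? -> 2
```

The decision answer is a single bit, while the count carries strictly more information; as the number of cities grows, the number of candidate routes grows factorially and both versions quickly overwhelm brute force.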

If a problem is #P-hard, then it is so challenging that if you can efficiently solve it, you can also efficiently solve every other problem in the #P class by making certain types of transformations.

To prove that there is a class of problems that can be solved by quantum computers but not by classical computers, Dr. Movassagh used a mathematical construct called the Cayley path.

The Cayley path is like a bridge that helps the travelling salesman move smoothly between two different situations in the study, like one random route and one significantly complicated route. With quantum computers, one situation would be the worst-case scenario, like imagining the most challenging quantum circuit possible. The other would be the average case: a quantum circuit that has been randomly selected from the set of all possible circuits.

This bridge allows us to reframe the most challenging quantum circuit in terms of the average circuit like seeing how tough it might be to handle the worst traffic jam compared to your regular commute.

Dr. Movassagh showed that estimating the output probability of a random quantum circuit is a #P-hard problem, and has all the characteristics of a problem in this computational complexity class including overwhelming the ability of a classical computer to solve it.

His paper is also notable because of its error-quantifiable nature. That is, the work dispenses with approximations, and allows independent researchers to explicitly quantify the robustness of his findings.

As such, Dr. Movassagh's paper shows that there exists a problem that presents a computational barrier to classical computers but not to quantum computers (assuming a quantum computer can crack a #P-hard problem).

The establishment of quantum supremacy will have a positive impact on several fields: cryptography is expected to be a particularly famous beneficiary, at least once the requisite advances in hardware and materials science have been achieved.

Dr. Movassagh's paper is also an advance in quantum complexity theory. The sets NP, #P, #P-hard, etc. were defined keeping the computational abilities of classical computers in mind. Quantum complexity theory is concerned with limits of complexity defined by quantum computers.

The theory also challenges the extended Church-Turing thesis, which is the idea that classical computers can efficiently simulate any physical process. Dr. Movassagh hopes to continue his work to investigate the hardness of additional quantum tasks and someday disprove the thesis.

Tejasri Gururaj is a freelance science writer and journalist.

More:
How can quantum computers be better than classical computers? - The Hindu

Read More..

Quantum Computing Inc. Selects Tempe, Arizona as the Site for its … – PR Newswire

LEESBURG, Va., Sept. 21, 2023 /PRNewswire/ -- Quantum Computing Inc. ("QCi" or the "Company") (Nasdaq: QUBT), an innovative quantum optics and nanophotonics technology company, announces that it has chosen ASU Research Park in Tempe, Arizona, as the location for its new quantum photonic chip manufacturing facility, where it will produce its Thin Film Lithium Niobate (TFLN) chips.

Known for its photonic-based quantum solutions, QCi is expanding production with this new facility to accelerate its advanced technology development in nanophotonics and optical chip manufacturing for use in its high-performance computing, machine learning, cyber security, sensing, and imaging products. A characteristic feature of these chips is heightened scalability along with performance advantages such as speed, accuracy and ultra-low electric power consumption. Lithium niobate nanophotonic circuits (quantum chips) will be used in QCi's products and for general sale in the market as well.

The State of Arizona is one of the nation's leading semiconductor ecosystems, comprising more than 100 leading tech companies and built upon a long history of advancing the chip industry dating back to the first Motorola lab in the late 1940s. Importantly, Arizona is a driving force in the field of optics: its government leaders were early champions of advanced photonics research, supporting strong research universities interested in exploring mission-ready quantum computing and related technologies.

The location QCi chose for its new facility is on five acres within the extensive 320-acre research park hosted by ASU, a global leader in academic microelectronics research and #1 in Innovation according to U.S. News & World Report. The research park currently provides 2.2 million sq. ft. of Class A office and research facilities for more than 6,700 employees and 50 corporations and organizations, including Honeywell, Texas Instruments, Linear Technology, GE Healthcare, Iridium Satellite and the Institute for Supply Management. The selection of this site aims to promote cooperation and creativity and to leverage the park's pre-existing infrastructure and skilled workforce.

To date, QCi has placed deposits for the procurement of critical long-lead equipment and paid other expenses associated with the new chip fabrication facility of approximately $2 million. The Company plans to begin the buildout of the facility during the fourth quarter of 2023. As a lease incentive, the landlord agreed to provide a significant portion of the leasehold improvements needed to develop the fabrication facility. In addition, there are multiple funding sources available for state-of-the-art manufacturing facilities, including state and federal grants and low single-digit interest rate loans. Importantly, QCi believes it is in a strong position to benefit from the CHIPS and Science Act of 2022, which allocated $113.2 billion in federal funding and tax incentives to companies for the development of semiconductor research and manufacturing in the U.S. QCi anticipates that its chip manufacturing will commence operations in the first half of 2024, initially with single-purpose chips (such as those for physically unclonable functions, electro-optical modulation, and quantum entanglement generation), and that it will be mass-producing quantum photonic chips with complex nanophotonic circuits by late 2024 / early 2025.

"The quantum photonic chip facility is poised to make a significant impact in the United States by becoming the first US-based developer and producer of thin film lithium niobate chips. This accomplishment not only enhances the nation's manufacturing capabilities but also reduces reliance on foreign chip imports," commented Robert Liscouski, QCi's CEO. "The chips, which will be manufactured domestically, are a central part of QCi's technical and commercial growth strategy and will serve as the foundation for a new wave of innovative quantum technologies, spanning fields such as data processing, hybrid computing, cryptography, sensing, and artificial intelligence. This initiative will serve to keep the US as a leading technology provider and will reinforce supply chain security and solidify QCi's position in the nanophotonics and quantum optics industry."

Mr. Liscouski added, "The multiple benefits of locating in Arizona, and at the ASU Research Park in particular, are expected to accelerate the time-to-market of the first products powered by QCI's quantum photonic chips. Expansion into Arizona represents a strategic initiative for the Company. The location was selected due to the State's leadership in the field of optics, its early recognition of the importance of advanced photonic research, and the presence of numerous State and US Government entities interested in exploring mission-ready quantum computing and related technologies."

Mr. Corey Woods, Mayor of Tempe, Arizona commented, "This expansion by QCi holds great potential for Tempe's economic development. By prioritizing the production of lithium niobate chips, QCi is not only creating job opportunities in engineering and manufacturing but is also establishing a pioneering quantum technology sector in the region. This move is set to enhance Tempe's reputation as a technology hub, attracting talent and promoting an environment conducive to innovation."

About Quantum Computing Inc.

Quantum Computing Inc. (QCi) (Nasdaq: QUBT) is an innovative, quantum optics and nanophotonics technology company on a mission to accelerate the value of quantum computing for real-world business solutions, delivering the future of quantum computing, today. The company provides accessible and affordable solutions with real-world industrial applications, using nanophotonic-based quantum entropy that can be used anywhere with little to no training, operates at normal room temperatures and low power, and is not burdened with unique environmental requirements. QCi is competitively advantaged, delivering its quantum solutions at greater speed, accuracy, and security at less cost. QCi's core nanophotonic-based technology is applicable to both quantum computing as well as quantum intelligence, cybersecurity, sensing and imaging solutions, providing QCi with a unique position in the marketplace. QCi's core entropy computing capability, the Dirac series, delivers solutions for both binary and integer-based optimization problems using over 11,000 qubits for binary problems and over 1,000 (n=64) qubits for integer-based problems, each of which are the highest number of variables and problem size available in quantum computing today. Using the Company's core quantum methodologies, QCi has developed specific quantum applications for AI, cybersecurity, and remote sensing, including its Reservoir Photonic Computer series (intelligence), reprogrammable and non-repeatable Quantum Random Number Generator (cybersecurity) and LiDAR and Vibrometer (sensing) products. For more information about QCi, visit http://www.quantumcomputinginc.com.

About Quantum Innovative Solutions

Quantum Innovative Solutions (QI Solutions or QIS), a wholly owned subsidiary of Quantum Computing Inc., is an Arizona-based supplier of quantum technology solutions and services to the government and defense industries. With a team of qualified and cleared staff, QIS delivers a range of solutions from entropy quantum computing to quantum communications and sensing, backed by expertise in logistics, manufacturing, R&D and training. The company is exclusively focused on delivering tailored solutions for partners in various government departments and agencies.

Important Cautions Regarding Forward-Looking Statements

This press release contains forward-looking statements as defined within Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. By their nature, forward-looking statements and forecasts involve risks and uncertainties because they relate to events and depend on circumstances that will occur in the near future. Those statements include statements regarding the intent, belief or current expectations of QCi and members of its management as well as the assumptions on which such statements are based. Prospective investors are cautioned that any such forward-looking statements are not guarantees of future performance and involve risks and uncertainties, and that actual results may differ materially from those contemplated by such forward-looking statements.

QCi undertakes no obligation to update or revise forward-looking statements to reflect changed conditions. Statements in this press release that are not descriptions of historical facts are forward-looking statements relating to future events, and as such all forward-looking statements are made pursuant to the Private Securities Litigation Reform Act of 1995. Statements may contain certain forward-looking statements pertaining to future anticipated or projected plans, performance and developments, as well as other statements relating to future operations and results. Words such as "may," "will," "expect," "believe," "anticipate," "estimate," "intends," "goal," "objective," "seek," "attempt," "aim to," or variations of these or similar words, identify forward-looking statements. These risks and uncertainties include, but are not limited to, those described in Item 1A of QCi's Annual Report on Form 10-K and other factors as may periodically be described in QCi's filings with the U.S. Securities and Exchange Commission.

SOURCE Quantum Computing Inc.

Visit link:
Quantum Computing Inc. Selects Tempe, Arizona as the Site for its ... - PR Newswire