
The quantum revolution: The race to build a quantum computer – Financial Times

Tech companies including Google, Microsoft and IBM are all working on plans for a commercially viable quantum computer. They say that these machines will be able to solve climate change, help develop new pharmaceutical drugs and transform our economy. But harnessing quantum physics requires overcoming massive challenges.

As researchers tinker away on uber-sensitive, ultra-cold quantum computers, and investors become increasingly interested in the potential commercial applications, some people in the quantum computing world aren't buying the hype.

In this episode of Tech Tonic, FT innovation editor John Thornhill travels to the West Coast to visit Julie Love and Krysta Svore, both of Microsoft's quantum computing programme, and tours Google's quantum computing lab with engineer Erik Lucero. We hear from Bessemer Venture Partners investor David Cowan, and FT artificial intelligence editor Madhumita Murgia talks to long-time quantum computing researcher Sankar Das Sarma.

Presented by Madhumita Murgia and John Thornhill, produced by Josh Gabert-Doyon and Edwin Lane. Executive producer is Manuela Saragosa. Sound design by Breen Turner and Samantha Giovinco. The FT's head of audio is Cheryl Brumley.

We're keen to hear more from our listeners about this show and want to know what you'd like to hear more of, so we're running a survey which you can find at ft.com/techtonicsurvey. It takes about 10 minutes to complete and you will be in with a chance to win a pair of Bose QuietComfort Earbuds.

Read a transcript of this episode on FT.com



Here are the top Dutch-based Quantum technology startups to watch … – Silicon Canals

Image credit: Delft Circuits

The concepts of quantum mechanics, created in the early 20th century to describe nature at the size of atomic and subatomic particles, serve as the foundation for quantum technology.

Applications in encrypted transmission, disaster management through improved prediction, computers, simulations, science, medicine, cryptography, and medical imaging are just a few examples of how quantum technology can be used.

Even though the definitions can confuse the layman, the use of quantum technology in our everyday lives is all-pervasive. From phones and computers to televisions and cars, applications of quantum technology can be found everywhere.

The Netherlands has of late become a hub of quantum technology research and application. Many startups in the country are working in the field to make the technology accessible and useful to humanity.

We have listed out the top quantum technology startups in the Netherlands. Do take a look:

Fermioniq is a quantum software company based in Amsterdam. Co-founded by Jorgen Sandig, Ido Niesen, Chris Cane, and Prof. Harry Buhrman, it is one of the top Dutch-based quantum technology startups, with a highly competitive team.

The company makes software designed to run specifically on quantum computers.

QuiX Quantum is a photonic quantum technology startup based in Enschede, the Netherlands. Dr. Jelmer Renema, a specialist in quantum photonics, and a group of professors from the University of Twente formed the company in January 2019 with the help of Dr. Hans van den Vlekkert, a seasoned businessman and veteran of the photonics sector.

The company counts RAPH2Invest, FORWARD.one and Oost NL among its investors.

The company's product portfolio includes:

Founded in July 2021, QuantWare is a quantum technology startup with the single goal of growing the field, democratising hardware, and advancing the usability of the quantum computer.

The Dutch company focuses on the quickest route to practical quantum computation, building on its unrivalled know-how in scaling up superconducting QPUs. They are a cooperative business that collaborates with industry leaders and specialists to provide complementary solutions.

Their products include:

Qu & Co creates quantum software and algorithms. The company wants to make quantum advances while maintaining outstanding standards of objectivity.

The first quantum computing platform designed exclusively for chemistry and materials science, QUBEC, is made available to clients of Qu & Co.

QphoX is a quantum transduction company. The first quantum modem developed by QphoX will link quantum computers in a quantum network. The foundation of the upcoming quantum internet will be their technology.

Through optical interconnects operating at room temperature, it will enable remote communication between quantum computers. They use a mechanical intermediary resonator to couple microwave and optical photons to create a quantum transducer. This synchronised, reversible method is based on the piezoelectric and optomechanical effects.

Quantum chips must be conceived, made, and tested in order to achieve a commercial quantum advantage. Orange Quantum Systems is developing the test system and protocols to get important information on quantum chip performance.

The company also provides diagnostic tools to improve quantum chip design.

To ensure high-quality research in quantum technologies, 39 research funding organisations have come together to form QuantERA.

The programme's objectives are to:

Delft Circuits brings quantum technology to life in collaboration with their clients as an independent, committed supplier of quantum hardware. The company's products include:

By mapping exposures and executing real-time attack detection with their Identity Threat Detection and Response (ITDR) solution suite, QuompleX decreases cyber risk and attack surfaces.

The firm's participants include researchers from the Netherlands, Italy, and Austria.


Leibniz QIC’s Mission to Coax Qubits and Bits to Work Together – HPCwire

Four years after passage of the U.S. National Quantum Initiative Act, and decades after early quantum development and commercialization efforts started (think D-Wave Systems and IBM, for example), the U.S. quantum landscape has become a roiling cauldron of diverse activity. It's perhaps too easy to forget that the U.S. is hardly alone in catching the quantum bug. Europe has also jumped into the fray, as have China, Japan, Canada, Australia and many others. No one wants to miss out on what could globally become a transformational technology.

One area that Europe is tackling sooner than the U.S. is work to fully integrate quantum computing with traditional HPC infrastructure. While there's an emerging consensus worldwide that quantum computing is likely to become just another accelerator in the heterogeneous advanced computing architecture, Europe is taking deliberate steps now to make this a reality, and the Quantum Integration Center (QIC) at the Leibniz Supercomputing Center (LRZ) is an illustrative example.

Now turning two years old, the Leibniz QIC has two major objectives. It's meant to be a user facility providing access to a variety of quantum hardware and software and to assist in their development. Nothing new there, and it's still early on in standing up quantum systems. The second mission goal is to integrate quantum computing with Leibniz's traditional computing infrastructure, which includes new AI technologies such as a Cerebras system.

The broad idea is that in the future, Leibniz Supercomputing Center users may submit jobs and not necessarily know which of the underlying hardware options are doing the number crunching. Quantum will be another accelerator in a mix of accelerators ready for work. Creating the blended infrastructure to do that efficiently is at the core of the Leibniz QIC's mandate.
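As a rough illustration of that idea (entirely our own sketch; none of these names come from LRZ's actual software stack), a scheduler can route jobs by required capability rather than by a user-named machine:

```python
# Hypothetical sketch of accelerator-agnostic job routing (names and API are
# illustrative, not LRZ's actual scheduler). A job declares the capability it
# needs; the scheduler, not the user, picks the backend.
def submit(job: dict, backends: dict) -> str:
    """Return the name of the first backend advertising the needed capability."""
    for name, capabilities in backends.items():
        if job["needs"] in capabilities:
            return name
    raise RuntimeError(f"no backend offers {job['needs']}")

backends = {
    "cpu-cluster": {"linear-algebra", "monte-carlo"},
    "qpu-5q": {"qubit-sampling"},
}
print(submit({"needs": "qubit-sampling"}, backends))  # qpu-5q
```

The user submits the same way regardless of target; the routing decision stays inside the scheduler, which is the essence of treating a QPU as just another accelerator.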

We are responsible for what we're calling the Munich Quantum Software stack: that is, being able to develop needed algorithms and software tools all the way through to running and managing applications on quantum resources and incorporating HPC. The HPC-QC integration is a big part of this. Also, we'll develop this capability in a qubit-modality-agnostic way, said Laura Schulz, head of Leibniz QIC, who was part of the team that wrote the strategic plan for the QIC.

At the end of the day, our users should be able to utilize this technology with the simplest, cleanest path available. Some users will care about what system they're actually on, and will want to be able to fine-tune the pulses on those quantum systems. Then you've got the other end of the spectrum: users, like many HPC users, that are not going to care as much about what they're computing on; they're going to care more about getting the performance.

Ambitious goals. Schulz recently briefed HPCwire on Leibniz QIC plans and progress. A key early milestone, said Schulz, is demonstrating a working HPC-QC stack.

We've got an early quantum system (5-qubit) and we have an HPC test center [that together comprise] a great testbed, literally sitting in the same room. If you came into the room, you would see them literally next to each other. The first milestone is making sure that these systems are connected and that we can send jobs through the HPC to the QPU. It comes back to work on software development. We want to have these systems in place and to have the software to enable interaction: not two different software stacks running independently, but a single source software. Then we'll get progressively better as we get other systems in, she said.

Though these are still early days, the Leibniz QIC has been growing rapidly. On the hardware side, it currently has a 5-qubit superconducting processor from IQM, simulators from Atos (QLM) and Intel (IQS), and will add more QPUs and types. For example, there's a 20-qubit system coming as part of the Q-Exa Project. At the moment, Leibniz QIC is focused on superconducting-based quantum processors, but the broad goal is to avoid being locked into a single qubit technology.

We're waiting on the neutral atoms; those are a little bit further down the timeline. For us, right now, it's superconducting because it offers great opportunities for scaling. Each of these systems has its own flavor and benefits. Superconducting is great for scalability, but it's not as stable. Ion trap has more stability, but you can't quite scale it as much.

We have these different systems that we're building up, and we are going on the postulate that we're going to have multiple types of QPUs in the ecosystem; there's not going to be one winner, right, and the technology is too new to bank on any one particular [approach]. But by having a suite of different types of modalities around, we'll be able to experiment, said Schulz, who was selected this year as an HPCwire Person to Watch.

It's worth noting the wide range of the Leibniz QIC's constituency. It is one of six European supercomputing centers involved in Europe's quantum computing development effort. It is also part of the Munich Quantum Valley (MQV). Here's how MQV describes itself:

As a hub between research, industry, funders, and the public, Munich Quantum Valley (MQV) is the crystallization point for the development of the full spectrum of quantum technologies. It promotes an efficient knowledge transfer from research to industry, establishes a network with international reach and provides tailor-made education and training opportunities in the fields of quantum science and technology.

Harnessing three of the most promising technology platforms (superconducting, neutral-atom, and trapped-ion qubit systems), Munich Quantum Valley will develop and operate competitive quantum computers in Bavaria. In a unique holistic approach, researchers develop all layers, from hardware and software up to applications.

The Munich Quantum Valley collaboration unites research capacities and technology transfer power of three major universities and key research organizations: the Bavarian Academy of Sciences and Humanities (BAdW), the Fraunhofer-Gesellschaft (FhG), the Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU), the German Aerospace Center (DLR), the Ludwig-Maximilians-Universität München (LMU), the Max Planck Society (MPG), and the Technical University of Munich (TUM). Their joint work will advance quantum technologies at all levels for future use in science, research and industrial applications.

Think Silicon Valley focused on quantum. Perhaps more than in the U.S., the interplay between industry and government-funded programs is fundamental. For example, the MQV interchange with the Leibniz QIC is extensive, said Schulz.

I haven't paid as much attention to the American situation as I should. What impresses me about what I see happening in Europe is this early dedication to HPC-QC integration. We know that quantum is going to have to be trusted, and have to be fortified, and it's going to come in to the supercomputing realm. I mean, quantum is high performance computing, right. It's going to end up as another accelerator capability.

The other thing that I've noticed is the partnership with industry. While there is some of the early hype, some overly ambitious promises and all, what I'm seeing, trend-wise, is that the conversation is more tempered. We realize that there are a lot of possibilities here, but also realize we've got several steps to go to get to that potential promise. The companies that we've been interacting with have that mentality; they understand that the possibility is there, and they're doing these proof-of-concept projects, said Schulz.

Over time, of course, the market will determine which development approaches win. China has embarked on an aggressive centralized plan. The U.S. has a blend of DOE-funded National QIS Research Centers and a vigorous separate commercial quantum development community. Europe has a Quantum Technologies Flagship program and the European High-Performance Computing Joint Undertaking (EuroHPC JU) which named LRZ as a quantum site.

At ground level, Schulz is busily ramping up the Leibniz QIC. Staffing has been a challenge. Headcount is currently ~24 and headed north of 40 by year-end, hopes Schulz, who is actively hiring. It's a multi-discipline group, with its share of physicists, but software workers currently comprise around 50 percent of the team. There are also many external collaborations within the MQV.

Said Schulz, My team is kind of this microcosm of the community. I've got computer scientists, software developers, electrical engineers, and quantum physicists, experimental and theoretical. I have this really nice little community that represents this bigger picture. What's funny is some of the issues that we face as a team. We were just doing all of our annual reviews, and the HPC people were saying, I'm good on the quantum, more or less, but need to know a lot more about how this works. The quantum people were like, I really need to understand HPC more; for example, how does the scheduling work? We're cross-training each other within our own team, to ensure that everybody has a baseline to understand how this comes together smartly.

Mixing quantum computing and traditional HPC in the same facility has also prompted new challenges.

Schulz said, With HPC, and energy efficiency, the whole infrastructure is already complex and has evolved on its own. But now we've got these cryostats that we're taking care of, and we're having to change out the nitrogen on a particular schedule. We're having to learn how to calibrate these things and how to maintain the calibration. We're having to learn a whole new set of operational programs. We have to worry about all these other external factors: humidity, temperature, electromagnetic radiation. This is a new instrument that we have for the compute, and we have to figure out how to bring it into an HPC center.

Some of this technology is coming straight out of the physics labs, and we're going to startup companies for some of the early pieces of this. We've got to try to help them understand what it's going to take, in their evolution and their form factors and their stability, to be able to leave the system alone and have it function with the same level of caretaking as an HPC system.

During this developmental stage, creating a hybrid quantum-classical HPC infrastructure at Leibniz QIC is an all-hands-on-deck enterprise. That's not practical long-term, said Schulz. We want to get to the point where we don't have to have trained and skilled experimental physicists on staff 24/7. That's a little extreme, to have dedicated experimental physicists taking care of these systems. We see ourselves, at this point, helping the maturation of these technologies to where they can exist in an HPC center. Hopefully, as these systems become more commercially viable, we can help them exist in a commercial market space.

Lately there has been buzz around chasing quantum advantage: the notion of using quantum technology to beat classical computing in a commercial application. Schulz urges both patience and a change in thinking.

When I hear quantum advantage, I know it's usually used as a metric for beating a classical computer in a particular application. I want to challenge that and suggest that there are other ways we should be thinking about quantum advantage. I think that for particular types of algorithms, for particular applications, or parts of an application, quantum is going to be fantastic, or has the potential to be fantastic. However, that's when we have a real-world application, an assembly of algorithms involved, and all of that has to work together. Quantum may be able to take part of that load off of this overall application.

I'm looking at it from the HPC-QC integration side and how all of this works together. So, I'm thinking about what the HPC-QC advantage is. What does that mean? I mentioned things like energy. So, energy may end up being an advantage for quantum over HPC. There are other parameters that we should be thinking about. I know that everybody's been shooting for that (quantum advantage). I think that that's going to be a bit farther out. Let's be honest, you know, there are a lot of friction points that we have to sort out along the way.

It will be interesting to monitor how QC-HPC integration efforts proceed at Leibniz QIC. The notion of a single stack able to manage multiple qubit modalities and systems and traditional HPC resources together, seamlessly, is enticing. Stay tuned.


What is the National Cybersecurity Strategy? A cybersecurity expert explains what it is and what the Biden administration has changed – The…

The Biden administration released its first National Cybersecurity Strategy on March 2, 2023. The last version was issued in 2018 during the Trump administration.

As the National Security Strategy does for national defense, the National Cybersecurity Strategy outlines a president's priorities regarding cybersecurity issues. The document is not a directive. Rather, it describes in general terms what the administration is most concerned about, who its major adversaries are and how it might achieve its goals through legislation or executive action. These types of strategy statements are often aspirational.

As expected, the 2023 Biden National Cybersecurity Strategy reiterates previous recommendations about how to improve American cybersecurity. It calls for improved sharing of information between the government and private sector about cybersecurity threats, vulnerabilities and risks. It prescribes coordinating cybersecurity incident response across the federal government and enhancing regulations. It describes the need to expand the federal cybersecurity workforce. It emphasizes the importance of protecting the country's critical infrastructure and federal computer systems. And it identifies China, Russia, Iran and North Korea as America's main adversaries in cyberspace.

However, as a former cybersecurity industry practitioner and current cybersecurity researcher, I think that the 2023 document incorporates some fresh ideas and perspectives that represent a more holistic approach to cybersecurity. At the same time, though, some of what is proposed may not be as helpful as envisioned.

Some of the key provisions in the current National Cybersecurity Strategy relate to the private sector, both in terms of product liability and cybersecurity insurance. It also aims to reduce the cybersecurity burden on individuals and smaller organizations. However, I believe it doesn't go far enough in fostering information-sharing or addressing the specific tactics and techniques used by attackers.

For decades, the technology industry has operated under what is known as shrink-wrap licensing. This refers to the multiple pages of legal text that customers, both large and small, routinely are forced to accept before installing or using computer products, software and services.

While much has been written about these agreements, such licenses generally have one thing in common: They ultimately protect vendors such as Microsoft or Adobe from legal consequences for any damages or costs arising from a customers use of their products, even if the vendor is at fault for producing a flawed or insecure product that affects the end user.

In a groundbreaking move, the new cybersecurity strategy says that while no product is totally secure, the administration will work with Congress and the private sector to prevent companies from being shielded from liability claims over the security of their products. These products underpin most of modern society.

Removing that legal shield is likely to encourage companies to make security a priority in their product development cycles and have a greater stake in the reliability of their products beyond the point of sale.

In another noteworthy shift, the strategy observes that end users bear too great a burden for mitigating cybersecurity risks. It states that a collaborative approach to cybersecurity and resiliency cannot rely on the constant vigilance of our smallest organizations and individual citizens. It stresses the importance of manufacturers of critical computer systems, as well as companies that operate them, in taking a greater role in improving the security of their products. It also suggests expanded regulation toward that goal may be forthcoming.

Interestingly, the strategy places great emphasis on the threat from ransomware as the most pressing cybercrime facing the U.S. at all levels of government and business. It now calls ransomware a national security threat and not simply a criminal matter.

The new strategy also directs the federal government to consider taking on some responsibility for so-called cybersecurity insurance.

Here, the administration wants to ensure that insurance companies are adequately funded to respond to claims following a significant or catastrophic cybersecurity incident. Since 2020, the market for cybersecurity-related insurance has grown nearly 75%, and organizations of all sizes consider such policies necessary.

This is understandable given how many companies and government agencies are reliant on the internet and corporate networks to conduct daily operations. By protecting, or backstopping, cybersecurity insurers, the administration hopes to prevent a major systemic financial crisis for insurers and victims during a cybersecurity incident.

However, cybersecurity insurance should not be treated as a free pass for complacency. Thankfully, insurers now often require policyholders to prove they are following best cybersecurity practices before approving a policy. This helps protect them from issuing policies that are likely to face claims arising from gross negligence by policyholders.

In addition to dealing with present concerns, the strategy also makes a strong case for ensuring the U.S. is prepared for the future. It speaks about fostering technology research that can improve or introduce cybersecurity in such fields as artificial intelligence, critical infrastructure and industrial control systems.

The strategy specifically warns that the U.S. must be prepared for a post-quantum future where emerging technologies could render existing cybersecurity controls vulnerable. This includes current encryption systems that could be broken by future quantum computers.
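To see why quantum computers threaten current encryption, note that widely deployed public-key schemes such as RSA rely on factoring large integers being classically infeasible, a barrier that Shor's algorithm on a sufficiently large quantum computer would remove. A toy sketch (ours, not from the article) of the classical difficulty:

```python
# Classical brute-force factoring of an RSA-style modulus. Runtime grows
# roughly with sqrt(n), which is hopeless for 2048-bit moduli but trivial
# for this toy example. (Illustrative only; real RSA uses far larger numbers.)
def trial_division(n: int) -> tuple[int, int]:
    """Return a factor pair of n, smallest factor first."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

print(trial_division(3233))  # (53, 61) -- the toy modulus 3233 = 53 * 61
```

A quantum computer running Shor's algorithm would factor such moduli in polynomial time, which is why the strategy urges preparing post-quantum replacements now.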

While the National Cybersecurity Strategy calls for continuing to expand information-sharing related to cybersecurity, it pledges to review federal classification policy to see where additional classified access to information is necessary.

The federal government already suffers from overclassification, so if anything, I believe less classification of cybersecurity information is needed to facilitate better information-sharing on this issue. It's important to reduce administrative and operational obstacles to effective and timely interaction, especially where collaborative relationships are needed between industry, academia and federal and state governments. Excessive classification is one such challenge.

Further, the strategy does not address the use of cyber tactics, techniques and procedures in influence or disinformation campaigns and other actions that might target the U.S. This omission is perhaps intentional because, although cybersecurity and influence operations are often intertwined, reference to countering influence operations could lead to partisan conflicts over freedom of speech and political activity. Ideally, the National Cybersecurity Strategy should be apolitical.

That being said, the 2023 National Cybersecurity Strategy is a balanced document. While in many ways it reiterates recommendations made since the first National Cybersecurity Strategy in 2002, it also provides some innovative ideas that could strengthen U.S. cybersecurity in meaningful ways and help modernize Americas technology industry, both now and into the future.


Quantum engineers have designed a new tool to probe nature with extreme sensitivity – Newswise

Newswise: In a paper published over the weekend in the journal Science Advances, Associate Professor Jarryd Pla and his team from the UNSW School of Electrical Engineering and Telecommunications, together with colleague Scientia Professor Andrea Morello, described a new device that can measure the spins in materials with high precision.

The spin of an electron, whether it points up or down, is a fundamental property of nature, says A/Prof. Pla. It is used in magnetic hard disks to store information, MRI machines use the spins of water molecules to create images of the inside of our bodies, and spins are even being used to build quantum computers.

Being able to detect the spins inside materials is therefore important for a whole range of applications, particularly in chemistry and biology where it can be used to understand the structure and purpose of materials, allowing us to design better chemicals, drugs and so on.

In fields of research such as chemistry, biology, physics and medicine, the tool that is used to measure spins is called a spin resonance spectrometer. Normally, commercially produced spectrometers require billions to trillions of spins to get an accurate reading, but A/Prof. Pla and his colleagues were able to measure electron spins on the order of thousands, meaning the new tool was about a million times more sensitive.

This is quite a feat, as there are a whole range of systems that cannot be measured with commercial tools, such as microscopic samples, two-dimensional materials and high-quality solar cells, which simply have too few spins to create a measurable signal.
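The "million times" figure follows directly from the spin counts quoted above; a quick order-of-magnitude check (using round numbers of our own choosing, not exact figures from the study):

```python
# Order-of-magnitude check of the claimed sensitivity gain.
commercial_spins = 1e9   # lower end of "billions to trillions" (our round figure)
new_device_spins = 1e3   # "on the order of thousands" (our round figure)
gain = commercial_spins / new_device_spins
print(f"~{gain:.0e}x more sensitive")  # ~1e+06x, i.e. about a million times
```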

The breakthrough came about almost by chance, as the team were developing a quantum memory element for a superconducting quantum computer. The objective of the memory element was to transfer quantum information from a superconducting electrical circuit to an ensemble of spins placed beneath the circuit.

We noticed that while the device didn't quite work as planned as a memory element, it was extremely good at measuring the spin ensemble, says Wyatt Vine, a lead author on the study. We found that by sending microwave power into the device as the spins emitted their signals, we could amplify the signals before they left the device. What's more, this amplification could be performed with very little added noise, almost reaching the limit set by quantum mechanics.

While other highly sensitive spectrometers using superconducting circuits had been developed in the past, they required multiple components, were incompatible with magnetic fields and had to be operated in very cold environments using expensive dilution refrigerators, which reach temperatures down to 0.01 Kelvin.

In this new development, A/Prof. Pla says he and the team managed to integrate the components on a single chip.

Our new technology integrates several important parts of the spectrometer into one device and is compatible with relatively large magnetic fields. This is important, since to measure the spins they need to be placed in a field of about 0.5 Tesla, which is ten thousand times stronger than the earth's magnetic field.

Further, our device operated at a temperature more than 10 times higher than previous demonstrations, meaning we don't need to use a dilution refrigerator.
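The "ten thousand times" comparison in the quote checks out if one takes the earth's field at roughly 50 microtesla (a commonly cited round value, our assumption; the exact figure varies by location):

```python
measurement_field_tesla = 0.5   # field needed to measure the spins (from the article)
earth_field_tesla = 50e-6       # ~50 microtesla, a typical round value (our assumption)
ratio = measurement_field_tesla / earth_field_tesla
print(f"{ratio:.0f}x stronger")  # 10000x stronger
```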

A/Prof. Pla says the UNSW team has patented the technology with a view to potential commercialisation, but stresses that there is still work to be done.

There is potential to package this thing up and commercialize it, which will allow other researchers to plug it into their existing commercial systems to give them a sensitivity gain.

If this new technology were properly developed, it could help chemists, biologists and medical researchers, who currently rely on tools made by large tech companies that work, but which could be orders of magnitude better.


The philosopher: A conversation with Grady Booch – InfoWorld

Grady Booch is a unique voice in computing, whose contributions encompass a wide range of interests. He introduced the Booch method, which led to his co-creation of the Unified Modeling Language. He also helped usher in the use of design patterns and agile methods and has written a large corpus of books and articles addressing software engineering and software architecture. Today, he is chief scientist for software engineering at IBM Research and is creating a documentary exploring the intersection of computing and what it means to be human at Computing: The Human Experience.

Our recent conversation touched on both practical and philosophical aspects of human-computer interaction and co-evolution, artificial intelligence, quantum machines, and Web3.

Tyson: Thanks for the chance to talk, Grady!

There's so much to cover. Let me begin by asking something "of the moment." There has been an almost cultural war between object-oriented programming and functional programming. What is your take on this?

Booch: I had the opportunity to conduct an oral history with John Backus, one of the pioneers of functional programming, in 2006 on behalf of the Computer History Museum. I asked John why functional programming didn't enter the mainstream, and his answer was perfect: "Functional programming makes it easy to do hard things," he said, "but functional programming makes it very difficult to do easy things."

Functional programming has a role to play: many web-centric software-intensive systems at global elastic scale are well served by having some elements written in stateless form, and that's precisely what functional programming is good for. But remember this: that's still only a part of those systems, and furthermore, there is much, much more to the world of computing than web-centric systems at global elastic scale.

Tyson: Okay, let me leap across from the specific to the general: what is software? What is a computer? Why are these seemingly obvious things so significant?

Booch: If you were to have asked me that question at the turn of the century, the start of the 1900s, I mean, I would have said a computer is a person who computes, and as for software, I would have no idea what you meant. You see, the term computer was at first a person, usually a woman, literally someone who calculated or computed. It wasn't until we began to devise machines in the mid 1900s that we replaced the activity of those squishy organic computers with relays, vacuum tubes, and eventually transistors.

Even if we consider the Turing test, Alan had in mind the question of whether we could build a machine that duplicated the ability of a human to think. As for the term software, its etymology tells us a great deal about how astonishingly young the field of computing is. The term digital was first coined by George Stibitz in 1942, and the term software was introduced by John Tukey in 1952. Here's an easy way to distinguish the terms: when something goes wrong, hardware is the thing you kick and software is the thing you yell at.

Tyson: You said in our earlier chat that perhaps the most important outcome of our computing technology is that it compels us to examine what it means to be human. Would you continue that thought?

Booch: The story of computing is the story of humanity. This is a story of ambition, invention, creativity, vision, avarice, and serendipity, all powered by a refusal to accept the limits of our bodies and our minds. As we co-evolve with computing, the best of us and the worst of us is amplified, and along the way, we are challenged as to what it means to be intelligent, to be creative, to be conscious. We are on a journey to build computers in our own image, and that means we have to not only understand the essence of who we are, but we must also consider what makes us different.

Tyson: Babbage said, We may propose to execute, by means of machinery, the mechanical branch of these labours, reserving for pure intellect that which depends on the reasoning faculties. Where are we at in that journey?

Booch: Actually, I think his colleague, Ada Augusta, Countess of Lovelace, better understood the potential of computers than he ever did. "The Analytical Engine does not occupy common ground with mere 'calculating machines,'" she said. "Rather, it holds a position wholly of its own." Ada recognized that the symbols manipulated by machines could mean something more than numbers. The field of computing has made astonishing progress since the time of Babbage and Lovelace and Boole, but still, we are a very young discipline, and in many ways we have just begun.

Tyson: Speaking of Babbage does lead naturally to Ada Lovelace. I notice a strong thread in your work of pointing out the sometimes hidden role women play in moving us forward. How do you think we as a society are doing on that front?

Booch: Poorly. There was a time in the earliest days of computing when women played a far larger role. Annie Jump Cannon was the lead among the Harvard Computers in the 1800s; the ENIAC was programmed mainly by five women; Grace Hopper pioneered the idea of compilers and high-order programming languages. Sadly, a variety of economic and social and political forces have reduced the number of women in the ranks of computing. A dear colleague, Mar Hicks, has written extensively on these factors. We must do better. Computing impacts individuals, communities, societies, civilizations, and as such there must be equitable representation of all voices to shape its future.

Tyson: AI, especially conversational AI, has really taken off recently. What do you think is the next phase in that story?

Booch: Remember ELIZA from the mid-1960s? This was an early natural language system that absolutely astonished the world in its ability to carry out Rogerian therapy, or at least a fair illusion of it. We've come a long way, owing to a perfect storm: the rise of abundant computational resources, the accumulation of vast lakes of data, and the discovery of algorithms for neural networks, particularly a recent architecture called a transformer. In many ways, the recent advances we have seen with systems such as ChatGPT, Bard, and (in the visual world) DALL-E and Stable Diffusion have come about by applying these three elements at scale.

The field of artificial intelligence has seen a number of vibrant springs and dismal winters over the years, but this time it seems different: there are a multitude of economically-interesting use cases that are fueling the field, and so in the coming years we will see these advances weave themselves into our world. Indeed, AI already has: every time we take a photograph, search for a product to buy, interact with some computerized appliance, we are likely using AI in one way or another.

Chat systems will incrementally get better. But, that being said, we are still generations away from creating synthetic minds. In that journey, it is important that we consider not just what our machines can do, but what they do to us. As Allen Newell, one of the early pioneers of artificial intelligence, noted, computer technology offers the possibility of incorporating intelligent behavior in all the nooks and crannies of our world. With it, we could build an enchanted world. To put it somewhat poetically, software is the invisible writing that whispers the stories of possibility to our hardware, and we are the storytellers. It's up to us to decide if those stories amplify us, or diminish us.

Tyson: Quantum computing is alongside AI in terms of its revolutionary potential. Do you think we'll have a similar breakthrough in quantum computers anytime soon?

Booch: The underlying assumption of science is that the cosmos is understandable; the underlying assumption of computing is that the cosmos is computable. As such, from the lens of computing, we can imagine new worlds, but to make those things manifest, we must make programs that run on physical machines. As such, we must abide by the laws of physics, and quantum computing, at this current stage in its development, is mostly trying to find ways to work within those laws.

Two things I must mention. First, quantum computing is a bit of a misnomer: we don't store information in its quantum state for very long, we just process it. As such, I prefer the term quantum processing, not quantum computing. Second, theoretically, non-quantum computers and quantum devices are Turing equivalent. They both have the same computational potential, and each has particular advantages and efficiencies, with very different scalability, latency, resiliency, correctness, and risk. Quantum machines are particularly good at attacking what are called NP problems, problems that grow harder and harder as their size increases. As for breakthroughs, I prefer to see this as a world of steady, continuous, incremental progress, advancing by solving some very hard problems of physics and engineering.

Tyson: Quantum computing leads me to cryptography, where, almost as a side effect, it is able to attack public-key algorithms. I get the sense you are wary of blockchain's ethics. Would you talk a bit about cryptography and Web3?

Booch: Web3 is a flaming pile of feces orbiting a giant dripping hairball. Cryptocurrencies, ones not backed by the full faith and credit of stable nation states, have only a few meaningful use cases, particularly if you are a corrupt dictator of a nation with a broken economic system, or a fraud and scammer who wants to grow their wealth at the expense of greater fools. I was one of the original signatories of a letter to Congress in 2022 for a very good reason: these technologies are inherently dangerous, they are architecturally flawed, and they introduce an attack surface that threatens economies.

Tyson: You said, "I hope we will also see some normalization with regards to the expectations of large language models." Would you elaborate?

Booch: I stand with Gary Marcus, Timnit Gebru, and many others in this: large language models such as GPT and its peers are just stochastic parrots, very clever and useful mechanisms that offer the illusion of coherence but at the expense of having absolutely no degree of understanding. There are indeed useful purposes for LLMs, but at the same time, we must be cognizant of their risks and limitations.

Tyson: What do you make of transhumanism?

Booch: It's a nice word that has little utility for me other than as something people use to sell books and to write clickbait articles. That being said, let's return to an earlier theme in our interview: what it means to be human. Conscience, sentience, sapience are all exquisite consequences of the laws of physics. It is likely that the cosmos is teeming with life; it is also likely that sentient life is a rare outcome; it is also unlikely that, in the fullness of time of the cosmos, we are the only sentient beings. That being said, we, you, me, everyone reading this, are sentient beings, born of star-stuff and able to contemplate ourselves. That, for me, is enough.

Tyson: Do you think well ever see conscious machines? Or, perhaps, something that compels us to accept them as such?

Booch: My experience tells me that the mind is computable. Hence, yes, I have reason to believe that we will see synthetic minds. But not in my lifetime; or yours; or your children; or your children's children. Remember, also, that this will likely happen incrementally, not with a bang, and as such, we will co-evolve with these new species.

Tyson: Everyone should look at your lists of books you've read. Knowing that you've read A Universe of Consciousness gives me permission to ask: Do you hold a materialist viewpoint? (Or, falling completely into the realm of philosophy, what is consciousness?)

Booch: Let me put it this way: I have reason to believe I am conscious and sentient; I have reason to believe that you are, as well, because my theory of mind yields a consistency in our being. Reflecting Dennett's point of view, consciousness is an illusion, but it is an exquisite illusion, one that enables me to see and be seen, know and be known, love and be loved. And for me, that is enough.

See the rest here:
The philosopher: A conversation with Grady Booch - InfoWorld

Read More..

Is the Indian govt developing quantum cyber-attack proof systems? – MediaNama.com

In response to a parliamentary question, the Indian government did not provide an explicit answer to whether it has developed systems capable of withstanding quantum cyber-attacks under the National Mission on Quantum Technologies and Applications. Responding to Congress MP Shashi Tharoor's written question on the matter yesterday, the Ministry of Science and Technology simply said: "Yes Sir. Government is developing a proposal to initiate National Mission on Quantum Technologies & Applications with the objectives of building capabilities in Quantum Computing, Quantum Communication, Quantum Materials, Quantum Sensing and metrology." The Ministry added that the details and deliverables of the Mission, announced in the 2020-21 Budget, are yet to be finalised.

Wait, what's quantum computing? Without delving too deeply into its physics, "quantum" here refers to quantum mechanics, which is what the computer system uses to calculate an output. So, essentially, quantum computers use concepts of quantum physics and apply them to computers. The hope is that they'll help systems process things super fast. Or, as WIRED puts it, "the basic idea is to smash some barriers that limit the speed of existing computers by harnessing the counterintuitive physics of subatomic scales".

Right, so why do quantum cyberattacks matter? Right now, much of the electronic information we send online is encrypted. This means information is coded to be unintelligible until the recipient gets it and decrypts or decodes the message using a specific key. Doing this ensures that a "computer system's information can't be stolen and read by someone who wants to use it for
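The encrypt-with-a-key idea described above can be sketched in a few lines. This is a deliberately simplified toy of my own (not anything the mission or real systems use): a message XORed with a secret key is unintelligible in transit, and the same key recovers it. Production traffic relies on vetted ciphers, and it is the public-key schemes in particular that a large quantum computer running Shor's algorithm could break.

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the repeating key; applying it twice undoes it."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

message = b"quantum mission budget details"
key = b"secret-key"

ciphertext = xor_cipher(message, key)    # unreadable without the key
recovered = xor_cipher(ciphertext, key)  # the same key decodes it

assert ciphertext != message
assert recovered == message
```

The symmetry of XOR is what makes one function serve as both encryptor and decryptor; real ciphers keep the same encrypt/decrypt-with-a-key shape while being vastly harder to invert without the key.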

Read more:
Is the Indian govt developing quantum cyber-attack proof systems? - MediaNama.com

Read More..

Quantum engineering meets nanoscale data processing: Unleashing the power of light-driven conductivity control – Phys.org

by Kosala Herath and Malin Premaratne , Phys.org

Over the past few decades, the field of data processing and transferring technology has advanced at a rapid pace. This growth can be attributed to Moore's Law, which predicts that the number of transistors on a microchip will double roughly every two years, enabling the semiconductor industry to make electronic devices smaller, faster, and more efficient.
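The doubling rule quoted above is easy to put into numbers. As a back-of-the-envelope sketch (my own illustration, not from the article): if transistor counts double roughly every two years, a chip starting at N transistors reaches N * 2^(t/2) after t years.

```python
def transistors_after(initial: int, years: float) -> float:
    """Moore's-law projection: counts double roughly every two years."""
    return initial * 2 ** (years / 2)

# Ten years of doubling every two years is five doublings: a 32x increase.
assert transistors_after(1_000_000, 10) == 32_000_000
```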

However, electronic interconnections have presented challenges in data transferring due to delays and thermal issues that limit capacity. In an effort to address this issue, researchers have turned to the use of optical waves instead of electronic signals. Optical waves offer significant information-carrying capacity and minimal loss, but the challenge lies in miniaturizing photonic devices as much as electronic components.

Enter plasmonics, a research area that combines microscale photonics and nanoscale electronics to overcome this limitation [1]. Using surface plasmon polaritons (SPPs) to deliver light energy between nanophotonic devices, plasmonics offers a high degree of confinement, overcoming the limitations of conventional dielectric waveguides. With plasmonics, it is possible to manipulate light at the nanoscale and create a world of possibilities for the future of data processing.

However, a significant challenge arises in simultaneously achieving longer propagation lengths and highly confined SPP modes. This is where Floquet engineering comes in. This cutting-edge technology transforms quantum material engineering through high-intensity periodic driving, allowing researchers to manipulate matter in ways previously considered impossible [2, 3].

To address this challenge, we have proposed a comprehensive theoretical framework that uses quantum electrodynamics coupled with Floquet engineering to enhance the propagation length of SPP modes [4]. By modifying the electrical and optical characteristics of metal nanostructures within plasmonic waveguides, it is possible to enhance SPP propagation lengths significantly.

It was observed that exposing a metallic system to this particular form of light (a dressing field) increases the propagation length of SPP modes due to an enhancement in the metal's conductivity. This can be explained by the fact that SPPs are created by the collective movements of electrons in the metal, and the metal's energy loss can be attributed to electron scattering. Dressing by the external field modifies the electron wave functions and, thereby, the scattering rates, which can improve the conductivity and increase the propagation length of SPP modes.
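The qualitative claim, that lower electron scattering means a longer-lived SPP mode, can be checked with a textbook Drude-model estimate. This sketch is my own illustration, not the authors' quantum-electrodynamic framework: it computes the propagation length L = 1 / (2 Im k_spp) at a metal/air interface and shows it growing as the scattering rate gamma shrinks. The frequencies are illustrative, roughly silver-like values.

```python
import cmath

def spp_propagation_length(omega, omega_p, gamma, c=3.0e8):
    """Drude-model SPP propagation length (metres) at a metal/air interface."""
    eps_m = 1 - omega_p**2 / (omega**2 + 1j * gamma * omega)  # metal permittivity
    k_spp = (omega / c) * cmath.sqrt(eps_m / (eps_m + 1))     # air: eps_d = 1
    return 1 / (2 * k_spp.imag)

omega = 2.4e15    # drive frequency, rad/s (near-infrared, illustrative)
omega_p = 1.4e16  # plasma frequency, rad/s (silver-like, illustrative)

# Halving the scattering rate lets the mode travel roughly twice as far.
assert spp_propagation_length(omega, omega_p, gamma=5e13) > \
       spp_propagation_length(omega, omega_p, gamma=1e14) > 0
```

In this simple model the imaginary part of the metal's permittivity, and hence the SPP damping, is proportional to gamma, which is exactly the knob the dressing field is said to turn.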

Finally, the plasmonic and conductive properties of popular plasmonic metals, including silver (Ag), gold (Au), copper (Cu), and aluminum (Al), were analyzed under the illumination of this particular form of light using computer simulations based on code specifically written to explore this novel observation. Based on the evaluation, several metals were found to be well suited for the application due to their controllable conductivity response. This discovery could lead to the development of efficient and advanced nanoscale plasmonic data processing devices, circuits, and components in the near future.

This story is part of Science X Dialog, where researchers can report findings from their published research articles. Visit this page for information about ScienceX Dialog and how to participate.

More information: [1]. Malin Premaratne and Govind P. Agrawal, Theoretical foundations of nanoscale quantum devices, Cambridge University Press (2021). doi.org/10.1017/9781108634472

[2]. Kosala Herath and Malin Premaratne, Generalized model for the charge transport properties of dressed quantum Hall systems, Physical Review B (2022). DOI: 10.1103/PhysRevB.105.035430

[3]. Kosala Herath and Malin Premaratne, Polarization effect on dressed plasmonic waveguides, Emerging Imaging and Sensing Technologies for Security and Defence VII (2022). DOI: 10.1117/12.2635710

[4]. Kosala Herath and Malin Premaratne, Floquet engineering of dressed surface plasmon polariton modes in plasmonic waveguides, Physical Review B (2022). DOI: 10.1103/PhysRevB.106.235422

Kosala Herath is a Ph.D. candidate and a member of the Advanced Computing and Simulations Laboratory (qdresearch.net/) in Electrical and Computer Systems Engineering, Monash University, Australia. He received his B.Sc. degree in Electronic and Telecommunication Engineering (with first-class honors) from the University of Moratuwa, Sri Lanka in 2018. Currently, his research focuses on the fields of nanoplasmonics, low-dimensional electron transport, and Floquet systems. He is a member of SPIE.

Malin Premaratne earned several degrees from the University of Melbourne, including a B.Sc. in mathematics, a B.E. in electrical and electronics engineering (with first-class honors), and a PhD in 1995, 1995, and 1998, respectively. He has been leading the research program in high-performance computing applications to complex systems simulations at the Advanced Computing and Simulation Laboratory, Monash University, Clayton, since 2004. Currently, he serves as the Vice President of the Academic Board of Monash University and a Full Professor. In addition to his work at Monash University, Professor Premaratne is also a Visiting Researcher at several prestigious institutions, including the Jet Propulsion Laboratory at Caltech, the University of Melbourne, the Australian National University, the University of California Los Angeles, the University of Rochester, New York, and Oxford University. He has published more than 250 journal papers and two books and has served as an associate editor for several leading academic journals, including IEEE Photonics Technology Letters, IEEE Photonics Journal, and OSA Advances in Optics and Photonics. Professor Premaratne's contributions to the field of optics and photonics have been recognized with numerous fellowships, including the Fellow of the Optical Society of America (FOSA), the Society of Photo-Optical Instrumentation Engineers USA (FSPIE), the Institute of Physics U.K. (FInstP), the Institution of Engineering and Technology U.K. (FIET), and The Institute of Engineers Australia (FIEAust).

Journal information: Physical Review B

Read the original here:
Quantum engineering meets nanoscale data processing: Unleashing the power of light-driven conductivity control - Phys.org

Read More..

Quantum Leap episode 15 Review: Ben Song to the defense – VULKK.com

Ben Song to the defense has Jen, not Ben, shine in the entire episode 15 of Quantum Leap while Ben and Addison take a backseat.

Episode 15 has Ben Song leap into the body of a young woman once again. This time it's a public defender. Her name is Aleyda Ramirez and she is defending a young boy who was sentenced before he even went to trial.

But it is up to Ben and Team Quantum Leap in the shape of Jen to help and keep this innocent young man from jail. The police apprehended the boy after he threatened a gang member who was trying to goad his little brother into thug life.

However, since there were no witnesses, he's already been convicted before a fair trial happened. Not officially, but Assistant District Attorney Barnes threatens Ben that if he doesn't take the plea deal, he will put his client away for life.

In the meantime, Ben finds out that Aleyda has a girlfriend named Vicky, who works for the District Attorney's office. This is an ethics violation, and if people ever find out, hundreds of cases will have to be reviewed due to conflict of interest. But the two love each other, so they are willing to take that risk. What's love without a little risk, huh?

Then there is the corrupt A.D.A. Barnes, who goes after people with ferocity and doesn't care whether or not the person is innocent. Which makes me wonder how he even got the job. Did nobody ask for an ethics test or something, to check if this person is suited for what he is doing? And Vicky is willing to turn a blind eye because she looks up to him.

This poses a problem when Ben notices a page with notes on witnesses who can exonerate Ben's client. When Ben confronts Vicky, Vicky storms off and, as a gut punch, says happy anniversary. Because it was the day the two got together.

When Ben is in a rut and can't figure out how to keep the kid out of jail, Jen comes up with a story about rabbits drowning in a river. The boy keeps saving the rabbits but quickly figures out that if he goes upstream, he may find the source of the rabbit problem.

On top of that, Vicky and Ben are still not on speaking terms. This saddens Ben because he feels he is messing up his host body's life. But Vicky has a change of heart and wants to correct her mistake, and with her help and knowledge of the future, Ben takes down the A.D.A. for corruption.

The boy, once released, eventually became the first college graduate in his family and turned out to be a good lawyer.

This week's Ben was funny. He was once again a woman, and this time a lesbian woman as well. To mark the distinction in Ben Song to the Defense, Ben wore classy earrings during the entire episode. And pantsuits, because no classy lawyer can leave the house without one.

But this whole episode was not about Ben. Ben once again takes a backseat to let Jen shine. This week's episode was about Jen helping Ben, and it finally shows her wonderful true colors.

Have I mentioned that there is barely any Addison in this week's episode, Ben Song to the defense? This week Jen is the hologram assisting Ben. It's interesting to see a different dynamic than Addison and Ben; we have seen a plethora of that over the course of the season.

Jen wanted to assist Ben this week and having both Jen and Addison proved too much for Quantum Computer Ziggy. So Addison took the backseat while investigating what happened to Aleyda and Vicky.

And the thing is, the dynamic between the two was quite amazing. I mean, I know Addison and Ben are lovers. And Ben at least has some awareness of his love for Addison, though I don't think he knows the full story since he barely remembers Magic. Oh, there we go. Yes, he does. But more about that later.

Jen as a character has been very underused. All we see her do is run intel ops for Team Quantum Leap and we only see the action but not much of the character. In Ben Song to the defense, my view on Jen has changed, and here is why.

In this episode, we get to see Jen's amazing character on full display. She is not so much a goody two-shoes as morally gray. I'll give you an example of that.

There was a couple on the show, and the husband was jailed. The bail was quite high too: $50,000. Jen urges Ben to have the woman go to the race track and bet on a horse. The money won was exactly $50,000, and thus the woman could post bail to free her husband.

This to me shows that Jen is willing to do something dark, gambling, in order to get something good done. This makes Jen the morally gray member of Team Quantum Leap. This is promising for the future. So far we have seen all of them being quite attuned to the light side, and Jen is the first character who isn't.

The question will be if Leaper X shows up and puts her in the same situation, what her next action is going to be? Will she screw over Team Quantum Leap in order to get something good done or side with Team Quantum Leap to get Leaper X down?

Also, Leaper X is, of course, Ramirez. As per this episode, the Quantum Reapers are officially named Leaper X by the show. So I guess we will adapt to that name and say farewell to the Quantum Reaper name.

After the story of the rabbits that needed to be saved from the river, Ben quickly figures out who Jen means and it is Magic. And I guess that after Ian, Addison, and Jen, the next episode will feature Magic heavily in order to get the gang ready for the first season finale of the show. With 3 episodes left to go, the show continues to build up momentum for the first finale.

Who knows, the writers may even use a few bits of the episodes to develop Janis Calavicci a bit more. I would regard her as the darkest member of Team Quantum Leap. Even if she's the reserve member and technically jailed, her contributions shaped Team Quantum Leap a lot, and she at least deserves a bit more credit by developing her character a little more in the final episodes of the season up until the finale.

Janis' character I would regard as the darkest one. Her dealings are very shady compared to the rest of them: blowing up houses, using cyber warfare in order to get things done, and putting odd drugs in her mother's drinks to get her father's old holo communicator.

The episode Ben Song to the defense was a good episode, but once again Ben takes a backseat and it is Jen who truly shines. And she deserved it, after a season of not even being in the backseat but, figuratively speaking, waiting outside for the car.

Read more from the original source:
Quantum Leap episode 15 Review: Ben Song to the defense - VULKK.com

Read More..

IBM Quantum System One unveiled at Cleveland Clinic – Crain’s Cleveland Business

Through their 10-year Discovery Accelerator partnership, Cleveland Clinic and IBM have unveiled the IBM-managed quantum computer, billed as the first of its kind in the world dedicated to health care research.

Installed on the Clinic's main campus, the IBM Quantum System One aims to help accelerate biomedical discoveries and is the first deployment of an onsite private sector IBM-managed quantum computer in the United States, according to a news release.

With the Clinic's expertise in health care and life sciences and IBM's expertise in technology, the two organizations each bring different skills and competencies to the partnership, said Ruoyi Zhou, director of the IBM Discovery Accelerator at Cleveland Clinic.

"We have something in common; that is innovation," Zhou said. "And so, when we work together, we identify problems that are really suitable for quantum computing and artificial intelligence."

A rapidly emerging technology, quantum computing harnesses the laws of quantum mechanics to solve problems that are currently impractical or impossible to solve on today's supercomputers. The IBM Quantum System One, which will be made available for use through collaborations at a cost, and its new computational spaces could help researchers discover medicines and treatments more quickly.

For instance, bringing a drug from discovery to a patient can take upwards of 17 to 20 years, said Dr. Serpil Erzurum, the Clinic's chief research and academic officer. This technology can potentially accelerate that timeline to just two to three years, meaning if a patient gets sick and providers know the cause, "this could design the right drug treatment for you," she said.

"And as a physician caregiver, and as you know, a human being and a mom, you want things soon, you don't want to wait for them, right?" she said. "You don't want to wait 20 years."

The Clinic and IBM are both contributing resources to the effort, with the Clinic's cost being part of its $300 million commitment to the Cleveland Innovation District, a $565 million multi-institution, public-private push to create 20,000 jobs and boost research in the city.

The Cleveland Clinic-IBM Discovery Accelerator partnership, announced in 2021, focuses on advancing the pace of biomedical research through the use of high-performance computing, artificial intelligence and quantum computing, according to the release. It serves as the technology foundation for the Clinic's Global Center for Pathogen & Human Health Research, which is part of Cleveland Innovation District.

Through the innovation district overall, the Clinic estimates 1,000 new jobs will be generated at the system by 2029 and an additional 7,500 jobs in Ohio by 2034.

An educational curriculum is being designed for participants from high school to the professional level, offering training and certification programs in data science, machine learning and quantum computing, according to the news release, which notes a "significant part" of the collaboration focuses on job creation, economy growth and educating the workforce of the future.

"We are committed to developing the community ability for digital work," Erzurum said. "If you know how to work in this digital world, your opportunities expand exponentially."

IBM and the Clinic also are hosting research symposia, seminars and workshops intended for academia, industry, government and the public in an effort to build a critical mass of computing specialists in Cleveland, according to the release.

While many universities are teaching artificial intelligence, there aren't enough of those skills in the marketplace, Zhou said. And finding quantum computing skills is very difficult.

"We're looking at how do we build a workforce by partnering with universities, local universities, to develop the quantum skills locally," Zhou said.

To help expedite discoveries in biomedical research, the Discovery Accelerator has generated multiple projects that leverage the latest in quantum computing, AI and hybrid cloud, including, according to the release: developing quantum computing pipelines to screen and optimize drugs targeted to specific proteins, improving a prediction model for cardiovascular risk after non-cardiac surgery and applying AI to search genome sequencing findings and large drug-target databases to find effective, existing drugs that could help patients with Alzheimer's and other diseases.

Accelerating innovation and discovery of new drugs and targeted treatments will "dramatically" bring down costs, Erzurum said.

"So this, to me is a big part of our precision medicine initiative, the affordability of health care and drugs, and the time to treatment, all of which are critical when you have a very terrible disease," she said. "So that is my dream. And I want to see it happen here at the Cleveland Clinic and in Cleveland, Ohio, while I'm still living. I think it'll happen."

View original post here:
IBM Quantum System One unveiled at Cleveland Clinic - Crain's Cleveland Business

Read More..