Category Archives: Quantum Computer
Quantum engineers have designed a new tool to probe nature with extreme sensitivity – Newswise
Newswise: In a paper published over the weekend in the journal Science Advances, Associate Professor Jarryd Pla and his team from the UNSW School of Electrical Engineering and Telecommunications, together with colleague Scientia Professor Andrea Morello, described a new device that can measure the spins in materials with high precision.
The spin of an electron, whether it points up or down, is a fundamental property of nature, says A/Prof. Pla. It is used in magnetic hard disks to store information, MRI machines use the spins of water molecules to create images of the inside of our bodies, and spins are even being used to build quantum computers.
Being able to detect the spins inside materials is therefore important for a whole range of applications, particularly in chemistry and biology where it can be used to understand the structure and purpose of materials, allowing us to design better chemicals, drugs and so on.
In fields of research such as chemistry, biology, physics and medicine, the tool that is used to measure spins is called a spin resonance spectrometer. Normally, commercially produced spectrometers require billions to trillions of spins to get an accurate reading, but A/Prof. Pla and his colleagues were able to measure spins of electrons in the order of thousands, meaning the new tool was about a million times more sensitive.
This is quite a feat, as there are a whole range of systems that cannot be measured with commercial tools, such as microscopic samples, two-dimensional materials and high-quality solar cells, which simply have too few spins to create a measurable signal.
The breakthrough came about almost by chance, as the team were developing a quantum memory element for a superconducting quantum computer. The objective of the memory element was to transfer quantum information from a superconducting electrical circuit to an ensemble of spins placed beneath the circuit.
We noticed that while the device didn't quite work as planned as a memory element, it was extremely good at measuring the spin ensemble, says Wyatt Vine, a lead author on the study. We found that by sending microwave power into the device as the spins emitted their signals, we could amplify the signals before they left the device. What's more, this amplification could be performed with very little added noise, almost reaching the limit set by quantum mechanics.
While other highly sensitive spectrometers using superconducting circuits had been developed in the past, they required multiple components, were incompatible with magnetic fields and had to be operated in very cold environments using expensive dilution refrigerators, which reach temperatures down to 0.01 Kelvin.
In this new development, A/Prof. Pla says he and the team managed to integrate the components on a single chip.
Our new technology integrates several important parts of the spectrometer into one device and is compatible with relatively large magnetic fields. This is important, since to measure the spins they need to be placed in a field of about 0.5 Tesla, which is ten thousand times stronger than the earth's magnetic field.
Further, our device operated at a temperature more than 10 times higher than previous demonstrations, meaning we don't need to use a dilution refrigerator.
A/Prof. Pla says the UNSW team has patented the technology with a view to potentially commercialising it, but stresses that there is still work to be done.
There is potential to package this thing up and commercialize it, which will allow other researchers to plug it into their existing commercial systems to give them a sensitivity gain.
If this new technology were properly developed, it could help chemists, biologists and medical researchers, who currently rely on tools made by large tech companies that work, but that could perform orders of magnitude better.
See the rest here:
Quantum engineers have designed a new tool to probe nature with extreme sensitivity - Newswise
What is the National Cybersecurity Strategy? A cybersecurity expert explains what it is and what the Biden administration has changed – The…
The Biden administration released its first National Cybersecurity Strategy on March 2, 2023. The last version was issued in 2018 during the Trump administration.
As the National Security Strategy does for national defense, the National Cybersecurity Strategy outlines a president's priorities regarding cybersecurity issues. The document is not a directive. Rather, it describes in general terms what the administration is most concerned about, who its major adversaries are and how it might achieve its goals through legislation or executive action. These types of strategy statements are often aspirational.
As expected, the 2023 Biden National Cybersecurity Strategy reiterates previous recommendations about how to improve American cybersecurity. It calls for improved sharing of information between the government and private sector about cybersecurity threats, vulnerabilities and risks. It prescribes coordinating cybersecurity incident response across the federal government and enhancing regulations. It describes the need to expand the federal cybersecurity workforce. It emphasizes the importance of protecting the country's critical infrastructure and federal computer systems. And it identifies China, Russia, Iran and North Korea as America's main adversaries in cyberspace.
However, as a former cybersecurity industry practitioner and current cybersecurity researcher, I think that the 2023 document incorporates some fresh ideas and perspectives that represent a more holistic approach to cybersecurity. At the same time, though, some of what is proposed may not be as helpful as envisioned.
Some of the key provisions in the current National Cybersecurity Strategy relate to the private sector, both in terms of product liability and cybersecurity insurance. It also aims to reduce the cybersecurity burden on individuals and smaller organizations. However, I believe it doesn't go far enough in fostering information-sharing or addressing the specific tactics and techniques used by attackers.
For decades, the technology industry has operated under what is known as shrink-wrap licensing. This refers to the multiple pages of legal text that customers, both large and small, routinely are forced to accept before installing or using computer products, software and services.
While much has been written about these agreements, such licenses generally have one thing in common: They ultimately protect vendors such as Microsoft or Adobe from legal consequences for any damages or costs arising from a customers use of their products, even if the vendor is at fault for producing a flawed or insecure product that affects the end user.
In a groundbreaking move, the new cybersecurity strategy says that while no product is totally secure, the administration will work with Congress and the private sector to prevent companies from being shielded from liability claims over the security of their products. These products underpin most of modern society.
Removing that legal shield is likely to encourage companies to make security a priority in their product development cycles and have a greater stake in the reliability of their products beyond the point of sale.
In another noteworthy shift, the strategy observes that end users bear too great a burden for mitigating cybersecurity risks. It states that a collaborative approach to cybersecurity and resiliency cannot rely on the constant vigilance of our smallest organizations and individual citizens. It stresses the importance of manufacturers of critical computer systems, as well as companies that operate them, in taking a greater role in improving the security of their products. It also suggests expanded regulation toward that goal may be forthcoming.
Interestingly, the strategy places great emphasis on the threat from ransomware as the most pressing cybercrime facing the U.S. at all levels of government and business. It now calls ransomware a national security threat and not simply a criminal matter.
The new strategy also directs the federal government to consider taking on some responsibility for so-called cybersecurity insurance.
Here, the administration wants to ensure that insurance companies are adequately funded to respond to claims following a significant or catastrophic cybersecurity incident. Since 2020, the market for cybersecurity-related insurance has grown nearly 75%, and organizations of all sizes consider such policies necessary.
This is understandable given how many companies and government agencies are reliant on the internet and corporate networks to conduct daily operations. By protecting, or backstopping, cybersecurity insurers, the administration hopes to prevent a major systemic financial crisis for insurers and victims during a cybersecurity incident.
However, cybersecurity insurance should not be treated as a free pass for complacency. Thankfully, insurers now often require policyholders to prove they are following best cybersecurity practices before approving a policy. This helps protect them from issuing policies that are likely to face claims arising from gross negligence by policyholders.
In addition to dealing with present concerns, the strategy also makes a strong case for ensuring the U.S. is prepared for the future. It speaks about fostering technology research that can improve or introduce cybersecurity in such fields as artificial intelligence, critical infrastructure and industrial control systems.
The strategy specifically warns that the U.S. must be prepared for a post-quantum future where emerging technologies could render existing cybersecurity controls vulnerable. This includes current encryption systems that could be broken by future quantum computers.
While the National Cybersecurity Strategy calls for continuing to expand information-sharing related to cybersecurity, it pledges to review federal classification policy to see where additional classified access to information is necessary.
The federal government already suffers from overclassification, so if anything, I believe less classification of cybersecurity information is needed to facilitate better information-sharing on this issue. It's important to reduce administrative and operational obstacles to effective and timely interaction, especially where collaborative relationships are needed between industry, academia and federal and state governments. Excessive classification is one such challenge.
Further, the strategy does not address the use of cyber tactics, techniques and procedures in influence or disinformation campaigns and other actions that might target the U.S. This omission is perhaps intentional because, although cybersecurity and influence operations are often intertwined, reference to countering influence operations could lead to partisan conflicts over freedom of speech and political activity. Ideally, the National Cybersecurity Strategy should be apolitical.
That being said, the 2023 National Cybersecurity Strategy is a balanced document. While in many ways it reiterates recommendations made since the first National Cybersecurity Strategy in 2002, it also provides some innovative ideas that could strengthen U.S. cybersecurity in meaningful ways and help modernize America's technology industry, both now and into the future.
The philosopher: A conversation with Grady Booch – InfoWorld
Grady Booch is a unique voice in computing, whose contributions encompass a wide range of interests. He introduced the Booch method, which led to his co-creation of the Unified Modeling Language. He also helped usher in the use of design patterns and agile methods and has written a large corpus of books and articles addressing software engineering and software architecture. Today, he is chief scientist for software engineering at IBM Research and is creating a documentary exploring the intersection of computing and what it means to be human at Computing: The Human Experience.
Our recent conversation touched on both practical and philosophical aspects of human-computer interaction and co-evolution, artificial intelligence, quantum machines, and Web3.
Tyson: Thanks for the chance to talk, Grady!
There's so much to cover. Let me begin by asking something "of the moment." There has been an almost cultural war between object-oriented programming and functional programming. What is your take on this?
Booch: I had the opportunity to conduct an oral history with John Backus, one of the pioneers of functional programming, in 2006 on behalf of the Computer History Museum. I asked John why functional programming didn't enter the mainstream, and his answer was perfect: Functional programming makes it easy to do hard things, he said, but functional programming makes it very difficult to do easy things.
Functional programming has a role to play: many web-centric software-intensive systems at global elastic scale are well-served by having some elements written in stateless form, and that's precisely what functional programming is good for. But remember this: that's still only a part of those systems, and furthermore, there is much, much more to the world of computing than web-centric systems at global elastic scale.
Tyson: Okay, let me leap across from the specific to the general: what is software? What is a computer? Why are these seemingly obvious things so significant?
Booch: If you were to have asked me that question at the turn of the century (the start of the 1900s, I mean) I would have said a computer is a person who computes, and as for software, I would have no idea what you meant. You see, the term computer was at first a person, usually a woman, literally someone who calculated/computed. It wasn't until we began to devise machines in the mid 1900s that we replaced the activity of those squishy organic computers with relays, vacuum tubes, and eventually transistors.
Even if we consider the Turing test, Alan had in mind the question of whether we could build a machine that duplicated the ability of a human to think. As for the term software, its etymology tells us a great deal about how astonishingly young the field of computing is. The term digital was first coined by George Stibitz in 1942, and the term software was introduced by John Tukey in 1952. Here's an easy way to distinguish the terms: when something goes wrong, hardware is the thing you kick and software is the thing you yell at.
Tyson: You said in our earlier chat that perhaps the most important outcome of our computing technology is that it compels us to examine what it means to be human. Would you continue that thought?
Booch: The story of computing is the story of humanity. This is a story of ambition, invention, creativity, vision, avarice, and serendipity, all powered by a refusal to accept the limits of our bodies and our minds. As we co-evolve with computing, the best of us and the worst of us is amplified, and along the way, we are challenged as to what it means to be intelligent, to be creative, to be conscious. We are on a journey to build computers in our own image, and that means we have to not only understand the essence of who we are, but we must also consider what makes us different.
Tyson: Babbage said, We may propose to execute, by means of machinery, the mechanical branch of these labours, reserving for pure intellect that which depends on the reasoning faculties. Where are we at in that journey?
Booch: Actually, I think his colleague, Ada Augusta, Countess of Lovelace, better understood the potential of computers than he ever did. The Analytical Engine does not occupy common ground with mere 'calculating machines,' she said. Rather, it holds a position wholly of its own. Ada recognized that the symbols manipulated by machines could mean something more than numbers. The field of computing has made astonishing progress since the time of Babbage and Lovelace and Boole, but still, we are a very young discipline, and in many ways we have just begun.
Tyson: Speaking of Babbage does lead naturally to Ada Lovelace. I notice a strong thread in your work of pointing out the sometimes hidden role women play in moving us forward. How do you think we as a society are doing on that front?
Booch: Poorly. There was a time in the earliest days of computing when women played a far larger role. Annie Jump Cannon was the lead among the Harvard Computers in the 1800s; the ENIAC was programmed mainly by five women; Grace Hopper pioneered the idea of compilers and high-order programming languages. Sadly, a variety of economic and social and political forces have reduced the number of women in the ranks of computing. A dear colleague, Mar Hicks, has written extensively on these factors. We must do better. Computing impacts individuals, communities, societies, civilizations, and as such there must be equitable representation of all voices to shape its future.
Tyson: AI, especially conversational AI, has really taken off recently. What do you think is the next phase in that story?
Booch: Remember ELIZA from the mid-1960s? This was an early natural language system that absolutely astonished the world in its ability to carry out Rogerian therapy, or at least a fair illusion of it. We've come a long way, owing to a perfect storm: the rise of abundant computational resources, the accumulation of vast lakes of data, and the discovery of algorithms for neural networks, particularly a recent architecture called a transformer. In many ways, the recent advances we have seen with systems such as ChatGPT, Bard, and (in the visual world) DALL-E and Stable Diffusion have come about by applying these three elements at scale.
The field of artificial intelligence has seen a number of vibrant springs and dismal winters over the years, but this time it seems different: there are a multitude of economically interesting use cases that are fueling the field, and so in the coming years we will see these advances weave themselves into our world. Indeed, AI already has: every time we take a photograph, search for a product to buy, interact with some computerized appliance, we are likely using AI in one way or another.
Chat systems will incrementally get better. But, that being said, we are still generations away from creating synthetic minds. In that journey, it is important that we consider not just what our machines can do, but what they do to us. As Allen Newell, one of the early pioneers of artificial intelligence, noted, computer technology offers the possibility of incorporating intelligent behavior in all the nooks and crannies of our world. With it, we could build an enchanted world. To put it somewhat poetically, software is the invisible writing that whispers the stories of possibility to our hardware, and we are the storytellers. It's up to us to decide if those stories amplify us, or diminish us.
Tyson: Quantum computing is alongside AI in terms of its revolutionary potential. Do you think we'll have a similar breakthrough in quantum computers anytime soon?
Booch: The underlying assumption of science is that the cosmos is understandable; the underlying assumption of computing is that the cosmos is computable. As such, from the lens of computing, we can imagine new worlds, but to make those things manifest, we must make programs that run on physical machines. As such, we must abide by the laws of physics, and quantum computing, at this current stage in its development, is mostly trying to find ways to work within those laws.
Two things I must mention. First, quantum computing is a bit of a misnomer: we don't store information in its quantum state for very long, we just process it. As such, I prefer the term quantum processing, not quantum computing. Second, theoretically, non-quantum computers and quantum devices are Turing equivalent. They both have the same computational potential, and each has particular advantages and efficiencies, with very different scalability, latency, resiliency, correctness, and risk. Quantum machines are particularly good at attacking what are called NP problems, problems that grow harder and harder as their size increases. As for breakthroughs, I prefer to see this as a world of steady, continuous, incremental progress, advancing toward solving some very hard problems of physics and engineering.
Tyson: Quantum computing leads me to cryptography, where, almost as a side effect, it is able to attack public-key algorithms. I get the sense you are wary of blockchain's ethics. Would you talk a bit about cryptography and Web3?
Booch: Web3 is a flaming pile of feces orbiting a giant dripping hairball. Cryptocurrencies, ones not backed by the full faith and credit of stable nation states, have only a few meaningful use cases, particularly if you are a corrupt dictator of a nation with a broken economic system, or a fraud and scammer who wants to grow their wealth at the expense of greater fools. I was one of the original signatories of a letter to Congress in 2022 for a very good reason: these technologies are inherently dangerous, they are architecturally flawed, and they introduce an attack surface that threatens economies.
Tyson: You said, I hope we will also see some normalization with regards to the expectations of large language models. Would you elaborate?
Booch: I stand with Gary Marcus, Timnit Gebru, and many others in this: large language models such as GPT and its peers are just stochastic parrots, very clever and useful mechanisms that offer the illusion of coherence but at the expense of having absolutely no degree of understanding. There are indeed useful purposes for LLMs, but at the same time, we must be cognizant of their risks and limitations.
Tyson: What do you make of transhumanism?
Booch: It's a nice word that has little utility for me other than as something people use to sell books and to write clickbait articles. That being said, let's return to an earlier theme in our interview: what it means to be human. Conscience, sentience, sapience are all exquisite consequences of the laws of physics. It is likely that the cosmos is teeming with life; it is also likely that sentient life is a rare outcome; it is also unlikely that, in the fullness of time of the cosmos, we are the only sentient beings. That being said, we, you, me, everyone reading this, are sentient beings, born of star-stuff and able to contemplate ourselves. That, for me, is enough.
Tyson: Do you think well ever see conscious machines? Or, perhaps, something that compels us to accept them as such?
Booch: My experience tells me that the mind is computable. Hence, yes, I have reason to believe that we will see synthetic minds. But not in my lifetime; or yours; or your children's; or your children's children's. Remember, also, that this will likely happen incrementally, not with a bang, and as such, we will co-evolve with these new species.
Tyson: Everyone should look at your lists of books you've read. Knowing that you've read A Universe of Consciousness gives me permission to ask: Do you hold a materialist viewpoint? (Or, falling completely into the realm of philosophy, what is consciousness?)
Booch: Let me put it this way: I have reason to believe I am conscious and sentient; I have reason to believe that you are, as well, because my theory of mind yields a consistency in our being. Reflecting Dennett's point of view, consciousness is an illusion, but it is an exquisite illusion, one that enables me to see and be seen, know and be known, love and be loved. And for me, that is enough.
See the rest here:
The philosopher: A conversation with Grady Booch - InfoWorld
Is the Indian govt developing quantum cyber-attack proof systems? – MediaNama.com
In response to a parliamentary question, the Indian government did not provide an explicit answer to whether it has developed systems capable of withstanding quantum cyber-attacks under the National Mission on Quantum Technologies and Applications. Responding to Congress MP Shashi Tharoor's written question on the matter yesterday, the Ministry of Science and Technology simply said: Yes Sir. Government is developing a proposal to initiate National Mission on Quantum Technologies & Applications with the objectives of building capabilities in Quantum Computing, Quantum Communication, Quantum Materials, Quantum Sensing and metrology. The Ministry added that the details and deliverables of the Mission, announced in the 2020-21 Budget, are yet to be finalised.

Wait, what's quantum computing? Without delving too deeply into its physics, "quantum" here refers to quantum mechanics, which is what the computer system uses to calculate an output. So, essentially, quantum computers use concepts of quantum physics and apply them to computers. The hope is that they'll help systems process things super fast. Or, as WIRED puts it, "the basic idea is to smash some barriers that limit the speed of existing computers by harnessing the counterintuitive physics of subatomic scales".

Right, so why do quantum cyberattacks matter? Right now, much of the electronic information we send online is encrypted. This means information is coded to be unintelligible until the recipient gets it and decrypts or decodes the message using a specific key. Doing this ensures that a "computer system's information can't be stolen and read by someone who wants to use it for
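The key-based encrypt-and-decrypt idea the excerpt describes can be illustrated with a deliberately insecure toy sketch (a repeating-key XOR, for illustration only; real systems use vetted ciphers, and it is the public-key schemes among them that quantum attacks threaten):

```python
def xor_with_key(data: bytes, key: bytes) -> bytes:
    """Toy 'cipher': XOR each byte with a repeating key. Insecure; illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"meet at noon"
key = b"not-a-real-key"

ciphertext = xor_with_key(message, key)    # unintelligible without the key
recovered = xor_with_key(ciphertext, key)  # applying the same key decodes it
assert recovered == message
```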
Read more:
Is the Indian govt developing quantum cyber-attack proof systems? - MediaNama.com
Quantum engineering meets nanoscale data processing: Unleashing the power of light-driven conductivity control – Phys.org
by Kosala Herath and Malin Premaratne, Phys.org
Over the past few decades, the field of data processing and transferring technology has advanced at a rapid pace. This growth can be attributed to Moore's Law, which predicts that the number of transistors on a microchip will double roughly every two years, enabling the semiconductor industry to make electronic devices smaller, faster, and more efficient.
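As a rough illustration of that doubling rule (illustrative numbers only, not figures from the article):

```python
def projected_transistor_count(initial_count: float, years: float,
                               doubling_period_years: float = 2.0) -> float:
    """Moore's-law-style projection: the count doubles roughly every two years."""
    return initial_count * 2 ** (years / doubling_period_years)

# A hypothetical 1-billion-transistor chip, projected 10 years ahead:
print(f"{projected_transistor_count(1e9, 10):.2e}")  # ~3.20e+10
```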
However, electronic interconnections have presented challenges in data transferring due to delays and thermal issues that limit capacity. In an effort to address this issue, researchers have turned to the use of optical waves instead of electronic signals. Optical waves offer significant information-carrying capacity and minimal loss, but the challenge lies in miniaturizing photonic devices as much as electronic components.
Enter plasmonics, a research area that combines microscale photonics and nanoscale electronics to overcome this limitation [1]. Using surface plasmon polaritons (SPPs) to deliver light energy between nanophotonic devices, plasmonics offers a high degree of confinement, overcoming the limitations of conventional dielectric waveguides. With plasmonics, it is possible to manipulate light at the nanoscale and create a world of possibilities for the future of data processing.
However, a significant challenge arises in simultaneously achieving longer propagation lengths and highly confined SPP modes. This is where Floquet engineering comes in. This cutting-edge technology transforms quantum material engineering through high-intensity periodic driving, allowing researchers to manipulate matter in ways previously considered impossible [2, 3].
To address this challenge, we have proposed a comprehensive theoretical framework that uses quantum electrodynamics coupled with Floquet engineering to enhance the propagation length of SPP modes [4]. By modifying the electrical and optical characteristics of metal nanostructures within plasmonic waveguides, it is possible to enhance SPP propagation lengths significantly.
It was observed that exposing a metallic system to this particular form of light (dressing field) increases the propagation length of SPP (surface plasmon polariton) modes due to an enhancement in the metal's conductivity. This can be explained by the fact that SPPs are created by the collective movements of electrons in the metal, and the metal's energy loss can be attributed to electron scattering. Dressing by the external field modifies the electron wave functions and, thereby, the scattering rates, which can improve the conductivity and increase the propagation length of SPP modes.
Finally, the plasmonic and conductive properties of popular plasmonic metals, including silver (Ag), gold (Au), copper (Cu), and aluminum (Al), were analyzed under the illumination of a particular form of light using computer simulations based on code specifically written to explore this novel observation. Based on the evaluation, several metals were found to be well suited for application due to their controllable conductivity response. This discovery could lead to the development of efficient and advanced nanoscale plasmonic data processing devices, circuits, and components in the near future.
This story is part of Science X Dialog, where researchers can report findings from their published research articles. Visit this page for information about ScienceX Dialog and how to participate.
More information: [1]. Malin Premaratne and Govind P. Agrawal, Theoretical foundations of nanoscale quantum devices, Cambridge University Press (2021). doi.org/10.1017/9781108634472
[2]. Kosala Herath and Malin Premaratne, Generalized model for the charge transport properties of dressed quantum Hall systems, Physical Review B (2022). DOI: 10.1103/PhysRevB.105.035430
[3]. Kosala Herath and Malin Premaratne, Polarization effect on dressed plasmonic waveguides, Emerging Imaging and Sensing Technologies for Security and Defence VII (2022). DOI: 10.1117/12.2635710
[4]. Kosala Herath and Malin Premaratne, Floquet engineering of dressed surface plasmon polariton modes in plasmonic waveguides, Physical Review B (2022). DOI: 10.1103/PhysRevB.106.235422
Kosala Herath is a Ph.D. candidate and a member of the Advanced Computing and Simulations Laboratory (qdresearch.net/) in the Department of Electrical and Computer Systems Engineering, Monash University, Australia. He received his B.Sc. degree in Electronic and Telecommunication Engineering (with first-class honors) from the University of Moratuwa, Sri Lanka in 2018. Currently, his research focuses on the fields of nanoplasmonics, low-dimensional electron transport, and Floquet systems. He is a member of SPIE.
Malin Premaratne earned several degrees from the University of Melbourne, including a B.Sc. in mathematics, a B.E. in electrical and electronics engineering (with first-class honors), and a PhD in 1995, 1995, and 1998, respectively. He has been leading the research program in high-performance computing applications to complex systems simulations at the Advanced Computing and Simulation Laboratory, Monash University, Clayton, since 2004. Currently, he serves as the Vice President of the Academic Board of Monash University and a Full Professor. In addition to his work at Monash University, Professor Premaratne is also a Visiting Researcher at several prestigious institutions, including the Jet Propulsion Laboratory at Caltech, the University of Melbourne, the Australian National University, the University of California Los Angeles, the University of Rochester, New York, and Oxford University. He has published more than 250 journal papers and two books and has served as an associate editor for several leading academic journals, including IEEE Photonics Technology Letters, IEEE Photonics Journal, and OSA Advances in Optics and Photonics. Professor Premaratne's contributions to the field of optics and photonics have been recognized with numerous fellowships, including the Fellow of the Optical Society of America (FOSA), the Society of Photo-Optical Instrumentation Engineers USA (FSPIE), the Institute of Physics U.K. (FInstP), the Institution of Engineering and Technology U.K. (FIET), and The Institute of Engineers Australia (FIEAust).
Journal information: Physical Review B
Read the original here:
Quantum engineering meets nanoscale data processing: Unleashing the power of light-driven conductivity control - Phys.org
Quantum Leap episode 15 Review: Ben Song to the defense – VULKK.com
Ben Song to the Defense has Jen, not Ben, shine throughout episode 15 of Quantum Leap, while Ben and Addison take a backseat.
Episode 15 has Ben Song leap into the body of a young woman once again. This time it's a public defender. Her name is Aleyda Ramirez and she is defending a young boy who was sentenced before he even went to trial.
But it is up to Ben and Team Quantum Leap in the shape of Jen to help and keep this innocent young man from jail. The police apprehended the boy after he threatened a gang member who was trying to goad his little brother into thug life.
However, since there were no witnesses, he's already been convicted before a fair trial happened. Not officially, but Assistant District Attorney Barnes threatens Ben that if he doesn't take the plea deal, he will put his client away for life.
In the meantime, Ben finds out that Aleyda has a girlfriend named Vicky, who works for the District Attorney's office. That is an ethics violation, and if people ever find out, then hundreds of cases will have to be reviewed due to conflict of interest. But the two love each other, so they are willing to take that risk. What's love without a little risk, huh?
Then there is the corrupt A.D.A. Barnes, who goes after people with ferocity and doesn't care whether or not the person is innocent. Which makes me wonder how he even got the job. Did nobody ask for an ethics test or something, to check if this person is suited for what he is doing? And Vicky is willing to turn a blind eye because she looks up to him.
This poses a problem when Ben notices a page with notes on witnesses who can exonerate Ben's client. When Ben confronts Vicky, Vicky storms off and, as a gut punch, says happy anniversary, because it was the day the two got together.
When Ben is in a rut and can't figure out how to keep the kid out of jail, Jen comes up with a story about rabbits drowning in a river. The boy keeps saving the rabbits but quickly figures out that if he goes upstream, he may find the source of the rabbit problem.
On top of that, Vicky and Ben are still not on speaking terms. This saddens Ben because he feels he is messing up his host body's life. But Vicky has a change of heart and wants to correct her mistake, and with her help and knowledge of the future, Ben takes down the ADA for corruption.
The boy, once released, eventually became the first college graduate in the family and turned out to be a good lawyer.
This week's Ben was funny. He was once again a woman, and this time a lesbian woman as well. To make the distinction in Ben Song to the Defense, Ben wore classy earrings during the entire episode, and pantsuits, because no classy lawyer can leave the house without one.
But this whole episode was not about Ben. Ben once again takes a backseat to let Jen shine. This whole week's episode was about Jen helping Ben, and it finally shows her wonderful true colors.
Have I mentioned that there is barely any Addison in this week's episode, Ben Song to the Defense? This week Jen is the hologram assisting Ben. It's interesting to see a dynamic other than Addison and Ben; we have seen a plethora of that over the course of the season.
Jen wanted to assist Ben this week, and having both Jen and Addison proved too much for the quantum computer Ziggy. So Addison took the backseat while investigating what happened to Aleyda and Vicky.
And the thing is, the dynamic between the two was quite amazing. I mean, I know Addison and Ben are lovers, and Ben at least has some awareness of his love for Addison, but I don't think he knows the full story, since he barely remembers Magic. Oh, there we go. Yes, he does. But more about that later.
Jen as a character has been very underused. All we see her do is run intel ops for Team Quantum Leap, and we only see the action but not much of the character. In Ben Song to the Defense, my view on Jen has changed, and here is why.
In this episode, we get to see Jen's amazing character on full display. She is not so much a goodie two-shoes; she's morally gray. I'll give you an example of that.
There was a couple on the show, and the husband was jailed. The bail was quite high too: $50,000. Jen urges Ben to have the woman go to the race track and bet on a horse. The money won was exactly $50,000, and thus the woman could post bail to free her husband.
This, to me, shows that Jen is willing to do something dark, gambling, in order to get something good done. This makes Jen the morally gray member of Team Quantum Leap, which is promising for the future. So far we have seen all of them being quite attuned to the light side, and Jen is the first character who isn't.
The question will be: if Leaper X shows up and puts her in the same situation, what is her next action going to be? Will she screw over Team Quantum Leap in order to get something good done, or side with them to take Leaper X down?
Also, Leaper X is, of course, Ramirez. As per this episode, the Quantum Reapers are officially named Leaper X by the show. So I guess we will adapt to that name and say farewell to the Quantum Reaper name.
After the story of the rabbits that needed to be saved from the river, Ben quickly figures out who Jen means, and it is Magic. I guess that after Ian, Addison, and Jen, the next episode will feature Magic heavily in order to get the gang ready for the first season finale of the show. With three episodes left to go, the show continues to build up momentum for the finale.
Who knows, the writers may even use a few bits of those episodes to develop Janis Calavicci a bit more. I would regard her as the darkest member of Team Quantum Leap. Even if she's the reserve member and technically jailed, her contributions shaped Team Quantum Leap a lot, and she at least deserves a bit more credit through a little more character development in the final episodes of the season leading up to the finale.
Her dealings are very shady compared to the rest of them: blowing up houses, using cyber warfare in order to get things done, and putting odd drugs in her mother's drinks to get her father's old holo communicator.
The episode Ben Song to the Defense was a good episode, but once again Ben takes a backseat and it is Jen who truly shines. And she deserved it, after a season of not even being in the backseat but, figuratively speaking, waiting for the car.
Read more from the original source:
Quantum Leap episode 15 Review: Ben Song to the defense - VULKK.com
Post-Quantum Cryptography | CSRC – NIST
The Candidates to be Standardized and Round 4 Submissions were announced July 5, 2022. NISTIR 8413, Status Report on the Third Round of the NIST Post-Quantum Cryptography Standardization Process is now available.
NIST initiated a process to solicit, evaluate, and standardize one or more quantum-resistant public-key cryptographic algorithms. Full details can be found in the Post-Quantum Cryptography Standardization page.
In recent years, there has been a substantial amount of research on quantum computers, machines that exploit quantum mechanical phenomena to solve mathematical problems that are difficult or intractable for conventional computers. If large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use. This would seriously compromise the confidentiality and integrity of digital communications on the Internet and elsewhere. The goal of post-quantum cryptography (also called quantum-resistant cryptography) is to develop cryptographic systems that are secure against both quantum and classical computers, and can interoperate with existing communications protocols and networks.
The question of when a large-scale quantum computer will be built is a complicated one. While in the past it was less clear that large quantum computers are a physical possibility, many scientists now believe it to be merely a significant engineering challenge. Some engineers even predict that within the next twenty or so years sufficiently large quantum computers will be built to break essentially all public key schemes currently in use. Historically, it has taken almost two decades to deploy our modern public key cryptography infrastructure. Therefore, regardless of whether we can estimate the exact time of the arrival of the quantum computing era, we must begin now to prepare our information security systems to be able to resist quantum computing.
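As a toy illustration of what is at stake: widely used public-key schemes such as RSA are safe today only because factoring a large modulus is classically infeasible, while a toy-sized modulus falls instantly to brute force. Shor's algorithm on a large quantum computer would make the full-sized problem similarly easy, which is the scenario post-quantum algorithms are meant to withstand (a minimal sketch, not an attack on real keys):

```python
def brute_force_factor(n: int) -> tuple[int, int]:
    """Recover p, q from a toy RSA-style modulus n = p * q by trial division.
    Feasible only for tiny n; a large quantum computer running Shor's
    algorithm could do the equivalent for real 2048-bit moduli."""
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("no nontrivial factor found")

# Toy modulus built from two small primes; real RSA keys are 2048+ bits.
print(brute_force_factor(61 * 53))  # (53, 61)
```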
Originally posted here:
Post-Quantum Cryptography | CSRC - NIST
IBM unveils its 433 qubit Osprey quantum computer – TechCrunch
IBM wants to scale up its quantum computers to over 4,000 qubits by 2025, but we're not quite there yet. For now, we have to make do with significantly smaller systems and today, IBM announced the launch of its Osprey quantum processor, which features 433 qubits, up from the 127 qubits of its 2021 Eagle processor. And with that, the slow but steady march toward a quantum processor with real-world applications continues.
The new 433 qubit Osprey processor brings us a step closer to the point where quantum computers will be used to tackle previously unsolvable problems, said Darío Gil, senior vice president, IBM, and director of Research. We are continuously scaling up and advancing our quantum technology across hardware, software and classical integration to meet the biggest challenges of our time, in conjunction with our partners and clients worldwide. This work will prove foundational for the coming era of quantum-centric supercomputing.
IBM's quantum roadmap includes two additional stages, the 1,121-qubit Condor and 1,386-qubit Flamingo processors in 2023 and 2024, before it plans to hit the 4,000-qubit stage with its Kookaburra processor in 2025. So far, the company has generally been able to make this roadmap work, but the number of qubits in a quantum processor is obviously only one part of a very large and complex puzzle, with longer coherence times and reduced noise being just as important.
Ideally, that's something developers who want to work with these machines wouldn't have to worry about, so increasingly, the tools they use are abstracting the hardware away for them. With the new version of its Qiskit Runtime, for example, developers can now trade speed for reduced error count.
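As a hedged sketch of what that trade-off can look like in code: the snippet below assumes the qiskit_ibm_runtime primitives interface as documented around the time of this announcement. The Options fields (resilience_level, optimization_level), the Session and Estimator classes, and the backend name are assumptions that may differ in current releases, so treat this as illustrative rather than definitive.

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import SparsePauliOp
from qiskit_ibm_runtime import QiskitRuntimeService, Session, Estimator, Options

# Assumes saved IBM Quantum credentials; the backend name is illustrative.
service = QiskitRuntimeService()

options = Options()
options.resilience_level = 2    # more error mitigation, slower execution
options.optimization_level = 3  # heavier circuit optimization before running

# A simple Bell-state circuit and a ZZ observable to estimate.
bell = QuantumCircuit(2)
bell.h(0)
bell.cx(0, 1)
observable = SparsePauliOp("ZZ")

with Session(service=service, backend="ibmq_qasm_simulator") as session:
    estimator = Estimator(session=session, options=options)
    job = estimator.run(circuits=bell, observables=observable)
    print(job.result().values)  # error-mitigated expectation value(s)
```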
The company also today detailed its Quantum System Two, basically IBM's quantum mainframe, which will be able to house multiple quantum processors and integrate them into a single system with high-speed communication links. The idea here is to launch this system by the end of 2023.
Excerpt from:
IBM unveils its 433 qubit Osprey quantum computer – TechCrunch
5 Essential Hardware Components of a Quantum Computer | Quantum …
Read more:
5 Essential Hardware Components of a Quantum Computer | Quantum ...
Physicists Got a Quantum Computer to Work by Blasting It With the …
The Quantinuum quantum computer.
Quantinuum's quantum computer, which was used in the recent discovery.
A team of physicists say they managed to create a new phase of matter by shooting laser pulses patterned on the Fibonacci sequence at a quantum computer in Colorado. The matter phase relies on a quirk of the Fibonacci sequence to remain in a quantum state for longer.
Just as ordinary matter can be in a solid, liquid, gas, or superheated plasmic phase (or state), quantum materials also have phases. The phase refers to how the matter is structured on an atomic levelthe arrangement of its atoms or its electrons, for example. Several years ago, physicists discovered a quantum supersolid, and last year, a team confirmed the existence of quantum spin liquids, a long-suspected phase of quantum matter, in a simulator. The recent team thinks theyve discovered another new phase.
Quantum bits, or qubits, are like ordinary computer bits in that their values can be 0 or 1, but they can also be 0 or 1 simultaneously, a state of ambiguity that allows the computers to consider many possible solutions to a problem much faster than an ordinary computer. Quantum computers should eventually be able to solve problems that classical computers can't.
Qubits are often atoms; in the recent case, the researchers used 10 ytterbium ions, which were controlled by electric fields and manipulated using laser pulses. When multiple qubits' states can be described in relation to one another, the qubits are considered entangled. Quantum entanglement is a delicate agreement between multiple qubits in a system, and the agreement is dissolved the moment any one of those bits' values is certain. At that moment, the system decoheres, and the quantum operation falls apart.
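A minimal numerical sketch of those two ideas, superposition and entanglement, using plain state vectors (a two-qubit toy, not a model of the trapped-ion hardware):

```python
import numpy as np

# Single-qubit basis states |0> and |1> as column vectors.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Superposition: the qubit is "0 and 1 at once" until measured.
plus = (zero + one) / np.sqrt(2)
print(np.abs(plus) ** 2)   # [0.5 0.5] -> equal chance of measuring 0 or 1

# An entangled Bell state (|00> + |11>) / sqrt(2): measuring either
# qubit immediately fixes the value of the other.
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)
print(np.abs(bell) ** 2)   # [0.5 0.  0.  0.5] over outcomes 00, 01, 10, 11
```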
A big challenge of quantum computing is maintaining the quantum state of qubits. The slightest fluctuations in temperature, vibrations, or electromagnetic fields can cause the supersensitive qubits to decohere and their calculations to fall apart. Since the longer the qubits stay quantum, the more you can get done, making computers quantum states persist for as long as possible is a crucial step for the field.
In the recent research, pulsing a laser periodically at the 10 ytterbium qubits kept them in a quantum state, meaning entangled, for 1.5 seconds. But when the researchers pulsed the lasers in the pattern of the Fibonacci sequence, they found that the qubits on the edge of the system remained in a quantum state for about 5.5 seconds, the entire length of the experiment (the qubits could have remained in a quantum state for longer, but the team ended the experiment at the 5.5-second mark). Their research was published this summer in Nature.
You can think of the Fibonacci sequence laser pulses as two frequencies that never overlap. That makes the pulses a quasicrystal: a pattern that has order, but no periodicity.
The key result in my mind was showing the difference between these two different ways to engineer these quantum states and how one was better at protecting it from errors than the other, said study co-author Justin Bohnet, a quantum engineer at Quantinuum, the company whose computer was used in the recent experiment.
The Fibonacci sequence is a numeric pattern in which each number is the sum of the two previous numbers (so 1, 1, 2, 3, 5, 8, 13, and so on). Its history goes back over 2,000 years and is connected to the so-called golden ratio. Now, the unique series may have quantum implications.
It turns out that if you engineer laser pulses in the correct way, your quantum system can have symmetries that come from time translation, said Philipp Dumitrescu, the paper's lead author and a quantum physicist who conducted the work while at the Flatiron Institute. A time-translation symmetry means that an experiment will yield the same result, regardless of whether it takes place today, tomorrow, or 100 years from now.
What we realized is that by using quasi-periodic sequences based on the Fibonacci pattern, you can have the system behave as if there are two distinct directions of time, Dumitrescu added.
Shooting the qubits with laser pulses in a periodic (a simple A-B-A-B) pattern didn't prolong the system's quantum state. But by pulsing the laser in a Fibonacci sequence (A-AB-ABA-ABAAB, and so on), the researchers gave the qubits a non-repeating, or quasi-periodic, pattern.
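A quick sketch of how such a quasi-periodic A/B schedule can be generated, following the concatenation rule the article describes (a hypothetical helper, not the researchers' control code):

```python
def fibonacci_pulse_pattern(steps: int) -> str:
    """Each term is the previous term followed by the one before it:
    A, AB, ABA, ABAAB, ABAABABA, ... (the sequence never repeats periodically)."""
    prev, curr = "A", "AB"
    for _ in range(steps):
        prev, curr = curr, curr + prev
    return curr

for n in range(4):
    print(fibonacci_pulse_pattern(n))
# AB
# ABA
# ABAAB
# ABAABABA
```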
It's similar to the quasicrystals from the Trinity nuclear test site, but instead of being a three-dimensional quasicrystal, the physicists made a quasicrystal in time. In both cases, symmetries that exist at higher dimensions can be projected in a lower dimension, like the tessellated patterns in a two-dimensional Penrose tiling.
With this quasi-periodic sequence, there's a complicated evolution that cancels out all the errors that live on the edge, Dumitrescu said in a Simons Foundation release. By on the edge, he's referring to the qubits farthest from the center of their configuration at any one time. Because of that, the edge stays quantum-mechanically coherent much, much longer than you'd expect. The Fibonacci-pattern laser pulses made the edge qubits more robust.
More robust, longer-lived quantum systems are a vital need for the future of quantum computing. If it takes shooting qubits with a very specific mathematical rhythm of laser pulses to keep a quantum computer in an entangled state, then physicists had better start blasting.
Read the rest here:
Physicists Got a Quantum Computer to Work by Blasting It With the ...