
Expansion project to grow computer science learning, research at Algoma University – Northern Ontario Business

FedNor announces $1.98 million for Sault Ste. Marie institution

A major expansion of Algoma University's School of Computer Science and Technology will result in modernized labs for students, expanded capacity for increased enrolment, and new applied research partnerships with industry.

Although the project has been in the works for some time, the arrival of COVID-19 had put it in jeopardy, noted Algoma's president, Asima Vezina. But funding of $1.98 million from FedNor has meant that the project can move ahead.

"What we're looking at is the development of what will be a critical ICT (information and communications technology) ecosystem in the North," Vezina said during an online announcement on Jan. 26.

"We want to contribute to building that ecosystem to ensure that our region and our community will have the talent and the graduates that are going to be required to compete globally in what is being coined as the economy of the future."

The funds provided by FedNor will be used to expand the physical space and enable the school to increase program capacity.

Space in its programming is currently tapped out, Vezina noted, and yet graduates have close to a 100 per cent employment rate, putting them in high demand.

"One of the things that the School of Computer Science has made very clear is they want to create spaces that really produce creative, innovative thinkers in our graduates," she said.

"They want us to be partnering with industry, with the community to help solve real problems, with our students at the forefront."

Construction on the physical space is currently underway, and it's expected to be ready to welcome students in September 2021.


Dr. Simon Xu, the school's director, said they'll now have additional capacity in teaching and tutorial labs, as well as a unique gaming lab and spaces for student collaboration.

Programming is currently offered in the Sault and at the school's Brampton campus, but a near-term goal is to also offer it in Timmins.

Creating space for high-end computers additionally means that Algoma will be able to support programming at the bachelor's, master's and certificate levels, Xu noted.

He also expects to see an increased ability to embark on more research projects.

"In addition to delivering an innovative, cutting-edge computer science program, our faculty are engaged in research in a number of growing areas within the ICT sector, such as computer interface, wireless networking, computer gaming, robotics, and software evolution," Xu said.

"I believe this project will allow faculty and students to expand current research and focus on the new research areas."

Vezina said this project is in line with Algoma's strategic plan, which has a goal to increase enrolment to 3,000 students within five years.

That goal became more precarious after COVID-19 hit, with the school predicting a 20 to 30 per cent decline in enrolment, she said.

To overcome that challenge, she said, "it's going to be critical that we are driven by vibrant and innovative programming."


Tech 24 – Welcome to the quantum era – FRANCE 24

Issued on: 25/01/2021 - 13:19 | Modified: 25/01/2021 - 14:18

The first quantum revolution gave us lasers and transistors, while the second ushered in MRIs and GPS. But the technology still holds much more promise for the future. We tell you why quantum computing is becoming such a strategic sector.

Quantum physics constitutes a huge change in how one understands the world and conceives of reality. There is a shift from the intuitive, straightforward classical paradigm to the quantum world, which describes much more complex, counterintuitive and amazing phenomena. In this edition, we attempt to explain the fundamental mechanism of quantum physics, a demonstration of how little we actually know about our world.

We dig deeper into the prospect of quantum computers with Eleni Diamanti, a senior researcher at LIP6 Sorbonne. She tells us how much this technology is set to revolutionise certain sectors like communications, medtech and the Internet of Things, plus how nations and companies are now engaged in an arms race for quantum supremacy.

And in Test 24, we take a look at the French startup Vaonis' latest device, Vespera, a perfect hybrid between a smart telescope and a camera that picked up the best innovation award at this year's CES trade show.


IBM's top executive says quantum computers will never reign supreme over classical ones – The Hindu


Crunching numbers fast and at scale has been at the centre of computing technology. In the past few decades, a new type of computing has garnered significant interest. Quantum computers have been in development since the 1980s. They use properties of quantum physics to solve complex problems that can't be solved by classical computers.

Companies like IBM and Google have been continuously building and refining their quantum hardware. Simultaneously, several researchers have also been exploring new areas where quantum computers can deliver exponential change.

In the context of advances in quantum technologies, The Hindu caught up with IBM Research's Director Gargi Dasgupta.

Dasgupta noted that quantum computers complement traditional computing machines, and said the notion that quantum computers will take over classical computers is not true.

"Quantum computers are not supreme against classical computers because of a laboratory experiment designed to essentially [and almost certainly exclusively] implement one very specific quantum sampling procedure with no practical applications," Dasgupta said.


For quantum computers to be widely used, and more importantly, have a positive impact, it is imperative to build programmable quantum computing systems that can implement a wide range of algorithms and programmes.

Practical applications alone will help researchers use both quantum and classical systems in concert for discovery in science and to create commercial value in business.

To maximise the potential of quantum computers, the industry must solve challenges in cryogenics, production, and the behaviour of materials at very low temperatures. This is one of the reasons why IBM built its "super-fridge" to house Condor, Dasgupta explained.

Quantum processors require special conditions to operate, and they must be kept at near-absolute zero; IBM's quantum chips, for example, are kept at 15 mK. The deep complexity and the need for specialised cryogenics is why at least IBM's quantum computers are accessible via the cloud, and will be for the foreseeable future, noted Dasgupta, who is also IBM's CTO for the South Asia region.

Quantum computing in India

Dasgupta said that interest in quantum computing has spiked in India, as IBM saw many exceptional participants from the country at its global and virtual events. The list included academicians and professors, who all displayed great interest in quantum computing.

In a blog published last year, IBM researchers noted that India committed 80 billion rupees to quantum technology as part of its National Mission on Quantum Technologies and Applications. They believe it's a great time to be doing quantum physics since the government and people are serious as well as excited about it.


Quantum computing is expanding to multiple industries such as banking, capital markets, insurance, automotive, aerospace, and energy.

"In years to come, the breadth and depth of the industries leveraging quantum will continue to grow," Dasgupta noted.

Industries that depend on advances in materials science will start to investigate quantum computing. For instance, Mitsubishi and ExxonMobil are using quantum technology to develop more accurate chemistry simulation techniques in energy technologies.

Additionally, Dasgupta said carmaker Daimler is working with IBM scientists to explore how quantum computing can be used to advance the next generation of EV batteries.

Exponential problems, like those found in molecular simulation in chemistry and optimisation in finance, as well as machine learning, continue to remain intractable for classical computers.

Quantum-safe cryptography

As researchers make advances in quantum computers, some cryptocurrency enthusiasts fear that quantum computers could break security encryption. To mitigate the risks associated with cryptography services, quantum-safe cryptography was introduced.

For instance, IBM offers Quantum Risk Assessment, along with what it claims is the world's first quantum-computing-safe enterprise-class tape. It also uses lattice-based cryptography to hide data inside complex algebraic structures called lattices. Difficult math problems are useful for cryptographers, who can use that intractability to protect information against quantum computers' cracking techniques.
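To make "hiding data inside lattices" concrete, here is a toy Regev-style learning-with-errors (LWE) scheme. This is a sketch for illustration only: the parameters are deliberately tiny and completely insecure, and this is not IBM's implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
q, n, m = 3329, 16, 64   # toy modulus and dimensions: far too small to be secure

# Key generation: secret s, public samples (A, b = A s + e mod q) with small noise e.
s = rng.integers(q, size=n)
A = rng.integers(q, size=(m, n))
e = rng.integers(-2, 3, size=m)       # the small error is what makes (A, b) hard to invert
b = (A @ s + e) % q

def encrypt(bit: int):
    """Sum a random subset of the public samples; shift by q/2 to encode a 1."""
    subset = rng.integers(2, size=m).astype(bool)
    a = A[subset].sum(axis=0) % q
    c = (b[subset].sum() + bit * (q // 2)) % q
    return a, c

def decrypt(a, c) -> int:
    """The residue c - a.s is near 0 for an encrypted 0, near q/2 for a 1."""
    r = (c - a @ s) % q
    return int(min(r, q - r) > q // 4)

for bit in (0, 1, 1, 0):
    assert decrypt(*encrypt(bit)) == bit
print("toy lattice scheme round-trips correctly")
```

The accumulated noise stays far below q/4 at these sizes, so decryption with the secret always lands on the right side of the threshold, while recovering s from (A, b) alone requires solving the (believed hard) LWE problem.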

According to Dasgupta, even the National Institute of Standards and Technology's (NIST) latest list of candidate quantum-safe cryptography standards includes several candidates based on lattice cryptography.


Lattice-based cryptography is also the core of another encryption technology called Fully Homomorphic Encryption (FHE). This could make it possible to perform calculations on data without ever seeing the sensitive data or exposing it to hackers.

"Enterprises from banks to insurers can safely outsource the task of running predictions to an untrusted environment without the risk of leaking sensitive data," Dasgupta said.

Last year, IBM said it will unveil a 1,121-qubit quantum computer by 2023. The qubit is the basic unit of a quantum computer. Prior to the launch, IBM will release the 433-qubit Osprey processor. It will also debut the 121-qubit Eagle chip to reduce qubit errors and scale the number of qubits needed to reach Quantum Advantage.

"The 1,121-qubit Condor chip is the inflection point for lower-noise qubits. By 2023, its physically smaller qubits, with on-chip isolators and signal amplifiers and multiple nodes, will have scaled to deliver the capability of Quantum Advantage," Dasgupta said.


Record-Breaking Source for Single Photons Developed That Can Produce Billions of Quantum Particles per Second – SciTechDaily

The new single-photon source is based on excitation of a quantum dot (shown as a bulge on the bottom left), which then emits photons. A micro-cavity ensures that the photons are guided into an optical fiber and emerge at its end. Credit: University of Basel, Department of Physics

Researchers at the University of Basel and Ruhr University Bochum have developed a source of single photons that can produce billions of these quantum particles per second. With its record-breaking efficiency, the photon source represents a new and powerful building-block for quantum technologies.

Quantum cryptography promises absolutely secure communications. A key component here is strings of single photons. Information can be stored in the quantum states of these light particles and transmitted over long distances. In the future, remote quantum processors will communicate with each other via single photons. And perhaps the processor itself will use photons as quantum bits for computing.

A basic prerequisite for such applications, however, is an efficient source of single photons. A research team led by Professor Richard Warburton, Natasha Tomm and Dr. Alisa Javadi from the University of Basel, together with colleagues from Bochum, now reports in the journal Nature Nanotechnology on the development of a single-photon source that significantly surpasses previously known systems in terms of efficiency.

Each photon is created by exciting a single artificial atom (a quantum dot) inside a semiconductor. Usually, these photons leave the quantum dot in all possible directions and thus a large fraction is lost. In the photon source now presented, the researchers have solved this problem by positioning the quantum dot inside a funnel to send all photons in a specific direction.

The funnel is a novel micro-cavity that represents the real innovation of the research team: the micro-cavity captures almost all of the photons and then directs them into an optical fiber. The photons, each about two centimeters long, emerge at the end of the fiber.

The efficiency of the entire system (that is, the probability that excitation of the quantum dot actually results in a usable photon) is 57 percent, more than double that of previous single-photon sources. "This is a really special moment," explains lead author Richard Warburton. "We've known for a year or two what's possible in principle. Now we've succeeded in putting our ideas into practice."

The increase in efficiency has significant consequences, Warburton adds: increasing the efficiency of single photon creation by a factor of two adds up to an overall improvement of a factor of one million for a string of, say, 20 photons. "In the future, we'd like to make our single-photon source even better: we'd like to simplify it and pursue some of its myriad applications in quantum cryptography, quantum computing and other technologies."
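A quick sanity check on that arithmetic: if a single photon arrives with probability ε, a string of 20 independent photons arrives with probability ε²⁰, so doubling ε multiplies the success rate of the whole string by

\[
2^{20} = 1{,}048{,}576 \approx 10^{6}.
\]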

Reference: "A bright and fast source of coherent single photons" by Natasha Tomm, Alisa Javadi, Nadia Olympia Antoniadis, Daniel Najer, Matthias Christian Löbl, Alexander Rolf Korsch, Rüdiger Schott, Sascha René Valentin, Andreas Dirk Wieck, Arne Ludwig and Richard John Warburton, 28 January 2021, Nature Nanotechnology. DOI: 10.1038/s41565-020-00831-x

The project was funded by the Swiss National Science Foundation, the National Center of Competence in Research Quantum Science and Technology (NCCR QSIT), and the European Union under the Horizon 2020 programme.


Physicists Are Reinventing the Laser – Gizmodo

Illustration: Benjamin Currie/Gizmodo

In the 1950s, when physicists were racing to invent the first laser, they found that the rules of quantum mechanics restricted how pure the color of their light could be. Since then, physicists and engineers have always built lasers with those restrictions in mind. But new theoretical research from two independent groups of physicists indicates that nature is more lax than previously thought. The findings could lead to improved, more monochromatic lasers for applications such as quantum computing, which the researchers illustrate in two proposed laser designs.

The work overthrows 60 years of understanding about what limits lasers, said physicist Howard Wiseman of Griffith University in Australia, whose group published their work in Nature Physics last October.

A laser, in essence, is a megaphone for light. The word itself, originally an acronym, reflects this function: light amplification by stimulated emission of radiation. Send in a photon of the right frequency, and the laser makes copies of it, multiplying the original signal.

These photon clones exit the laser in sync with each other, traveling "in phase," as the experts call it. You can think of it this way: each photon is a wave, with its crest and trough lined up with its neighbor's, marching together in lock-step out of the laser. This contrasts with most other light sources, such as your reading lamp or even the Sun, which emit photons that disperse randomly.

The longer photons stay in sync, the more monochromatic the light. The color of a light source corresponds to the wavelength of its photons, with green light spanning roughly the 500 to 550 nanometer range, for example. For multiple photons to stay in sync a long time, their wavelengths must line up very preciselymeaning the photons need to be as close to one color as possible.


This synchrony of laser photons, known as temporal coherence, is one of the device's most useful properties. Many technologies make use of laser light's ridiculously fast and steady rhythm, its wave pattern repeating at hundreds of trillions of times a second for visible lasers. For example, this property underpins the world's most precise timekeeping devices, known as optical lattice clocks.
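That rhythm is just the optical frequency ν = c/λ. Taking 532 nm as an illustrative green wavelength (my choice of number, not the article's):

\[
\nu = \frac{c}{\lambda} = \frac{3.0\times 10^{8}\ \mathrm{m/s}}{532\times 10^{-9}\ \mathrm{m}} \approx 5.6\times 10^{14}\ \mathrm{Hz},
\]

roughly 560 trillion oscillations per second, consistent with "hundreds of trillions."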

But photons gradually lose sync after they leave the laser; how long they stick together is known as the laser's coherence time. In 1958, physicists Arthur Schawlow and Charles Townes estimated the coherence time of a perfect laser. (This is a common physicist design strategy: consider the most ideal version of something before building a far more lacking real-world device.) They found an equation thought to represent an ultimate coherence time limit for lasers, set by the laws of physics. Physicists refer to this as the Schawlow-Townes limit.
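The article does not reproduce the equation itself; one commonly quoted form of the Schawlow-Townes linewidth (numerical prefactors vary between references) is

\[
\Delta\nu_{\mathrm{laser}} \approx \frac{\pi h \nu \,(\Delta\nu_c)^{2}}{P_{\mathrm{out}}},
\]

where ν is the lasing frequency, Δν_c the cavity linewidth and P_out the output power. The coherence time then scales as t_coh ∼ 1/(π Δν_laser), so the narrower the linewidth, the longer the photons stay in sync.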

The two new papers find that the Schawlow-Townes limit is not the ultimate limit. "In principle, it should be possible to build lasers which are significantly more coherent," said physicist David Pekker of the University of Pittsburgh, who led the other group. Their paper, currently under peer review, is posted as a pre-print on arXiv.

Both groups argue that the Schawlow-Townes limit rests on assumptions about the laser that are no longer true. Schawlow and Townes basically thought of the laser as a hollow box, in which photons multiply and leave at a rate proportional to the amount of light inside the box. Put another way, the photons flow out of Schawlow and Townes's laser like water drains from a hole in a barrel. Water flows faster when the barrel is fuller, and vice versa.

But Wiseman and Pekker both found that if you place a valve on the laser to control the rate of the photon flow, you can actually make a laser coherent for much longer than the Schawlow-Townes limit. Wiseman's paper takes this a step further. Allowing for these photon-controlling valves, his team re-estimates the coherence time limit for the perfect laser. "We show that ours is the ultimate quantum limit," said Wiseman, meaning the true physical limit dictated by quantum mechanics.

Schawlow and Townes's estimate, while not the fundamental restriction on lasers physicists originally thought, was reasonable for its time, said Wiseman. No one had any means for precisely controlling the flow of light out of a laser in the way that Wiseman and Pekker propose. But today's lasers are a different story. Physicists can now control light with a multitude of devices developed for the budding quantum computing industry.

Pekker has teamed up with physicist Michael Hatridge, also of the University of Pittsburgh, to bring the new laser design to life. Hatridge's expertise involves building circuits out of superconducting wire for storing and controlling microwave-frequency photons. They plan to build a microwave-emitting laser, known as a maser, for programming qubits inside a quantum computer made of superconducting circuits. Though building this new maser will take years of work and troubleshooting, Hatridge said they have all the tools and knowledge to make it possible. "That's why we're excited about it, because it's just another engineering project," Hatridge said.

Wiseman is looking for collaborators to build his design, also a maser. "I would really, really like this to happen, but I recognize it's a long-term goal," he said.

The designs are "completely feasible," said physicist Steven Touzard of the National University of Singapore, who was not involved in either of the new papers. However, Pekker and Wiseman's work may not directly lead to useful commercial lasers, according to Touzard. He pointed out that builders of lasers do not commonly use the Schawlow-Townes limit to direct their designs. So overturning the limit could be more of a theoretical advancement than an engineering one, he said.

Curiously, the two new designs also contradict another conventional wisdom about lasers. The devices do not produce light via so-called stimulated emission, which makes up the "s" and "e" in the acronym "laser." Stimulated emission is a type of interaction between light and matter, in which a photon impinges upon an atom and stimulates the atom to emit an identical photon. If we imagine a laser as a box of light, as before, a laser that amplifies light using stimulated emission multiplies the signal proportionally to the amount of light already in the box. Another type of laser invented in 2012, known as a superradiant laser, also does not use stimulated emission to amplify light, according to Touzard.

The idea of a laser has outgrown its name. It is no longer exclusively light amplification by stimulated emission of radiation.

Of course, many such examples exist in the English language. The change in meaning is known as semantic shift and is common wherever new technology is involved, according to linguist Micha Elsner of the Ohio State University. "Ships still sail across the ocean, even when no actual sails are involved," Elsner said in an email. "You can still dial someone's number even though your phone doesn't have a dial."

Even though a word's etymology (its origin) certainly gives it a starting point, "it does not determine its destiny forever going forward," linguist Brian Joseph of the Ohio State University said in an email.

As Cold War goals transitioned into 21st century ones, lasers have evolved, too. They've been around long enough to integrate into nearly all aspects of modern life: they can correct human vision, read our grocery barcodes, etch computer chips, transmit video files from the Moon, help steer self-driving cars, and set the mood at psychedelic ragers. And now, the laser could be reinvented again. A 60-year-old device remains a symbol of a sci-fi future.


Insiders say Comedy Central’s top creative executives tokenized employees of color and fostered an environment – Business Insider India

On January 26, 2020, Kobe Bryant and his 13-year-old daughter, Gianna, died in a helicopter crash. The next day, a Black assistant for Comedy Central's in-house creative team was still reeling from the news.

"I could barely get on the subway," she said. "I was crying and honestly really shouldn't have gone in to work that day."

"Why the f--- would we do that? Isn't that BET's job?" the former assistant said she recalled the VP who was leading the meeting saying, referring to the Black Entertainment Television channel.


The assistant discussed the VP's comment with two colleagues, both of whom confirmed the conversations to Insider. The assistant said she didn't formally report the incident because she was worried it would jeopardize her career and put her at odds with executives at the company.

Despite the network's progressive content, current and former employees for Comedy Central's creative team told Insider that the company culture was not without discriminatory behavior. The insiders said that top creative executives at the New York headquarters sometimes tokenized employees of color and fostered a culture rampant with microaggressions. Of the 17 past and current employees Insider spoke to, 15 said they either witnessed or experienced inappropriate behavior they believe was influenced by colleagues' race.


How Universes Might Bubble Up and Collide – WIRED

What lies beyond all we can see? The question may seem unanswerable. Nevertheless, some cosmologists have a response: our universe is a swelling bubble. Outside it, more bubble universes exist, all immersed in an eternally expanding and energized sea: the multiverse.

The idea is polarizing. Some physicists embrace the multiverse to explain why our bubble looks so special (only certain bubbles can host life), while others reject the theory for making no testable predictions (since it predicts all conceivable universes). But some researchers expect that they just haven't been clever enough to work out the precise consequences of the theory yet.

Now, various teams are developing new ways to infer exactly how the multiverse bubbles and what happens when those bubble universes collide.

"It's a long shot," said Jonathan Braden, a cosmologist at the University of Toronto who is involved in the effort, but, he said, it's "a search for evidence for something you thought you could never test."

The multiverse hypothesis sprang from efforts to understand our own universe's birth. In the large-scale structure of the universe, theorists see signs of an explosive growth spurt during the cosmos's infancy. In the early 1980s, as physicists investigated how space might have started (and stopped) inflating, an unsettling picture emerged. The researchers realized that while space may have stopped inflating here (in our bubble universe) and there (in other bubbles), quantum effects should continue to inflate most of space, an idea known as eternal inflation.

The difference between bubble universes and their surroundings comes down to the energy of space itself. When space is as empty as possible and can't possibly lose more energy, it exists in what physicists call a true vacuum state. Think of a ball lying on the floor: it can't fall any further. But systems can also have false vacuum states. Imagine a ball in a bowl on a table. The ball can roll around a bit while more or less staying put. But a large enough jolt will land it on the floor, in the true vacuum.

In the cosmological context, space can get similarly stuck in a false vacuum state. A speck of false vacuum will occasionally relax into true vacuum (likely through a random quantum event), and this true vacuum will balloon outward as a swelling bubble, feasting on the false vacuum's excess energy, in a process called false vacuum decay. It's this process that may have started our cosmos with a bang. "A vacuum bubble could have been the first event in the history of our universe," said Hiranya Peiris, a cosmologist at University College London.
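In field-theory language, the bowl-and-floor picture corresponds to a potential with two unequal minima. A generic toy example (mine, not drawn from the paper discussed here) is the tilted double well

\[
V(\phi) = \lambda\,(\phi^{2} - v^{2})^{2} + \epsilon\,\phi, \qquad 0 < \epsilon \ll \lambda v^{3},
\]

whose shallower minimum near φ ≈ +v plays the role of the false vacuum, whose deeper minimum near φ ≈ -v is the true vacuum, and whose energy difference ΔV ≈ 2εv is the "excess energy" an expanding bubble wall feeds on.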

But physicists struggle mightily to predict how vacuum bubbles behave. A bubble's future depends on countless minute details that add up. Bubbles also change rapidly (their walls approach the speed of light as they fly outward) and feature quantum mechanical randomness and waviness. Different assumptions about these processes give conflicting predictions, with no way to tell which ones might resemble reality. "It's as though you've taken a lot of things that are just very hard for physicists to deal with and mushed them all together and said, 'Go ahead and figure out what's going on,'" Braden said.

Since they can't prod actual vacuum bubbles in the multiverse, physicists have sought digital and physical analogs of them.

One group recently coaxed vacuum bubble-like behavior out of a simple simulation. The researchers, including John Preskill, a prominent theoretical physicist at the California Institute of Technology, started with "the [most] baby version of this problem that you can think of," as co-author Ashley Milsted put it: a line of about 1,000 digital arrows that could point up or down. The place where a string of mainly up arrows met a string of largely down arrows marked a bubble wall, and by flipping arrows, the researchers could make bubble walls move and collide. In certain circumstances, this model perfectly mimics the behavior of more complicated systems in nature. The researchers hoped to use it to simulate false vacuum decay and bubble collisions.

At first the simple setup didn't act realistically. When bubble walls crashed together, they rebounded perfectly, with none of the expected intricate reverberations or outflows of particles (in the form of flipped arrows rippling down the line). But after adding some mathematical flourishes, the team saw colliding walls that spewed out energetic particles, with more particles appearing as the collisions grew more violent.
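As a rough classical analog of that arrow-chain picture (my sketch, not the team's actual quantum model), a biased 1D Ising chain under Metropolis updates shows a true-vacuum bubble growing as its walls drift outward:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000        # arrows (spins): +1 = "up" (false vacuum), -1 = "down" (true vacuum)
J = 1.0         # neighbours prefer to align, which makes domain walls costly
h = 0.05        # small bias: the -1 state has slightly lower energy
beta = 2.0      # inverse temperature for the Metropolis updates

# Start in the false vacuum with one small true-vacuum bubble in the middle.
s = np.ones(N, dtype=int)
s[480:520] = -1

def local_energy(i):
    """All energy terms that involve spin i (open boundary conditions)."""
    e = h * s[i]
    if i > 0:
        e -= J * s[i] * s[i - 1]
    if i < N - 1:
        e -= J * s[i] * s[i + 1]
    return e

for _ in range(500_000):
    i = rng.integers(N)
    dE = -2.0 * local_energy(i)          # energy change if spin i is flipped
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        s[i] = -s[i]

# The walls do a biased random walk, so the bubble expands; rare flips deep in
# the false vacuum are the lattice analog of new bubbles nucleating.
print("true-vacuum fraction:", np.mean(s == -1))   # starts at 0.04 and grows
```

Flipping a spin adjacent to a wall costs almost nothing, so walls wander freely, while the small bias h makes outward moves slightly more likely than inward ones: a cartoon of a bubble feeding on the false vacuum's excess energy.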


Copperizing the Complexity of Superconductivity – Newswise

From the perspective of a materials science physicist, a keen interest in copper oxides makes sense. The metallic compounds are versatile in their usefulness, from coins and antibacterials to spin dynamics and high-temperature superconductivity.

For a condensed matter expert like UC San Diego's Alex Frano, the goal of high-temperature superconductivity is to move charge currents through a material without resistance and energy loss at easily attainable temperatures. This efficiency is necessary for low-power electronics and quantum technology. But the ultra-low temperatures required for traditional materials to be superconducting (hundreds of degrees below zero) are a major obstacle.

While copper oxides are the materials with the highest superconducting transition temperatures under normal conditions, physicists aren't sure why. Frano, whose passion for physics is grounded in high-temperature superconductivity, believes this is a central problem. So while studying copper oxides recently, he and a group of research collaborators from the Max Planck Institute, Yale University, the University of British Columbia and UC Davis may have stumbled upon a major clue about how these metallic materials work. Their findings are published in Nature Communications and could help revolutionize our understanding of these superconductive materials.

Copper involves electrons engaging with other electrons through what's called the electrostatic Coulomb interaction (when like charges repel and opposites attract). A ground state can emerge from this interaction and form a charge density (the amount of electrical charge per unit of length or surface area) which can modulate like waves in a sand dune. Running a sophisticated experiment using resonant inelastic X-ray scattering (RIXS), a method that investigates the electronic structure of a material, the researchers observed how X-rays scattered off their sample, making their surprising discovery possible.

Frano explained that while charge density waves are known to propagate in two well-defined directions within a plane of the material (for example, east-west and north-south), they observed fluctuating charge density waves propagating in all directions within the plane. This is because of the Coulomb interaction, which emanates in all directions.

"Nobody saw this coming," said the assistant professor in the Department of Physics. "The Coulomb interaction governs most of the physical phenomena we have ever experienced. Most of the time, it is simple and monotonic, only consistently increasing or consistently decreasing as a function of distance between two separate charges. However, for electrons in solids, this can be non-monotonic because of the presence of other atoms."

The study showed that the electrons moved in a medium of other atoms that could be polarized, meaning that under certain conditions electrons with the same charge could even attract. This general concept of how electrons in solids interact may be key to understanding the emerging electronic phases of strongly correlated quantum materials, such as heavy fermions, iron- and copper-based high-temperature superconductors and twisted bilayer graphene.

According to Frano, high-temperature superconductivity is a majestic manifestation of quantum mechanics emerging into something so surprising and beautiful that the origin of high-temperature superconductivity is among the most important questions in solid state physics.

"Not only because it could completely revolutionize the way energy is handled, but also because it is at the heart of one of the most fascinating kinds of materials, called quantum materials," said Frano. "What makes these interesting is that the rules of quantum mechanics govern their properties in a way that is completely unknown to all of us. They display a gamut of electronic phases like magnetism, charge density waves, and superconductivity, all in the same material. And it is widely believed that the reason they are so rich in their properties is precisely why they are superconducting at such high temperatures. In other words, out of complexity comes more fascinating complexity."

This study was supported by the U.S. Department of Energy ([DOE] grant no. DESC0012704); the Advanced Light Source, a DOE Office of Science User Facility (contract no. DE-AC02-05CH11231); the National Science Foundation (grant nos. 845994 and 2034345); JSPS KAKENHI (grant no. JP17H01052) and several other research facilities, foundations and institutes.


The Convergence of Internet of Things and Quantum Computing – BBN Times

The Internet of Things (IoT) is actively shaping both the industrial and consumer worlds, and by 2023, consumers, companies, and governments will install 40 billion IoT devices globally.

Smart tech finds its way into every business and consumer domain there is (from retail to healthcare, from finance to logistics), and a missed opportunity strategically employed by a competitor can easily qualify as a long-term failure for companies who don't innovate.

Moreover, 2020's challenges just confirmed the need to secure all four components of the IoT model: sensors, networks (communications), analytics (cloud), and applications.

One of the top candidates to help in securing IoT is quantum computing. The idea of converging IoT and quantum computing is not a new topic; it has been discussed in many works of literature and covered by various researchers, but nothing is close to practical application so far. Quantum computing is not ready yet: it is years away from deployment on a commercial scale.

To understand the complexity of this kind of convergence, you first need to recognize the security issues of IoT, and second, comprehend the complicated nature of quantum computing.

IoT systems' diverse security issues include:

Classical computing relies, at its most fundamental level, on principles expressed by a branch of math called Boolean algebra. Data must be processed in an exclusive binary state at any point in time: bits. While the time that each transistor or capacitor needs to be in either the 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit as to how quickly these devices can be made to switch states. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for classical laws of physics to apply. Beyond this, the quantum world takes over.

In a quantum computer, elemental particles such as electrons or photons can be used, with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit; the nature and behavior of these particles form the basis of quantum computing.

The two most relevant aspects of quantum physics are the principles of superposition and entanglement.

Taken together, quantum superposition and entanglement create enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously, because each qubit represents two values. As more qubits are added, the capacity expands exponentially.
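A quick state-vector illustration of that counting argument (plain numpy, not a quantum SDK): an n-qubit register is described by 2^n complex amplitudes, so a uniform superposition assigns weight to every n-bit configuration at once.

```python
import numpy as np

def uniform_register(n_qubits: int) -> np.ndarray:
    """State vector after a Hadamard on every qubit: all 2**n basis states, equal amplitude."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

psi = uniform_register(2)
for index, amp in enumerate(psi):
    print(f"|{index:02b}>  amplitude {amp.real:.3f}  probability {abs(amp) ** 2:.2f}")

# |00>, |01>, |10>, |11> each carry probability 0.25: one 2-qubit register,
# four configurations at once. The vector doubles in length with every added
# qubit, so 20 qubits already require 2**20 = 1,048,576 amplitudes.
```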

One of the most exciting avenues that researchers, armed with qubits, are exploring is communications security.

Quantum security leads us to the concept of quantum cryptography, which uses physics to develop a cryptosystem completely secure against being compromised without the knowledge of the sender or the receiver of the messages.

Essentially, quantum cryptography is based on the usage of individual particles/waves of light (photons) and their intrinsic quantum properties to develop an unbreakable cryptosystem (because it is impossible to measure the quantum state of any system without disturbing that system).

Quantum cryptography uses photons to transmit a key. Once the key is transmitted, coding and encoding using the normal secret-key method can take place. But how does a photon become a key? How do you attach information to a photon's spin?

This is where binary code comes into play. Each type of a photon's spin represents one piece of information, usually a 1 or a 0, for binary code. This code uses strings of 1s and 0s to create a coherent message. For example, 11100100110 could correspond with h-e-l-l-o. So a binary code can be assigned to each photon; for example, a photon that has a vertical spin ( | ) can be assigned a 1.

Regular, non-quantum encryption can work in a variety of ways but, generally, a message is scrambled and can only be unscrambled using a secret key. The trick is to make sure that whomever you're trying to hide your communication from doesn't get their hands on your secret key. But such encryption techniques have their vulnerabilities. Certain products, called weak keys, happen to be easier to factor than others. Also, Moore's Law continually ups the processing power of our computers. Even more importantly, mathematicians are constantly developing new algorithms that allow for easier factorization of the secret key.

Quantum cryptography avoids all these issues. Here, the key is encrypted into a series of photons that get passed between two parties trying to share secret information. Heisenberg's Uncertainty Principle dictates that an adversary can't look at these photons without changing or destroying them.
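The photon-and-bases scheme sketched in these paragraphs is essentially the BB84 protocol. Below is a minimal, idealized simulation of its classical bookkeeping (my sketch: no channel noise, no authentication), showing how two parties distill a shared key from polarized photons and random basis choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 32  # photons sent

# Alice encodes random bits in random bases (0 = rectilinear, 1 = diagonal).
alice_bits = rng.integers(2, size=n)
alice_bases = rng.integers(2, size=n)

# Bob measures each photon in his own random basis. Matching basis: he reads
# the bit perfectly. Mismatched basis: quantum mechanics gives a 50/50 result.
bob_bases = rng.integers(2, size=n)
bob_bits = np.where(bob_bases == alice_bases, alice_bits, rng.integers(2, size=n))

# They publicly compare bases (never bits) and keep only the matching positions.
keep = alice_bases == bob_bases
key_alice, key_bob = alice_bits[keep], bob_bits[keep]

assert np.array_equal(key_alice, key_bob)      # identical keys, roughly n/2 bits
print("shared key:", "".join(map(str, key_alice)))

# An intercept-and-resend eavesdropper must guess bases too, corrupting about
# 25% of the kept bits; Alice and Bob detect her by comparing a random sample.
```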

With its capabilities, quantum computing can help address the challenges and issues that hamper the growth of IoT. Some of these capabilities are:

Quantum computing is still in its development stage, with tech giants such as IBM, Google, and Microsoft putting in resources to build powerful quantum computers. While they have been able to build machines containing more and more qubits (Google, for example, announced in 2019 that it had achieved Quantum Supremacy), the challenge is to get these qubits to operate smoothly and with less error. But with the technology being very promising, continuous research and development are expected until it reaches widespread practical application for both consumers and businesses.

IoT is expanding as we depend on our digital devices more every day. Furthermore, the WFH (work from home) model that resulted from COVID-19 lockdowns accelerated the deployment of many IoT devices and shortened the learning curves for using such devices. When IoT converges with quantum computing, under the label Quantum IoT or QIoT, it will push other technologies to use quantum computing and add "Quantum" or "Q" to their product and service labels. We will see more adoption of quantum hardware and software applications, in addition to quantum services like QSaaS, QIaaS, and QPaaS as parts of the Quantum Cloud, and QAI (Quantum Artificial Intelligence), to mention a few examples.

A version of this article first appeared on IEEE-IoT.


Who You Really Are And Why It Matters | Practical Ethics – Practical Ethics

By Charles Foster

[This is a review of The Flip: Who you really are, and why it matters, by Jeffrey J. Kripal. Penguin, 2020]

A few years ago I dislocated my shoulder. I went off to hospital, and breathed nitrous oxide while they tried to put it back. Something very strange yet very common happened. I rose out of my body, and looked down at it. I could see the nurse's centre parting and the top of my own bald head. I was aware of the pain in the shoulder, and regretted it, but it wasn't really my business.

My mind was hovering over the skull that encased my brain, and so it seemed ludicrous to say that mind and brain were identical. The experience ousted my residual materialism. Out went Aristotle: in came Plato. This change was a "flip," as Kripal describes such events in this exhilarating, bold, timely, and profoundly important book.

Personal experience of this kind often produces tectonic philosophical conversions in professional philosophers and scientists. Mere reflection rarely does. This observation itself is likely to elicit howls of derision from the materialists. For them, to intrude oneself into an inquiry is necessarily to invalidate it. And of course the humanities are supremely to be mocked, for they are all to do with subjectivity.

This derision has a dated, desperate feel about it. It's the last gasp of a fundamentalism that's on the way out. In assessing the results of scientific experimentation one simply can't ignore the consciousness of the observer. The idea that one can goes back to Descartes, who split reality into two realms: the mental and the material. Eighteenth century science, without any evidence whatever for the split, and ignoring an immense amount of evidence for its absence, then ignored the mental domain, and proceeded on the assumption that all that there was (or all that mattered) was a mechanical reality, unaffected by observation, and devoid of consciousness. The rules governing the operation of the machine were clear. Newton and others had defined them.

That's where most scientists stand today, at least in public, and if they want to get and keep tenure, and be published in the good journals. Newton has been joined on the pedestal by Darwin. Together they are omniscient.

There are some impressive things on the CV of post-eighteenth century science. It has made many cool gadgets, and some vindicated predictions. But its reputation depends on looking only at its successes, and ignoring the failures. It's easy to draw a neat straight line on a graph if you delete all the outliers.

Everyone knows that quantum mechanics and relativity are discordant with classical mechanics, but the significance of the discordance is not widely appreciated. Newton, after all, continues to calculate fairly accurately the momentum of car crashes and the orbits of planets.

The real significance of the difference lies in the role that each accords to the effect of the observer, and accordingly in the degree of certainty with which each says assertions about the natural world can be made. These issues were the subject of a famous debate between Niels Bohr and Albert Einstein. Einstein (despite his authorship of relativity) advocated the traditional view, inherited from Newtonian mechanics and embodied in the swaggering self-confidence of nineteenth-century science, that physics would eventually describe perfectly the weave of the world. This is the (essentially religious) belief that's voiced whenever one of science's shortcomings is mentioned. Take consciousness, for instance. There has been no progress whatever in saying what it is, or in suggesting how it might be an emergent property of matter. "Just give us time," comes the response. "Our existing principles will do the job."

You've misunderstood physics, Bohr told Einstein. Uncertainty doesn't denote an incomplete theory: it is part of the very structure of reality. Heisenberg had noted that there was no such thing as "an objective real world whose smallest parts exist objectively in the same sense as stones or trees exist independently of whether we observe them."

We now know that Bohr and Heisenberg were right, and Einstein was wrong, at least in relation to fundamental particles. Relationship, and consequential indeterminacy, are basic constituents of the universe. Once particles have interrelated, their internal states correlate with one another, however widely separated in time or space the particles may be. Since all particles began life at or near the same place, at or near the same time, perhaps we can talk sensibly about the universe as one organism, each cell affecting the other. Many mystics, many quantum physicists amongst them, have spoken of the interconnection of things in terms of Mind.

There is obviously a relationship between brain and mind: between matter and consciousness. If a lorry rolls over my head it will affect my consciousness in some way. Mind, as Kripal puts it, is "mattered." But this does not begin to exclude the possibility that matter is "minded." William James put it beautifully: human consciousness is a function of the brain, but function is not the same thing as production. Function can also denote transmission. A prism reflects light, but the light is not produced by the prism itself. Perhaps brains are like transmitters or receivers of mind. Perhaps they act like valves or filters, restricting the flow into us of data from an extravagantly minded world. It would make sense of much human experience, not least the dramatic new perspectives (out of body experiences and near death experiences among them) that we get when the valve is compromised. Subjects who have had out of body experiences often report that they have had a 360 degree view of their own body. It sounds suspiciously as if they've added another dimension to their perception; as if the brain's usual and convenient (but mathematically naïve) insistence on three spatial dimensions has been temporarily trumped.

"The general materialistic framework of the sciences at the moment is not wrong," writes Kripal. "It is simply half right." His book is a brilliantly successful attempt to demonstrate what might be added to our understanding of the universe and ourselves if we took seriously the insights of ordinary and extraordinary human experience. Those insights chime perfectly with Bohr and Heisenberg, and they suggest strongly that mindedness is fundamental to the cosmos, not some tangential, accidental, or recent emergent property of matter. They may indeed go further than that, and entail the conclusion that matter is an expression of some kind of cosmic Mind.

The equations of quantum physics are, for Kripal, a thrilling new genre of mystical literature. In the quantum world, matter is congealed energy, the division between space and time is illusory, and dark energy constitutes most of the universe. You can go seamlessly from those observations to the Tibetan Book of the Dead or the accounts of the post-resurrection appearances of Jesus.

Where does all this leave the humanities? If the best books on consciousness are written by physicists, does anyone who doesn't understand partial differential equations have anything to offer? Yes, says Kripal, and this may be the main legacy of The Flip. The best defence advocates are those who acknowledge their clients' shortcomings, and Kripal is merciless. Why, he asks, should anyone listen respectfully to a discipline whose central arguments often boil down to the claim that the only truth to have is that there is no truth? Quite right. But there is hope for non-scientific writers. The humanities, after all, have had consciousness as the, or a, central concern for thousands of years. And now their special subject is the main focus of research in the world's best funded laboratories. Kripal proposes that we reimagine the humanities as "the study of consciousness coded in culture." (Original emphasis.) That's a high calling.

"An era can be considered over when its basic illusions have been exhausted," wrote Arthur Miller. The illusion of the adequacy of materialism as an explanation for the nature of the world is exhausted, and a new era of real science is surely about to begin: an era in which all the available evidence is taken into account, and accordingly one that recognises that (in Kripal's words) mind is "an irreducible dimension or substrate of the natural world, indeed of the whole cosmos," and in which science and the humanities play a synergistic role in expounding the nature of that substrate. Kripal's book will be seen as one of the foundational texts of that new synergy.
