
What does Buddhism offer physics? – Big Think

Almost 50 years ago, two influential books on Buddhism and physics were published. First came Fritjof Capra's The Tao of Physics; Gary Zukav's The Dancing Wu Li Masters followed. Both books were international bestsellers. Both attempted to show how quantum mechanics (the physics of molecules, atoms, and subatomic particles) recovered the core tenets of Buddhist philosophy.

This weekend, Marcelo and I will attend an amazing meeting called "Buddhism, Physics, and Philosophy Redux" at the University of California, Berkeley Center for Buddhist Studies. Since the meeting aims to re-examine what, if any, relationship might bind Buddhist perspectives on the nature of reality to those of modern physics, I thought this would be a great time to explain why that goal is meaningful.

I read The Tao of Physics as a student in a freshman physics class in 1981. It blew me away, but more for its excellent descriptions of quantum mechanics than for its argument that Buddhism and physics overlap. Even then, I felt the argument stretched itself too thin. As the years progressed, I got my PhD in theoretical physics and began practicing Zen Buddhism seriously. I developed a much better perspective on what Zukav and Capra were arguing for, and I bought their arguments even less.

The real problem with both books is all about interpretation; specifically, quantum interpretation. From its very start in the early 20th century, quantum mechanics was known to be weird. Classical physics builds a complete picture of the world from tiny particles bouncing off each other like nano billiard balls. Quantum mechanics, on the other hand, allows for no easy visualization.

Instead, quantum mechanics tells us that particles like atoms can be in two places at the same time until a measurement is made. It tells us that the properties of those atoms can be inherently uncertain, as if they were actually smeared out and did not have definite values. It also tells us that particles on opposite sides of the Universe can be entangled such that what happens to one instantly affects the other, even though no physical signal had time to pass between them.

For the last 100 years, physicists have scratched their heads over this basket of quantum weirdness. And over those same 100 years, they have developed different interpretations of the theory. Each interpretation paints a different picture of what is meant by an "atom" in terms of physical reality. In the same way, each paints a different picture of what is meant by a "measurement" as an interaction between something that is observed and something else that is the observer.

The thing is, there are many of these interpretations. One of these is called the Copenhagen Interpretation. It is named after the city where Niels Bohr, one of the founders of quantum mechanics, lived.


The interpretation does seem to have some interesting parallels with the classical philosophies that emerged from India and Asia when Buddhism was the dominant religion. In particular, the Copenhagen Interpretation seems to open a path for observers to play a strange but central role in grounding what can happen in a quantum experiment. Thus, the idea that the observer affects the observed is certainly something the Copenhagen Interpretation might seem to allow for, and this might be connected with certain tenets of Buddhism. Now, there are a couple of "mights" in that last sentence. You can find physicists who are pro-Copenhagen Interpretation just as you can find Buddhist scholars who would disagree with it. But that was not the main problem with Capra and Zukav's thesis.

The real problem with the 1970s version of Quantum Buddhism was that it privileged the Copenhagen Interpretation. It never really addressed the fact that Copenhagen was just that: an interpretation, with no more validity than other interpretations (such as the Many Worlds view favored by folks like Sean Carroll). As time went on and Quantum Buddhism became a staple of New Age wackiness, that key point (that the Copenhagen Interpretation is just one interpretation) was completely forgotten.

Fifty years later, it is now time to re-examine Buddhist philosophical perspectives and the frontiers of physics. The point is not to show that physics is confirming the truths of Buddhism. That will never happen, nor should it. Instead, once we recognize that physics has always been influenced by philosophical ideas, we can recognize that throughout its entire history those ideas have come solely from Western philosophers. But half a world away, Buddhist philosophers were encountering many similar questions, like the nature of time and causality, or how consciousness stands in relation to the world.

Because they were coming from a different history, these Buddhists explored other kinds of responses to the same questions their Western counterparts pondered. In this way, there may be perspectives in the long history of Buddhist philosophy that prove fruitful for physicists pushing at their own frontiers: the places where we are stuck, or hitting paradoxes. That is why I am so very excited for what is going to happen over the next few days.


Rotating Lepton Model: Coupling relativity, quantum mechanics and neutrinos for the synthesis of matter – Open Access Government

For the last fifty years, the Standard Model (SM) of particle physics has provided the basis for describing the structure and composition of matter. According to the SM, protons and neutrons, which belong to the hadron family of composite particles and are the components of atomic nuclei, consist of elementary particles called quarks, which are kept together by a force named the Strong Force. (1) No quarks have ever been isolated and studied independently, and their masses are estimated to be comparable to those of baryons, i.e. of the order of 1 GeV/c². These masses are 100 billion times (10,11) larger than the masses of neutrinos (10⁻¹ to 10⁻³ eV/c²), (2) which are by far the lightest, as well as the most numerous, particles in our Universe.

Einstein's theory of Relativity (both special (SR) (3,4) and general (GR) (5)) is one of the most remarkable scientific achievements in the history of humanity. SR has been confirmed experimentally thousands of times, and there have also been numerous confirmations of GR. Most confirmations refer to macroscopic systems, and only recently (6) has the amazing strength of SR and GR been demonstrated inside hadrons, deep in the femtocosmos of the lightest elementary particles, i.e., of neutrinos.

Space contraction, time dilation and mass increase with particle speed are the main features of SR, as the particle speed with respect to an observer, at rest with the centre of rotation, approaches the speed of light c, and thus the Lorentz factor γ, defined from γ = (1 − v²/c²)^(−1/2), approaches infinity.

Thus, upon considering three particles rotating symmetrically on a cyclic trajectory using their gravitational attraction, F_G, as the centripetal force, F_G can become surprisingly strong. This is because SR dictates that a particle of rest mass m₀ has a relativistic mass γm₀ (3,4) and a longitudinal inertial mass γ³m₀, equal according to the equivalence principle (6,7) to its gravitational mass γ³m₀. (7,8) Therefore, using this definition of the gravitational mass in Newton's gravitational law, it follows:

F_G = Gm₀²γ⁶ / (√3 r²) (1)

where r is the rotational radius. To find r and γ, one must also use the de Broglie equation of Quantum Mechanics:

γm₀vr = nℏ (2)

This is used to obtain, for n = 1 and m₀ ≈ 43.7 meV/c², estimated (6,8) from the Superkamiokande measurements, (2) that r ≈ 0.63 fm and γ ≈ 7.163×10⁹, thus γ⁶ ≈ 1.35×10⁵⁹. Consequently, the rotating speed is very close to c and the gravitational force is, amazingly, according to equation (1), 59 orders of magnitude larger than the normal nonrelativistic Newtonian force! (Fig. 1) This force equals 8×10⁴ N, equal to the weight of 100 humans on Earth.
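For readers who want to reproduce these figures, the short numerical check below evaluates equations (1) and (2) with the values quoted above (an illustrative sketch using standard constants, not code from the RLM authors):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34   # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
eV = 1.602e-19     # joules per electron-volt

m0 = 43.7e-3 * eV / c**2   # heaviest neutrino eigenmass, 43.7 meV/c^2, in kg
gamma = 7.163e9            # Lorentz factor quoted above

# Equation (2) with n = 1 and v ~ c: gamma * m0 * c * r = hbar
r = hbar / (gamma * m0 * c)
print(f"r   = {r / 1e-15:.2f} fm")    # ~0.63 fm

# Equation (1): F_G = G * m0^2 * gamma^6 / (sqrt(3) * r^2)
F_G = G * m0**2 * gamma**6 / (math.sqrt(3) * r**2)
print(f"F_G = {F_G:.1e} N")           # ~8e4 N, the weight of ~100 people

# Composite mass 3 * gamma * m0, converted to MeV/c^2
m = 3 * gamma * m0 * c**2 / eV / 1e6
print(f"m   = {m:.0f} MeV/c^2")       # ~939 MeV/c^2, the neutron mass
```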

In addition to causing such an astounding γ⁶ ≈ 10⁵⁹-fold increase in gravitational attraction, special relativity also causes an amazing γ ≈ 7.168×10⁹-fold increase in the mass of the three rotating neutrinos, so that the composite particle mass increases from 3 × (43.7 meV/c²) to the neutron mass of 939.565 MeV/c² (Fig. 1). Conversely, if the composite particle mass, 3γm₀, is that of a neutron (939.565 MeV/c²), then the rest mass, m₀, of each rotating particle is that of the heaviest neutrino eigenmass, (9) i.e. 43.7 meV/c², in good agreement with the Superkamiokande measurement of the heaviest neutrino mass. (2) Therefore, special relativity reveals that quarks are relativistic neutrinos, and also shows that the neutrino gravitational mass, γ³m₀, is enormous, i.e. of the order of the Planck mass (ℏc/G)^(1/2) = 21.7 μg, per neutrino! It thus also implies, in conjunction with equation (2), that the gravitational force of equation (1) equals the strong force, ℏc/r², which is a factor of 137 stronger than the electrostatic force of a positron-electron pair at the same distance. (1)

The RLM shows that maximisation of the Lorentz factor leads to enhanced composite particle stability by minimising −γ⁵m₀c², which is the potential energy of the rotating neutrino triad, (8) and, at the same time, by maximising the Lorentz factor, maximising the produced hadron mass m = 3γm₀ = 3^(13/12) (m_Pl m₀²)^(1/3), where m_Pl is the Planck mass ((ℏc/G)^(1/2) = 1.22×10²⁸ eV/c²). This simple expression gives, amazingly, a mass value which differs by less than 1% from the experimental neutron mass of 939.565 MeV/c².
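The closed-form expression can be checked the same way (again, an illustrative calculation only):

```python
m_Pl = 1.22e28   # Planck mass in eV/c^2, as quoted above
m0 = 43.7e-3     # heaviest neutrino eigenmass in eV/c^2

m = 3**(13/12) * (m_Pl * m0**2)**(1/3)
print(f"{m / 1e6:.1f} MeV/c^2")   # ~939 MeV/c^2 vs the measured 939.565 MeV/c^2
```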

Neutrinos are well known to come in three different flavours, i.e. electron neutrinos, muon neutrinos and tau neutrinos. These flavours are obtained by mixing neutrinos from the three mass types (or mass eigenstates), i.e. m3 mass neutrinos (the heaviest), m2 mass neutrinos and m1 mass neutrinos (the lightest) for the Normal Hierarchy. Using equation (1) and the experimental neutrino masses, we have computed the composite particle mass values plotted in Figure 2. Agreement with the experimental composite mass values is better than 2%. Conversely, one may use the experimental hadron or boson mass values to compute the three neutrino masses. Agreement with the experimental values measured at Superkamiokande (2) is within 5%.

The fact that the gravitational Newton-Einstein equation (1) provides such a good fit to the experimental mass values of hadrons shows that, when accounting for special relativity, gravity suffices to describe the strong force. The equally good fit to the experimental mass values of the W, Z⁰ and H bosons shows that relativistic gravity also suffices to describe the weak force. Indeed, in both cases at the limit of large γ one obtains F_G = Gm_Pl²/r² = G(ℏc/G)/r² = ℏc/r², which is the strong force value. (1) Similarly, for the weak force one also obtains F_G = ℏc/r². One may thus conclude that both the strong and the weak forces have been unified with Newtonian gravity (γ = 1) in the RLM via equation (1). (10,12)

In summary, the RLM reveals that our known Universe is a product of the combination of neutrinos, electrons, positrons, Einstein's relativity, and the dual wave-particle nature of matter, as described by the de Broglie equation of quantum mechanics. (12,13)

References

Please note: This is a commercial profile



Odile Jacob Publishing to release today The Science of Light, a captivating journey of scientific discovery by Nobel Prize-winning physicist Serge…

Light has fascinated mankind since the dawn of time. Elucidating its properties over the centuries has been an adventure intimately linked with the birth and development of modern science; it has led, after many surprising twists, to the theories of relativity and quantum physics which have profoundly changed our view of the world at the microscopic and cosmic scales alike. Placing his own career in a rich lineage of scientific discovery, Nobel Prize-winning physicist Serge Haroche offers a literally enlightening account of what we know about light today, how we learned it, and how that knowledge has led to countless inventions that have revolutionized daily life.

From Galileo and Newton to Einstein and Feynman, from early measurements of the speed of light to cutting-edge work on quantum entanglement, Haroche takes a detailed and personal look at light's role in how we see and understand the universe. The Science of Light is at once a colorful history of scientific inquiry and a passionate defense of "blue sky research": investigations conducted not in pursuit of a particular goal, but out of curiosity and faith that today's abstract discoveries may well power tomorrow's most incredible possibilities.

A uniquely captivating book about the thrill of discovery.

Serge Haroche is professor emeritus at the Collège de France, a member of the Académie des Sciences, a foreign member of the U.S. National Academy of Sciences, and winner of the 2012 Nobel Prize in Physics for discovering methods of manipulating and measuring individual quantum systems. He has taught at Paris VI University, the École Polytechnique, the École Normale Supérieure, Harvard University, and Yale University.

Contact: [email protected]

SOURCE Odile Jacob Publishing


A new place for consciousness in our understanding of the universe – New Scientist

To make sense of mysteries like quantum mechanics and the passage of time, theorists are trying to reformulate physics to include subjective experience as a physical constituent of the world

By Thomas Lewton


A WALK in the woods. Every shade of green. A fleck of rain. The sensations and thoughts bound in every moment of experience feel central to our existence. But physics, which aims to describe the universe and everything in it, says nothing about your inner world. Our descriptions of the wavelengths of light as they reflect off leaves capture something but not what it is like to be deep in the woods.

It can seem as if there is an insurmountable gap between our subjective experience of the world and our attempts to objectively describe it. And yet our brains are made of matter, so, you might think, the states of mind they generate must be explicable in terms of states of matter. The question is: how? And if we can't explain consciousness in physical terms, how do we find a place for it in an all-embracing view of the universe?

"There is no question in science more difficult and confusing," says Lee Smolin, a theoretical physicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

It is also one that he and others are addressing with renewed vigour, convinced that we will never make sense of the universe's mysteries (things like how reality emerges from the fog of the quantum world and what the passage of time truly signifies) unless we reimagine the relationship between matter and mind.

Their ideas amount to an audacious attempt to describe the universe from the inside out, rather than the other way around, and they might just force us to abandon long-cherished assumptions about what everything is ultimately made of.



Physics – Seeking Diversity When Faced with Adversity – Physics

My first faculty position was at Old Dominion University in southern Virginia, where I saw racism up close and personal. Southern Virginia, being a very conservative place, was also a tough environment in which to be openly gay, so I moved, eventually ending up at the University of Connecticut.

I had a very difficult experience with harassment in Connecticut that impacted my mental and physical health. That experience politicized me and led me to be more outspoken about LGBTQ+ issues. I then moved to the California Institute of Technology, which felt to me like a nirvana for both science and inclusivity. It was a place where my husband and I were welcomed and loved. However, the funding for my position was not secure, so I had to move again.

At the University of Wisconsin, where I went next, I had supportive colleagues, but I still experienced discriminatory treatment that distracted me from science and came with high legal expenses. It was because of these experiences that I decided to start being more open about discussing sexual and gender identity issues. But the real turning point for me in taking an active advocacy role was the 2012 APS March Meeting.


Here are the Top 10 times scientific imagination failed – Science News Magazine

Science, some would say, is an enterprise that should concern itself solely with cold, hard facts. Flights of imagination should be the province of philosophers and poets.

On the other hand, as Albert Einstein so astutely observed, "Imagination is more important than knowledge." Knowledge, he said, is limited to what we know now, while imagination embraces the entire world, stimulating progress.

So with science, imagination has often been the prelude to transformative advances in knowledge, remaking humankinds understanding of the world and enabling powerful new technologies.


And yet while sometimes spectacularly successful, imagination has also frequently failed in ways that retard the revealing of nature's secrets. Some minds, it seems, are simply incapable of imagining that there's more to reality than what they already know.

On many occasions scientists have failed to foresee ways of testing novel ideas, ridiculing them as unverifiable and therefore unscientific. Consequently it is not too challenging to come up with enough failures of scientific imagination to compile a Top 10 list, beginning with:

By the middle of the 19th century, most scientists believed in atoms. Chemists especially. John Dalton had shown that the simple ratios of different elements making up chemical compounds strongly implied that each element consisted of identical tiny particles. Subsequent research on the weights of those atoms made their reality pretty hard to dispute. But that didn't deter physicist-philosopher Ernst Mach. Even as late as the beginning of the 20th century, he and a number of others insisted that atoms could not be real, as they were not accessible to the senses. Mach believed that atoms were a mental artifice, convenient fictions that helped in calculating the outcomes of chemical reactions. "Have you ever seen one?" he would ask.

Apart from the fallacy of defining reality as observable, Mach's main failure was his inability to imagine a way that atoms could be observed. Even after Einstein proved the existence of atoms by indirect means in 1905, Mach stood his ground. He was unaware, of course, of the 20th century technologies that quantum mechanics would enable, and so did not foresee powerful new microscopes that could show actual images of atoms (and allow a certain computing company to drag them around to spell out IBM).

Mach's views were similar to those of Auguste Comte, a French philosopher who originated the idea of positivism, which denies reality to anything other than objects of sensory experience. Comte's philosophy led (and in some cases still leads) many scientists astray. His greatest failure of imagination was an example he offered for what science could never know: the chemical composition of the stars.

Unable to imagine anybody affording a ticket on some entrepreneur's space rocket, Comte argued in 1835 that the identity of the stars' components would forever remain beyond human knowledge. We could study their size, shapes and movements, he said, whereas we would never know how to study by any means their chemical composition, or their mineralogical structure, or for that matter, their temperature, which will necessarily always be concealed from us.

Within a few decades, though, a newfangled technology called spectroscopy enabled astronomers to analyze the colors of light emitted by stars. And since each chemical element emits (or absorbs) precise colors (or frequencies) of light, each set of colors is like a chemical fingerprint, an infallible indicator of an element's identity. Using a spectroscope to observe starlight therefore can reveal the chemistry of the stars, exactly what Comte thought impossible.

Sometimes imagination fails because of its overabundance rather than absence. In the case of the never-ending drama over the possibility of life on Mars, that planet's famous canals turned out to be figments of overactive scientific imagination.

First observed in the late 19th century, the Martian canals showed up as streaks on the planet's surface, described as canali by Italian astronomer Giovanni Schiaparelli. Canali is, however, Italian for channels, not canals. So in this case something was gained (rather than lost) in translation: the idea that Mars was inhabited. "Canals are dug," remarked British astronomer Norman Lockyer in 1901; ergo, there were diggers. Soon astronomers imagined an elaborate system of canals transporting water from Martian poles to thirsty metropolitan areas and agricultural centers. (Some observers even imagined seeing canals on Venus and Mercury.)

With more constrained imaginations, aided by better telescopes and translations, belief in the Martian canals eventually faded. It was merely the Martian winds blowing dust (bright) and sand (dark) around the surface in ways that occasionally made bright and dark streaks line up in a deceptive manner to eyes attached to overly imaginative brains.

In 1934, Italian physicist Enrico Fermi bombarded uranium (atomic number 92) and other elements with neutrons, the particle discovered just two years earlier by James Chadwick. Fermi found that among the products was an unidentifiable new element. He thought he had created element 93, heavier than uranium. He could not imagine any other explanation. In 1938 Fermi was awarded the Nobel Prize in physics for "demonstrating the existence of new radioactive elements produced by neutron irradiation."

It turned out, however, that Fermi had unwittingly demonstrated nuclear fission. His bombardment products were actually lighter, previously known elements: fragments split from the heavy uranium nucleus. Of course, the scientists later credited with discovering fission, Otto Hahn and Fritz Strassmann, didn't understand their results either. Hahn's former collaborator Lise Meitner was the one who explained what they'd done. Another woman, chemist Ida Noddack, had imagined the possibility of fission to explain Fermi's results, but for some reason nobody listened to her.

In the 1920s, most physicists had convinced themselves that nature was built from just two basic particles: positively charged protons and negatively charged electrons. Some had, however, imagined the possibility of a particle with no electric charge. One specific proposal for such a particle came in 1930 from Austrian physicist Wolfgang Pauli. He suggested that a no-charge particle could explain a suspicious loss of energy observed in beta-particle radioactivity. Pauli's idea was worked out mathematically by Fermi, who named the neutral particle the neutrino. Fermi's math was then examined by physicists Hans Bethe and Rudolf Peierls, who deduced that the neutrino would zip through matter so easily that there was no imaginable way of detecting its existence (short of building a tank of liquid hydrogen 6 million billion miles wide). "There is no practically possible way of observing the neutrino," Bethe and Peierls concluded.

But they had failed to imagine the possibility of finding a source of huge numbers of high-energy neutrinos, so that a few could be captured even if almost all escaped. No such source was known until nuclear fission reactors were invented. In the 1950s, Frederick Reines and Clyde Cowan used reactors to definitively establish the neutrino's existence. Reines later said he sought a way to detect the neutrino precisely because everybody had told him it wasn't possible to detect the neutrino.

Ernest Rutherford, one of the 20th century's greatest experimental physicists, was not exactly unimaginative. He imagined the existence of the neutron a dozen years before it was discovered, and he figured out that a weird experiment conducted by his assistants had revealed that atoms contained a dense central nucleus. It was clear that the atomic nucleus packed an enormous quantity of energy, but Rutherford could imagine no way to extract that energy for practical purposes. In 1933, at a meeting of the British Association for the Advancement of Science, he noted that although the nucleus contained a lot of energy, it would also require energy to release it. Anyone saying we can exploit atomic energy is "talking moonshine," Rutherford declared. To be fair, Rutherford qualified the moonshine remark by saying "with our present knowledge," so in a way he perhaps was anticipating the discovery of nuclear fission a few years later. (And some historians have suggested that Rutherford did imagine the powerful release of nuclear energy, but thought it was a bad idea and wanted to discourage people from attempting it.)

Rutherford's reputation for imagination was bolstered by his inference that radioactive matter deep underground could solve the mystery of the age of the Earth. In the mid-19th century, William Thomson (later known as Lord Kelvin) calculated the Earth's age to be something a little more than 100 million years, and possibly much less. Geologists insisted that the Earth must be much older, perhaps billions of years, to account for the planet's geological features.

Kelvin calculated his estimate assuming the Earth was born as a molten rocky mass that then cooled to its present temperature. But following the discovery of radioactivity at the end of the 19th century, Rutherford pointed out that it provided a new source of heat in the Earth's interior. While giving a talk (in Kelvin's presence), Rutherford suggested that Kelvin had basically prophesied a new source of planetary heat.

While Kelvin's neglect of radioactivity is the standard story, a more thorough analysis shows that adding that heat to his math would not have changed his estimate very much. Rather, Kelvin's mistake was assuming the interior to be rigid. John Perry (one of Kelvin's former assistants) showed in 1895 that the flow of heat deep within the Earth's interior would alter Kelvin's calculations considerably, enough to allow the Earth to be billions of years old. It turned out that the Earth's mantle is fluid on long time scales, which not only explains the age of the Earth, but also plate tectonics.

Before the mid-1950s, nobody imagined that the laws of physics gave a hoot about handedness. The same laws should govern matter in action when viewed straight-on or in a mirror, just as the rules of baseball applied equally to Ted Williams and Willie Mays, not to mention Mickey Mantle. But in 1956 physicists Tsung-Dao Lee and Chen Ning Yang suggested that perfect right-left symmetry (or parity) might be violated by the weak nuclear force, and experiments soon confirmed their suspicion.

Restoring sanity to nature, many physicists thought, required antimatter. If you just switched left with right (mirror image), some subatomic processes exhibited a preferred handedness. But if you also replaced matter with antimatter (switching electric charge), left-right balance would be restored. In other words, reversing both charge (C) and parity (P) left nature's behavior unchanged, a principle known as CP symmetry. CP symmetry had to be perfectly exact; otherwise nature's laws would change if you went backward (instead of forward) in time, and nobody could imagine that.

In the early 1960s, James Cronin and Val Fitch tested CP symmetry's perfection by studying subatomic particles called kaons and their antimatter counterparts. Kaons and antikaons both have zero charge but are not identical, because they are made from different quarks. Thanks to the quirky rules of quantum mechanics, kaons can turn into antikaons and vice versa. If CP symmetry is exact, each should turn into the other equally often. But Cronin and Fitch found that antikaons turn into kaons more often than the other way around. And that implied that nature's laws allowed a preferred direction of time. "People didn't want to believe it," Cronin said in a 1999 interview. Most physicists do believe it today, but the implications of CP violation for the nature of time and other cosmic questions remain mysterious.

In the early 20th century, the dogma of behaviorism, initiated by John Watson and championed a little later by B.F. Skinner, ensnared psychologists in a paradigm that literally excised imagination from science. The brain, site of all imagination, is a black box, the behaviorists insisted. Rules of human psychology (mostly inferred from experiments with rats and pigeons) could be scientifically established only by observing behavior. It was scientifically meaningless to inquire into the inner workings of the brain that directed such behavior, as those workings were in principle inaccessible to human observation. In other words, activity inside the brain was deemed scientifically irrelevant because it could not be observed. "When what a person does [is] attributed to what is going on inside him," Skinner proclaimed, "investigation is brought to an end."

Skinner's behaviorist BS brainwashed a generation or two of followers into thinking the brain was beyond study. But fortunately for neuroscience, some physicists foresaw methods for observing neural activity in the brain without splitting the skull open, exhibiting imagination that the behaviorists lacked. In the 1970s Michel Ter-Pogossian, Michael Phelps and colleagues developed PET (positron emission tomography) scanning technology, which uses radioactive tracers to monitor brain activity. PET scanning is now complemented by magnetic resonance imaging, based on ideas developed in the 1930s and 1940s by physicists I.I. Rabi, Edward Purcell and Felix Bloch.

Nowadays astrophysicists are all agog about gravitational waves, which can reveal all sorts of secrets about what goes on in the distant universe. All hail Einstein, whose theory of gravity, general relativity, explains the waves' existence. But Einstein was not the first to propose the idea. In the 19th century, James Clerk Maxwell devised the math explaining electromagnetic waves, and speculated that gravity might similarly induce waves in a gravitational field. He couldn't figure out how, though. Later other scientists, including Oliver Heaviside and Henri Poincaré, speculated about gravity waves. So the possibility of their existence certainly had been imagined.

But many physicists doubted that the waves existed, or if they did, could not imagine any way of proving it. Shortly before Einstein completed his general relativity theory, German physicist Gustav Mie declared that the gravitational radiation emitted by any oscillating mass particle is "so extraordinarily weak that it is unthinkable ever to detect it by any means whatsoever." Even Einstein had no idea how to detect gravitational waves, although he worked out the math describing them in a 1918 paper. In 1936 he decided that general relativity did not predict gravitational waves at all. But the paper rejecting them was simply wrong.

As it turned out, of course, gravitational waves are real and can be detected. At first they were verified indirectly, by the diminishing distance between mutually orbiting pulsars. And more recently they were directly detected by huge experiments relying on lasers. Nobody had been able to imagine detecting gravitational waves a century ago because nobody had imagined the existence of pulsars or lasers.

All these failures show how prejudice can sometimes dull the imagination. But they also show how an imagination failure can inspire the quest for a new success. And that's why science, so often detoured by dogma, still manages somehow, on long enough time scales, to provide technological wonders and cosmic insights beyond philosophers' and poets' wildest imagination.


Is 2022 the year encryption is doomed? – TechRepublic


Quantum technology that the world's superpowers are developing, if successful, will render many current encryption algorithms obsolete overnight. Whoever has access to this technology will be able to read almost any encrypted data or message.

Organizations need to pay attention to this emerging technology and take stock of the encryption algorithms in use, while planning to eventually upgrade these. Quantum computers already exist as proof-of-concept systems. For the moment, none are powerful enough to crack current encryption, but the private and public sectors are investing billions of dollars to create powerful systems that will revolutionize computing.

Nobody knows when a powerful quantum computer will become available, but we can predict the effects on security and prepare defenses.

Classical computers operate using bits of information. These bits exist in one of two states, either 1 or 0. Quantum computers operate in a different, but analogous way, operating with qubits. A qubit exists in a mixed state that is both partly 1 and partly 0 at the same time, only adopting a final state at the point when it is measured. This feature allows quantum computers to perform certain calculations much faster than current computers.
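As a rough illustration of that description, here is a minimal sketch in plain Python (no quantum SDK; the two-amplitude state and the `measure` helper are illustrative assumptions, not a library API):

```python
import random

def measure(a: complex, b: complex) -> int:
    """Sample one measurement of the state a|0> + b|1>: the qubit yields 0
    with probability |a|^2 and 1 with probability |b|^2, then stays there."""
    p0 = abs(a) ** 2 / (abs(a) ** 2 + abs(b) ** 2)  # normalize defensively
    return 0 if random.random() < p0 else 1

# An equal superposition: "partly 1 and partly 0 at the same time"
a = b = 2 ** -0.5
outcomes = [measure(a, b) for _ in range(10_000)]
print(sum(outcomes) / len(outcomes))  # ~0.5: half the measurements give 1
```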

Quantum computers cannot solve problems for which current systems are unable to find solutions. However, some calculations take too long for practical application with current computers. With quantum computing's speed, these calculations could become trivial to perform.

One example is finding the prime factors of large numbers. Any number can be expressed as a product of prime numbers, but finding these prime factors currently takes an incredibly long time. Public-key encryption algorithms rely on this fact to ensure the security of the data they encrypt.

It is the impractical amount of time involved, not the impossibility of the calculation, which secures public-key encryption. An approach named Shor's algorithm can rapidly find such prime factors but can only be executed on a sizable quantum computer.

We know that we can break current public-key encryption by applying Shor's algorithm, but we are waiting for a suitably powerful quantum computer to become available to implement this. Once someone develops a suitable quantum computer, the owner could break any system reliant on current public-key encryption.
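To get a feel for the asymmetry (a toy sketch only; `trial_factor` is a hypothetical helper, and real classical attacks use far better algorithms that are still superpolynomial):

```python
def trial_factor(n: int) -> int:
    """Return the smallest prime factor of n by trial division, ~sqrt(n) steps."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

print(trial_factor(59 * 61))     # instant for tiny numbers
# For a 2048-bit RSA modulus, sqrt(N) is ~2**1024 steps: hopeless classically,
# while Shor's algorithm on a large quantum computer runs in polynomial time.
```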


Creating a working, sizable quantum computer is not a trivial matter. A handful of proof-of-concept quantum computing systems have been developed in the private sector. Although quantum research has been identified as a strategic priority for many countries, the path forward is less clear. Nevertheless, China has made quantum technology part of its current five-year plan, is known to have developed functional quantum systems to detect stealth aircraft and submarines, and has deployed quantum communication with satellites.

We know the difficulties in creating a sizable quantum system. What we don't know is if one of the global superpowers has overcome these and succeeded. We can expect that whoever is first to create such a system will be keen to keep it secret. Nevertheless, we can anticipate clues that will indicate a threat actor has developed a functional system.

Anyone possessing the world's most powerful decryption computer will find it difficult to resist the temptation to put it to use. We would expect to see a threat actor seeking to collect large quantities of encrypted data in transit and data at rest, possibly by masquerading as criminal attacks.

Currently, experts do not observe the volume of network redirection attacks that would be expected for the large-scale collection of data, nor do we see the large-scale exfiltration of stored encrypted data. This is not to say that such attacks don't happen, but they are less frequent or audacious than might be expected if a state-sponsored threat actor was collecting data at scale.

Nobody knows when current encryption techniques will become obsolete. But we can prepare by upgrading encryption algorithms to those believed to be resistant to quantum attack. NIST is preparing standards for post-quantum encryption. In the meantime, the NSA has produced guidelines that offer interim guidance until the relevant standards are published.

Encrypted, archived data is also at risk. Organizations may wish to consider if old data is still required. Wiping obsolete data may be the best defense against having the data stolen.

Until a sizable quantum computer is built and made available for research, we cannot be certain about the capabilities of such a system. It is possible that physical constraints will mean that such a system is not practical to build. Certainly, programming quantum computers will require new software engineering practices. It is also possible that programming shortcuts will be found that allow the practical breaking of encryption with a smaller quantum computer than currently expected.

Post-quantum standards and advice from governmental entities are welcome to guide organizations in transitioning to a quantum-secure environment. However, such advice may not reflect the state-of-the-art of malicious actors.


At some point, many current encryption algorithms will become instantly vulnerable to attack. In anticipation of this moment, organizations should take stock of the encryption algorithms they use and the associated key lengths. Where possible, systems should migrate to use AES-256 encryption, use SHA-384 or SHA-512 for hashing, and extend key lengths beyond 3072 bits as an interim measure.
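A minimal sketch of those interim measures (assuming the third-party Python `cryptography` package for AES-256-GCM and the standard library for SHA-384; illustrative, not a vetted implementation):

```python
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # AES-256, the recommended interim cipher
nonce = os.urandom(12)                     # must be unique per message
ciphertext = AESGCM(key).encrypt(nonce, b"sensitive record", None)

digest = hashlib.sha384(b"archived document").hexdigest()  # SHA-384 for hashing
print(len(ciphertext), digest[:16])
```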

Anyone implementing encryption software should consider the algorithm life span and provide users with the ability to change encryption strength and algorithm as necessary.

Quantum computing is a major focus of research and investment. Physical constraints mean that current chip architectures are difficult to advance further. Practical quantum computer systems will bring large gains in computing power and allow new computational techniques to be applied to solve problems that are currently impractical to calculate.

One application of a new quantum computer will be breaking encryption. When such a system is developed, its existence is likely to be kept secret. However, there are likely to be indicators in the actions of sophisticated threat actors that will betray the system's operation.

Reviewing and improving encryption implementations well in advance of the deployment of a functional quantum computer is vital to ensure the continued confidentiality of information. Take stock of encryption currently in use and plan how to upgrade this if necessary.

We might not be able to predict when such a system will be deployed against us, but we can prepare our response in advance.

For more information, visit the Cisco Newsroom's Q&A with Martin.

Author Martin Lee is technical lead of security research within Talos, Cisco's threat intelligence and research organization. As a researcher within Talos, he seeks to improve the resilience of the Internet and awareness of current threats through researching system vulnerabilities and changes in the threat landscape. With 19 years of experience within the security industry, he is CISSP certified, a Chartered Engineer, and holds degrees from the universities of Bristol, Cambridge, Paris and Oxford.


Google Meet gets in-meeting reactions, PiP, end-to-end encryption and more – TechCrunch

Google announced a major update to Google Meet today that includes a number of long-requested features and plenty that you didn't even know you needed. There is a long list here, but the main additions are likely in-meeting reactions to give immediate emoji-based feedback, updates to the Meet companion mode, the ability to use Meet right inside of Docs, Sheets and Slides, as well as a new picture-in-picture mode (so you can more easily ignore a meeting) and the ability to stream a meeting to YouTube.

Security is another highlight of today's announcement. Starting in May, Google is rolling out client-side encryption in Meet, which is currently still in beta. With this, users have full control over the encryption keys and the identity provider used to access those keys. Later this year, Google will also introduce optional end-to-end encryption for all meetings. Currently, all Meet data is encrypted in transit.


"Since 2020, it's become increasingly clear that human connection is crucial," said Dave Citron, Google's director of product management for Google Meet and Voice, in a press briefing ahead of today's announcement. "We know we need solutions that help people build connections that can bridge the gap between physical spaces and the somewhere else."

He noted that a lot of these updates today focus on "collaboration equity," that is, the ability to contribute to meetings regardless of location, role, experience level, language and device preference. One example of this is companion mode, which launched earlier this year and allows users to join a video meeting on a second screen. Now, Google is updating this with personal video tiles for every participant in a hybrid meeting, even if they are in a conference room with other participants. "This update will work towards making those in physical space have the same experience as those who are working remotely," Citron explained.


Like too many features Google announces these days, these updates will roll out later this year. This also means you'll have to wait until next month to regale your co-workers with emojis during a meeting to "help teams celebrate wins, offer support and share the love," as a Google spokesperson called it.

Picture-in-picture mode will also roll out next month, while automatic noise cancellation on Google Meet hardware is now rolling out to all users on Meet-enabled Logitech, Acer and Asus hardware.

The ability to stream to YouTube, which most companies will probably use for webinars and similar outward-facing meetings, is coming later this year.

Google also today announced a couple of updates to Spaces, but you're probably using Slack, so you can find more information about those here.



Skiff lands $10.5M to build out its end-to-end encrypted workspaces – TechCrunch

Six months after launching its end-to-end encrypted document editor, Skiff has bagged another $10.5 million in fresh funding to build out private and collaborative workspaces for its burgeoning customer base.

We wrote about Skiff last year ahead of its launch: Skiff is a web app that has much of the same document-writing and sharing capabilities as Google Docs but is built on a foundation of end-to-end encryption, so Skiff does not have access to users documents like Google does. The startup already has more than 20,000 people using its platform, leaps ahead of the 8,000 waitlisted users it had when we first spoke to the company last May.

But it's the end-to-end encryption platform that Skiff relies on that holds the keys to the company's future. Now with $10.5 million in Series A funding in the bank, Skiff's co-founders Andrew Milich and Jason Ginsberg tell TechCrunch that the company is working toward becoming the application layer for the decentralized web.

A core part of the company's efforts has been decentralization, a process that allows its users to take ownership of their data. Over the past year Skiff has partnered with Protocol Labs to offer decentralized storage, known as IPFS, or the InterPlanetary File System, which allows Skiff to encrypt users' documents and scatter them across a network of storage hosts, as well as integrating MetaMask, letting users sign in to Skiff using a portable crypto wallet instead of an email address.

"The way we look at it is Web 2.0 is really about moving information around and web3 is about moving value around," said Ginsberg, Skiff's CTO, in a call. "Data is the most valuable thing on the internet, and our goal is that you really should own your own data."

Ginsberg said the company is focused on growing its product offering, such as communication, and allowing users to share more kinds of data on its platform.

"We see hundreds of millions of people choosing privacy products not really meeting the needs of working together remotely, and so that's really where we see Skiff coming in. There's tons of different products that we could do along those lines. We're most interested right now in exploring products that not just deal with the document side of things, but also the communication side," said Ginsberg.

Milich, the startup's chief executive, said the round, led by Sequoia as a returning investor, will help the company build out those new products that also rely on end-to-end encryption, like communication. Skiff currently has a team of 15 employees dotted across the globe, Milich said. The Series A brings Skiff's total funding to about $14 million.

"Skiff is building an amazing team and visionary products to lead this moment," said Konstantine Buhler, a partner at Sequoia. "We couldn't be more excited to double down."


Encryption is key to data protection, but not all strategies look alike – Healthcare IT News

Cyber threats against healthcare organizations have been ramping up in the past few years, with highly publicized ransomware attacks leading to weeks-long network shutdowns at some institutions.

Experts warn that the situation may only worsen as bad actors become more sophisticated and as some get a boost from state-sponsored entities.

Anurag Lal, CEO of NetSfere, which provides companies with security and message-delivery capabilities, caught up with Healthcare IT News to discuss what he sees as the most pressing cyber threat, how organizations can protect themselves and how his experience as director of the U.S. National Broadband Task Force helped shape his perspective on these issues.

Q. Why are healthcare organizations particularly vulnerable to attacks?

A. Healthcare organizations are more at risk for cyber threats for a number of reasons. One, their systems are typically outdated and slower, and less secure as a result. Additionally, the pandemic accelerated the digitization of the healthcare industry, and an estimated 93% of healthcare organizations experienced some sort of data breach over the past two years.

These rushed transformation processes and outdated systems, combined with less centralized workplaces due to remote and hybrid work, create a large amount of risk for attacks.

Another reason healthcare organizations are more vulnerable is because their data is extremely valuable to hackers. Medical records and billing info create a huge target on the back of healthcare systems. Stolen health records may sell [for] up to 10 times more than credit card information on the dark web.

Q. What steps can organizations take to protect themselves?

A. Communicating efficiently and securely to protect patient and company data should remain a top priority as healthcare organizations become more digital. When deploying new communication channels, both internally between employees and with patients and providers, encryption is key.

Not all encryption is the same, though. End-to-end encryption is the gold standard when it comes to safe communications, verifying that messages are protected through every step of the process.
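To illustrate the difference (a minimal sketch assuming the PyNaCl library, not the scheme any particular vendor uses): with end-to-end encryption the keys live only on the participants' devices, so a relaying server only ever sees ciphertext.

```python
from nacl.public import Box, PrivateKey

alice_sk = PrivateKey.generate()  # sender's key pair, generated on her device
bob_sk = PrivateKey.generate()    # recipient's key pair, generated on his device

# Alice encrypts for Bob; the nonce is generated and prepended automatically.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"lab results ready")

# Anything in between (mail server, chat relay) stores only ciphertext.
print(Box(bob_sk, alice_sk.public_key).decrypt(ciphertext))  # b'lab results ready'
```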

It's also important to educate employees on the dangers of phishing scams, as the majority of security breaches are a result of human error.

Q. On a related note, how can an organization be cognizant of protecting its communications with providers and patients?

A. Similarly to protecting themselves, healthcare organizations can protect their communications with providers and patients by modernizing communication channels and ensuring compliance. Regulations like the Health Insurance Portability and Accountability Act require healthcare organizations to follow specific (and stringent) standards for Protected Health Information, including sensitive patient information like medical histories and test results.

At the end of the day, the patient and their information are the priority and should be protected as such.

Q. What actions should the federal government be taking to address this threat?

A. The government should proactively implement safeguards to protect U.S. institutions from an inevitable cyberattack attempt.

One example is encouraging organizations to require Zero Trust Security and end-to-end encryption [E2EE]. The idea behind the Zero Trust Security model is to "never trust, always verify" to protect data and intellectual property most securely. All resources are continuously authenticated, verified and authorized.

As I mentioned earlier, with E2EE, data is encrypted on the sender's system or device, and only the intended recipient is able to decrypt and read the message. Ensuring that business communication is locked down in this way applies zero-trust principles to mobile messaging and collaboration.

Q. You were director of the U.S. National Broadband Task Force under the Obama administration. How did that experience help shape your perspective on these issues?

A. During my time working on the Task Force, I saw in real time the very serious threats that exist and saw how cyberattacks affected other governments. For example, [bad actors linked to the] Russian government hacked the Ukrainian power grid, resulting in nationwide outages. Later, [they] installed malware on Ukraine's accounting software, causing billions of dollars in damages.

Q. Do you have any predictions for the next few years in the cybersecurity sector?

A. I predict that cyberattacks will become more technologically advanced, so our ability to protect organizations and governments will need to become more advanced alongside them. This is evidenced by skyrocketing cyberattacks, with 1,862 publicly reported breaches in the U.S. in 2021, up more than 68% from 2020.

Kat Jercich is senior editor of Healthcare IT News. Twitter: @kjercich. Email: kjercich@himss.org. Healthcare IT News is a HIMSS Media publication.
