
Here’s how the universe could end in a ‘false vacuum decay’ – Space.com

This is the way the world ends: not with a bang, but with a quantum vacuum decay of the ground state of the universe to its true minimum.

The universe underwent radical phase transitions in the past. These transitions eventually led to the division of the four fundamental forces of nature and the panoply of particles we know today. All of that occurred when the universe was less than a second old, and it has been stable ever since.

But it might not last forever.

Our universe: Big Bang to now in 10 easy steps

To understand the stability of the universe, first we need to talk about phase transitions. Phase transitions are when a substance undergoes a rapid, radical transformation. They happen all the time. You boil water, and it transforms from a liquid into a gas. You cool that same water, and it turns into a block of ice.

Perhaps the most exotic phase transitions are those that happen to quantum fields. Quantum fields are the fundamental building blocks of the universe. Every kind of particle (say, a photon or an electron) is really just a local manifestation of an underlying field. That field soaks all of space and time, like bread dipped in olive oil. The way those fields interact and communicate with each other makes up the forces and physics of our existence.

That existence is based on four fundamental forces: gravity, the weak force, electromagnetism and the strong force. But it hasn't always been this way. In the earliest moments of the cosmos, those forces were united. As the universe expanded and cooled, the quantum fields underwent phase transitions, splitting apart one by one.

The last phase transition occurred when the electromagnetic force split from the weak force. That splitting gave rise to the photon and the W and Z bosons, the carriers of those two forces.

Since that event, which happened when the universe wasn't even a second old, everything's been stable: no more splitting, no more phase transitions. The four forces of nature went on to shape and sculpt the evolution of the cosmos for billions of years.

As far as we can tell, it's all stable, for now anyway.

Related: Is there anything beyond the universe?

The stability of the universe is tricky to measure. Sure, it's been over 13 billion years since anything as interesting as a phase transition has occurred. Yes, 13 billion years is a really long time, but in the world of quantum fields, anything can happen.

Our best bet at probing the stability of the universe is through the mass of the Higgs boson. The Higgs is a very interesting field; its presence in the universe is what separated the electromagnetic force from the weak force and what maintains that split today. Without the Higgs boson, those forces would merge right back together.

In quantum physics, the more massive an entity is, the more unstable it is. Massive particles quickly decay into lighter ones, for example. So, if the Higgs is very massive, it might not be as stable as it seems, and it might decay into something else someday. But if the Higgs is light enough, it's likely to hang out forever, and there's nothing more to say about the future of the quantum fields of the universe.

Measurements of the Higgs have found that its mass puts the universe smack in between the "really, honestly stable" and "Oh no, it looks a little unstable" regimes. Physicists call this state "metastable": a situation that is stable for now but could quickly deteriorate if something were to go wrong.

The apparent metastability of the quantum fields of the universe is a little unsettling. Although it could mean that the universe could persist for billions, even trillions, of years without anything going wrong at all, it could also mean that the universe is already beginning to transform. All it would take is one little shake in the wrong direction, in some random patch of the universe, where the Higgs falls apart and the underlying quantum fields find a new, more stable configuration. That region of "new" universe would then propagate outward at nearly the speed of light through the "old" universe.

This kind of phase transition is called a false vacuum decay. It references the idea that the vacuum of our universe is a false one: it's not as stable as it might appear, and it will someday decay into something new.

By the time we received any information that the phase transition was upon us, it would already be happening.

What would be on the other side of that new universe? It's impossible to say. It might be totally mundane, with the new quantum fields looking exactly like the old quantum fields and nothing amiss. It could be just a slight adjustment, like a little tuning to the nature of dark energy or a slight adjustment to the masses of neutrinos. Or, it could be radically different, with a universe filled with brand-new forces, fields and particles, which would make life (and chemistry and atoms) as we know it impossible.

Of course, we're not even 100% sure about the metastability criterion. We know that the Standard Model of particle physics is incomplete. A complete version could rewrite our understanding of quantum fields and where the "stable-unstable" line is drawn.

Learn more by listening to the "Ask A Spaceman" podcast, available on iTunes and askaspaceman.com. Ask your own question on Twitter using #AskASpaceman or by following Paul @PaulMattSutter and facebook.com/PaulMattSutter.


Why Neuroscientist Solms Is No Materialist: Information Theory – Walter Bradley Center for Natural and Artificial Intelligence

Arjuna, the host of the Theology Unleashed broadcast with South African neuropsychologist Mark Solms and Stony Brook neurosurgeon Michael Egnor on the mind vs. the brain (October 22, 2021), begins this portion by offering a Hindu (Hare Krishna) perspective on the whole question of mind vs. matter, and he finds considerable common ground with the other two non-materialists! The true implications of quantum mechanics and information theory in refuting materialism are only beginning to be understood.

Summary to date: In the first portion, Solms, author of The Hidden Spring (2021), began by asserting in his opening statement that the source of consciousness in the brain is in fact in the brain stem, not the cerebral cortex, as is almost universally assumed. Dr. Egnor then responded that his clinical experience supports the view that brain is not mind.

Then Solms pointed to the reality that discussing the fact that the brain is not the mind can be a career-limiting move in neuroscience, even though clinical experience supports the view. Egnor and Solms agreed that the further a neuroscientist gets from actual patients, the easier it is to adopt the view that the mind is just what the brain does (naturalism). Solms, who trained as a psychoanalyst as well, then described how he understands consciousness: the capacity to feel things, for example, the redness of red (qualia). Talk then turned to the miraculous nature of life and Spinoza's God, with Solms saying that he believes in Spinoza's God, as did Albert Einstein. Egnor then explained why Christians see God as a Person: The most remarkable thing about us is personhood.

This portion begins at 01:24:30. A partial transcript and notes, plus summaries and links to date follow.

Arjuna: We talked about what consciousness is. Maybe we can talk about what matter is. There's this idea that, "Oh, it's just matter. Matter explains that."

In Krishna Consciousness, we talk about God having inconceivable potencies. The materialist scientists attribute inconceivable potency to matter. They think matter has all these magical powers, like it can produce consciousness … It's as if that answers the question and there's no further questions to be asked. [01:25:00]

Mark Solms: I must be careful not to exceed my credentials. I'm not a physicist. But even as a non-physicist, I can say that it is astonishing that this is such a widespread view. It links with what Michael was saying earlier about scientists having very poor metaphysics and not even realizing that they're starting from metaphysical assumptions of any kind, let alone unquestionable ones. [01:26:00]

Even I know that it's really been a long time now in physics that the idea that matter is a fundamental concept has been transcended. I mean, Einstein's famous equation, E equals mc squared, makes the point that matter is derivative. It's a state of energy.

This naive idea that the fundamental stuff is matter … 100 years ago we realized in physics that that's not true. I think the next really big development, beyond relativity and the basic insights of quantum physics that Michael was referring to, has been Shannon's insight about information. [01:27:00]

Information, in neuroscience, is a crucial concept, and it's very hard to think about quantum physics and the big questions that are unsolved that flow from it without the concept of information, which, I hasten to draw your attention to the fact, is not matter. I'm not a materialist for exactly that reason. [01:27:30]

I don't believe that the mind can be reduced to matter. Matter is an appearance. If you're wanting to make connections between mind and body and see them both as appearances, then you can't be a materialist. We must always remember, as I keep saying, that these are concepts. These are abstractions. These are inferences. These are words that we use to try to articulate these profound things. I think that, among those tools, the concept of information, in the sense that Shannon introduced it into physics in 1948, has not yet begun to … The implications, the importance, the value of this concept has not begun to fully reveal itself. [01:28:30]

Note: Who was Claude Shannon (1916–2001)? The American mathematician and computer scientist who conceived and laid the foundations for information theory. His theories laid the groundwork for the electronic communications networks that now lace the earth. "Shannon was the person who saw that the binary digit was the fundamental element in all of communication," said Dr. Robert G. Gallager, a professor of electrical engineering who worked with Dr. Shannon at the Massachusetts Institute of Technology. "That was really his discovery, and from it the whole communications revolution has sprung." – IEEE Information Theory Society. The binary digit is a mathematical concept, not a material thing.

Michael Egnor: The information concept dovetails very nicely with what a number of philosophers of science have pointed out, mainly philosopher of science Bruce Gordon … He points out that when you look at the quantum world, matter doesn't exist. Nothing in the quantum world is matter … [01:29:30]

I think the way where we went wrong in this was with Descartes and his notion of everything in nature as a machine extended in space, except for the spirit, the human mind, which is this kind of ghost thing.

Note: French mathematician René Descartes was famous for that view. But it came back to haunt us all, so to speak, when Gilbert Ryle (1900–1976) ridiculed the "ghost in the machine," helping to establish the materialist dogma that the mind is simply what the brain does. The concept became the title of a famous book by Arthur Koestler, unpacking that view.

Mark Solms: When you read Shannon's paper … Again, as I say, I always find it very valuable to go back and actually read what my forebears wrote. The title of his [1948] paper is "A Mathematical Theory of Communication": not of information but communication …

Well, let me just cut to the chase. What we forget is that information doesn't exist without there being a question that the information is an answer to. And this perhaps goes back to what you were saying earlier about personhood and some of the other profound matters that we were touching on. Then the issue becomes more: Where does question-asking come from?

Next: Reclaiming the non-materialist dimension in science (Hint: Stephen Hawking was a fine writer but not a very good philosopher …)

The discussion to date

Here's the first portion of the debate/discussion, where neuropsychologist Mark Solms shares his perspective: Consciousness: Is it in the cerebral cortex or the brain stem? In a recent discussion/debate with neurosurgeon Michael Egnor, neuropsychologist Mark Solms offers an unconventional but evidence-based view, favouring the brain stem. The evidence shows, says Mark Solms, author of The Hidden Spring, that the brain stem, not the cerebral cortex, is the source of consciousness.

And Michael Egnor responds:

1.2. Neurosurgeon and neuropsychologist agree: Brain is not mind. Michael Egnor tells Mark Solms: Neuroscience didn't help him understand people; quite the reverse, he had to understand people, and minds, to make sense of neuroscience. Egnor saw patients who didn't have most of their frontal lobes who were completely conscious; in fact, rather pleasant, bright people.

1.3. Then Solms admits what all know but few say: Neuroscientist: Mind is not just brain? That's career limiting! Neuropsychologist Mark Solms and neurosurgeon Michael Egnor agreed that clinical experience supports a non-materialist view but that the establishment doesn't. Mark Solms: "Science is an incredibly rigid sort of … it's like a mafia. You have to go along with the rules of the Don, otherwise you've had it."

In the second portion, they offer definitions of consciousness:

2.1 Materialist neuroscientists don't usually see real patients. Neurosurgeon Michael Egnor and neuropsychologist Mark Solms find common ground: The mind can be merely what the brain does in an academic paper. But not in life. Egnor takes a stab at defining consciousness: Following Franz Brentano, he says, "A conscious state is an intentional state." Next, it will be Solms's turn.

2.2 A neuropsychologist takes a crack at defining consciousness. Frustrated by reprimands for discussing Big Questions in neuroscience, Mark Solms decided to train as a psychoanalyst as well. As a neuropsychologist, he sees consciousness, in part, as the capacity to feel things, what philosophers call qualia: the redness of red.

3.1 Einstein believed in Spinoza's God. Who is that God? Neuropsychologist Mark Solms admits that life is miraculous and sees Spinoza's God, embedded in nature, as the ultimate explanation. In a discussion with Solms, neurosurgeon Michael Egnor argues that it makes more sense to see God as a Person than as a personification of nature.

3.2 Egnor and Solms: What does it mean to say God is a Person? Mark Solms and Michael Egnor discuss and largely agree on what we can rationally know about God, using the tools of reason. Egnor argues that, if the most remarkable thing about us is our personhood ("I am"), it makes sense to think of God as a Person ("I AM").

You may also wish to read: Your mind vs. your brain: Ten things to know


Counting the Risks for IonQ and Quantum Computing Technology – InvestorPlace

"Investing is like a box of chocolates, you never know what you're gonna get" is a riff on a top line from the movie Forrest Gump. The actual phrase was about life, but investing is a part of life. IonQ (NYSE:IONQ) went public via a special purpose acquisition company (SPAC) deal with dMY Technology Group. It is the first publicly traded quantum computing firm, and its goals are high, but its risks might be higher.

2021 has been a year full of trends for investors. There were all kinds of excitement and drama, as well as new questions posed as to what matters in investing, such as the power of retail investors to move stocks, defying the fundamentals and causing new trends to emerge. One such new cutting-edge trend in the investing world is quantum computing. IonQ is developing and commercializing this innovative technology, and it claims that this is the future.

So, what should you know about IONQ stock now?

This first publicly traded purely quantum computing company has a mission to achieve significant things, from building the world's best quantum computers to solving the world's most complex problems. Its quantum computing systems can solve complicated problems, using quantum physics, much faster than classical computers.

First, a small bit of quantum computing 101. Quantum computing uses qubits (quantum bits) rather than classical bits, where each bit in classical computing has a value of either zero or one. A qubit can be in a state of zero, one, or a superposition (linear combination) of the two. This superposition allows quantum computers to bypass many of the traditional limits seen in classical computers, which in turn gives them significant advantages over their classical counterparts.
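To make that concrete, here is a minimal sketch in Python (plain NumPy linear algebra, not any vendor's quantum SDK and not a model of IonQ's trapped-ion hardware) of the difference between a bit and a qubit; the equal-superposition state and the random readouts are illustrative assumptions:

```python
import numpy as np

# A classical bit holds exactly one of two values at any moment.
classical_bit = 0

# A qubit is described by two complex amplitudes (alpha, beta),
# with |alpha|^2 + |beta|^2 = 1.  This one is an equal superposition
# of "zero" and "one".
qubit = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)

# Reading a qubit out collapses the superposition: you get 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.
probabilities = np.abs(qubit) ** 2
readouts = np.random.choice([0, 1], size=10, p=probabilities)

print(probabilities)  # [0.5 0.5]
print(readouts)       # e.g. [0 1 1 0 1 0 0 1 1 0] -- random each run
```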

There are plenty of applications where quantum computing can be helpful: cybersecurity, financial modeling, artificial intelligence, medical research, even predicting the weather. There is just a lot of potential utility in quantum computing development.

This potential utility, though, comes with severe risks and challenges.

Quantum computers have several notable problems. As with any revolutionary technology, there are specific challenges and hurdles until its infancy period passes.

The main problems for quantum computers are both structural and operational. They can be simply summarized: these machines are hard to engineer, construct and program to run without any significant errors.

Another big problem currently for quantum computers is their price: they are far more expensive than the classical computers that people and businesses use for performing various tasks.

However, the biggest problem faced by quantum computers is a loss of coherence when making quantum calculations (aka decoherence). Decoherence means that these highly advanced computers are subject to factors such as vibrations, the temperature of their working environment, and any electromagnetic waves, making them operate less efficiently than they are designed to.

Simply speaking, they're extremely delicate.

Thus, the cost of competitiveness and innovation is incredibly high, and at this moment, it makes IONQ stock difficult to evaluate.

The market a company operates in and its sector dynamics are of paramount importance for its financial performance and business excellence. The quantum computing market is expected to grow to $8.6 billion in 2027, from only $412 million in 2020. That is a compound annual growth rate of 50.9% over that period.

IonQ defines itself as a leader in quantum computing. However, there are several factors that raise concerns about its stock now. There are at least four main fundamental worries I want to mention.

To start, IonQ in its third-quarter 2021 financial results reported revenue of a measly $451,000 year-to-date. For a company with a market capitalization of $4 billion as of Dec. 9, the revenue reported is not only meaningless but also reinforces how expensive IONQ stock is now.

Second, I consider the total contract value (TCV) bookings of $15.1 million year-to-date, also not substantial.

Third, IonQ is an unprofitable company with widening losses. For the nine months ended Sept. 30, 2021, IonQ reported a net loss of $32,102,000, whereas for the same period of 2020 the net loss was about a third of that, $10,493,000.

Fourth, IonQ's research and development, sales, and marketing costs rose significantly in Q3 2021 and as a result, the loss from operations was wider than in Q2 2020. The company's operational losses in Q3 2021 were $10,524,000, compared to a loss of $3,576,000 in Q2 2020.

I see no significant revenue, and yet IONQ stock has an incredibly high market capitalization. With net losses widening and operating expenses increasing, this quantum computing firm has not yet made its commercialization effective.

IONQ stock is a very high-risk play right now. IonQ is a firm with a technology that may seem revolutionary, but the expectations for high growth are not supported by a strong case based on its fundamentals.

On the date of publication, Stavros Georgiadis, CFA did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Stavros Georgiadis is a CFA charter holder, an Equity Research Analyst, and an Economist. He focuses on U.S. stocks and has his own stock market blog at thestockmarketontheinternet.com/. He has written various articles in the past for other publications and can be reached on Twitter and on LinkedIn.


A career built on the strongest force in the universe – EurekAlert

Image: Latifa Elouadrhiri (Credit: DOE's Jefferson Lab)

Latifa Elouadrhiri has spent her career pursuing a passion for experimental physics, investing nearly three decades in work at the Department of Energy's Thomas Jefferson National Accelerator Facility. She has also devoted herself to passing on her love of science to other women and underrepresented groups, including through conferences that encourage undergraduate women to pursue physics degrees and careers.

Such efforts and her numerous professional successes haven't gone unnoticed. Elouadrhiri was just presented with the 2021 Jesse W. Beams Research Award, which recognizes especially significant or meritorious research in physics that has earned the critical acclaim of peers from around the world. The award was established by the Southeastern Section of the American Physical Society (SESAPS) in 1973. Elouadrhiri is only the second woman to receive it.

"This is just a great honor for me and for the science we do," Elouadrhiri said. "Not just me: this award is also a recognition of the team of scientists, including the technical staff and the students, that started at Christopher Newport University and continues at Jefferson Lab."

Elouadrhiri first arrived at the lab in 1994 in a joint position with CNU. She joined the experimental hall staff in 2001, and today is senior staff scientist in Hall B.

Elouadrhiri and other experimentalists use the lab's powerful Continuous Electron Beam Accelerator Facility (CEBAF) to probe ever deeper into the proton that sits inside the atomic nucleus. CEBAF is a DOE user facility built to support research in nuclear physics.

In 2018, the CEBAF completed an upgrade that doubled its top design energy to 12 billion electron-volts, or 12 GeV, providing unequaled access to the mysterious elements of subatomic matter. The upgrade also enabled the experimental program in Hall B to be restructured with a novel detector called CLAS12. Elouadrhiri oversaw the full lifecycle of CLAS12 construction and commissioning.

That same year, Elouadrhiri and her team were lauded for achieving the first measurement of the pressure distribution inside the proton: a finding that the quarks that make up the proton are subject to a crushing pressure 10 times that in the heart of a neutron star. Their results were published in the journal Nature and opened up an entirely new direction of exploration in nuclear and particle physics.

In announcing the Beams award, the SESAPS selection committee cited Elouadrhiri's "fundamental and lasting contributions to the development of experimental equipment in forefront nuclear science."

I followed my heart

Elouadrhiri's journey to Jefferson Lab was an unlikely one. She was born in Morocco, the sixth of eight children, to a mother who could neither read nor write but who believed in the power of education.

"She had never been to school, but she had a vision," said Elouadrhiri. "She could see far into the future and created the right environment for us, making education, particularly for women, central. She understood the importance of educating girls and finding our way, and supported us in anything we did."

Of the eight siblings, seven would go on to college and to such careers as diplomat, physician, college professor, computer engineer, artist and economist.

Elouadrhiri took her first physics class in high school and was "just fascinated by the topic. It combines mathematics, science and also some philosophy: how the world works."

At age 15, at a local flea market, she acquired her first physics book: a work by Werner Heisenberg, the German physicist and 1932 Nobel laureate responsible for the eponymous Heisenberg uncertainty principle of quantum mechanics.

"And I was hooked," Elouadrhiri said. Since then, she said, that book travels with her everywhere.

She earned her undergraduate degree and then a master's in theoretical physics at Mohammed V University of Rabat. She moved to France to continue her studies toward a Ph.D., conducting experiments at the Saclay Nuclear Research Centre and also the Paul Scherrer Institute in Switzerland. She was accepted at the University of Massachusetts Amherst for her first postdoctoral position.

It was during an American Physical Society meeting that her work caught the attention of Nathan Isgur, then chief scientist at Jefferson Lab when it was still known simply as CEBAF. Isgur invited her to give a seminar on her research, then suggested she apply for the joint JLab/CNU position.

She was offered that position at the same time another offer for a permanent position came from the prestigious French National Centre for Scientific Research (CNRS).

"I just followed my heart," Elouadrhiri said. "My heart told me that I should stay here."

The Beams award, she said, is "a big recognition for the science that we do. It really inspires me and motivates me to further develop experimental techniques toward understanding the way protons and neutrons, which are the building blocks of all atomic nuclei, are held together by the strong force. And with the CLAS12 science program we will be building a deeper understanding of these forces."

"Personally, this award now helps in sharing my love of scientific learning with women throughout the world, and also continuing my work in broadening scientific participation across genders, ethnicities, religions, cultures and geographies. I'm very excited."

Further Reading:
Hall B Staff Bios - Latifa Elouadrhiri
Moroccan Physicist Latifa Elouadrhiri Makes Ground-Breaking Nuclear Physics Discovery
W&M, Jefferson Lab host conference to support women undergrads in physics
Quarks Feel the Pressure in the Proton

By Tamara Dietrich

-end-

Jefferson Science Associates, LLC, operates the Thomas Jefferson National Accelerator Facility, or Jefferson Lab, for the U.S. Department of Energy's Office of Science.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.


The future of scientific research is quantum – The Next Web

Over the past few years, the capabilities of quantum computers have reached the stage where they can be used to pursue research with widespread technological impact. Through their research, the Q4Q team at the University of Southern California, University of North Texas, and Central Michigan University explores how software and algorithms designed for the latest quantum computing technologies can be adapted to suit the needs of applied sciences. In a collaborative project, the Q4Q team sets out a roadmap for bringing accessible, user-friendly quantum computing into fields ranging from materials science to pharmaceutical drug development.

Since it first emerged in the 1980s, the field of quantum computing has promised to transform the ways in which we process information. The technology is centered on the fact that quantum particles such as electrons exist in superpositions of states. Quantum mechanics also dictates that particles will only collapse into one single measurable state when observed by a user. By harnessing these unique properties, physicists discovered that batches of quantum particles can act as more advanced counterparts to conventional binary bits which only exist in one of two possible states (on or off) at a given time.

On classical computers, we write and process information in a binary form. Namely, the basic unit of information is a bit, which takes on the logical binary values 0 or 1. Similarly, quantum bits (also known as qubits) are the native information carriers on quantum computers. Much like bits, we read binary outcomes of qubits, that is 0 or 1 for each qubit.

However, in stark contrast to bits, we can encode information on a qubit in the form of a superposition of the logical values 0 and 1. This means that we can encode much more information in a qubit than in a bit. In addition, when we have a collection of qubits, the principle of superposition leads to computational states that can encode correlations among the qubits, which are stronger than any type of correlation achieved within a collection of bits. Superposition and strong quantum correlations are, arguably, the foundations on which quantum computers rely to provide faster processing speeds than their classical counterparts.

To realize computations, qubit states can be manipulated with quantum logic gates, which perform operations on qubits, transforming the input state according to a programmed algorithm. This is a paradigm for quantum computation analogous to that of conventional computers. In 1998, both qubits and quantum logic gates were realized experimentally for the first time, bringing the previously theoretical concept of quantum computing into the real world.
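As a rough illustration of how gates act on qubit states, the sketch below (again plain NumPy linear algebra, not any particular quantum programming framework) applies a Hadamard gate to put one qubit into superposition and a CNOT gate to correlate it with a second qubit, producing the kind of strong correlation described above:

```python
import numpy as np

# Single-qubit basis states |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Two standard gates: Hadamard (creates superposition) and CNOT (entangles).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start both qubits in |0>, apply H to the first, then CNOT to the pair.
state = np.kron(H @ ket0, ket0)   # (|00> + |10>) / sqrt(2)
state = CNOT @ state              # (|00> + |11>) / sqrt(2), a Bell state

# Measurement probabilities for outcomes 00, 01, 10, 11:
print(np.round(np.abs(state) ** 2, 3))   # [0.5 0.  0.  0.5]
# The two qubits always agree: a correlation stronger than anything a pair
# of independently prepared classical bits can reproduce.
```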

From this basis, researchers then began to develop new software and algorithms, specially designed for operations using qubits. At the time, however, the widespread adoption of these techniques in everyday applications still seemed a long way off. The heart of the issue lay in the errors that are inevitably introduced to quantum systems by their surrounding environments. If uncorrected, these errors can cause qubits to lose their quantum information, rendering computations completely useless. Many studies at the time aimed to develop ways to correct these errors, but the processes they came up with were invariably costly and time-consuming.

Unfortunately, the risk of introducing errors to quantum computations increases drastically as more qubits are added to a system. For over a decade after the initial experimental realization of qubits and quantum logic gates, this meant that quantum computers showed little promise in rivalling the capabilities of their conventional counterparts.

In addition, quantum computing was largely limited to specialized research labs, meaning that many research groups that could have benefited from the technology were unable to access it.

While error correction remains a hurdle, the technology has since moved beyond specialized research labs, becoming accessible to more users. This occurred for the first time in 2011, when the first quantum annealer was commercialized. With this event, feasible routes emerged towards reliable quantum processors containing thousands of qubits capable of useful computations.

Quantum annealing is an advanced technique for obtaining optimal solutions to complex mathematical problems. It is a quantum computation paradigm alternative to operating on qubits with quantum logic gates.

The availability of commercial quantum annealers spurred a new surge of interest in quantum computing, with consequent technological progress, especially fueled by industrial capital. In 2016, this culminated in the development of a new cloud system based on quantum logic gates, which enabled owners and users of quantum computers around the world to pool their resources together, expanding the use of the devices outside of specialized research labs. Before long, the widespread use of quantum software and algorithms for specific research scenarios began to look increasingly realistic.

At the time, however, the technology still required high levels of expertise to operate. Without specific knowledge of the quantum processes involved, researchers in fields such as biology, chemistry, materials science, and drug development could not make full use of them. Further progress would be needed before the advantages of quantum computing could be widely applied outside the field of quantum mechanics itself.

Now, the Q4Q team aims to build on these previous advances using user-friendly quantum algorithms and software packages to realize quantum simulations of physical systems. Where the deeply complex properties of these systems are incredibly difficult to recreate within conventional computers, there is now hope that this could be achieved using large systems of qubits.

To recreate the technologies that could realistically become widely available in the near future, the team's experiments will incorporate noisy intermediate-scale quantum (NISQ) devices, which contain relatively large numbers of qubits and are, by themselves, prone to environmental errors.

In their projects, the Q4Q team identifies three particular aspects of molecules and solid materials that could be better explored through the techniques they aim to develop. The first of these concerns the band structures of solids, which describe the range of energy levels that electrons can occupy within a solid, as well as the energies they are forbidden from possessing.

Secondly, they aim to describe the vibrations and electronic properties of individual molecules, each of which can heavily influence their physical properties. Finally, the researchers will explore how certain aspects of quantum annealing can be exploited to realize machine-learning algorithms, which automatically improve through their experience of processing data.

As they apply these techniques, the Q4Q team predicts that their findings will lead to a better knowledge of the quantum properties of both molecules and solid materials. In particular, they hope to provide better descriptions of periodic solids, whose constituent atoms are arranged in reliably repeating patterns.

Previously, researchers struggled to reproduce the wavefunctions of interacting quantum particles within these materials, which relate to the probability of finding the particles in particular positions when observed by a user. Through their techniques, the Q4Q team aims to reduce the number of qubits required to capture these wavefunctions, leading to more realistic quantum simulations of the solid materials.

Elsewhere, the Q4Q team will account for the often deeply complex quantum properties of individual molecules made up of large groups of atoms. During chemical reactions, any changes taking place within these molecules will be strongly driven by quantum processes, which are still poorly understood. By developing plugins to existing quantum software, the team hopes to accurately recreate this quantum chemistry in simulated reactions.

If they are successful in reaching these goals, the results of their work could open up many new avenues of research within a diverse array of fields especially where the effects of quantum mechanics have not yet been widely considered. In particular, they will also contribute to identifying bottlenecks of current quantum processing units, which will aid the design of better quantum computers.

Perhaps most generally, the Q4Q team hopes that their techniques will enable researchers to better understand how matter responds to external perturbations, such as lasers and other light sources.

Elsewhere, widely accessible quantum software could become immensely useful in the design of new pharmaceutical drugs, as well as new fertilizers. By ascertaining how reactions between organic and biological molecules unfold within simulations, researchers could engineer molecular structures that are specifically tailored to treating certain medical conditions.

The ability to simulate these reactions could also lead to new advances in the field of biology as a whole, where processes involving large, deeply complex molecules, including proteins and nucleic acids, are critical to the function of every living organism.

Finally, a better knowledge of the vibrational and electronic properties of periodic solids could transform the field of materials physics. By precisely engineering structures to display certain physical properties on macroscopic scales, researchers could tailor new materials with a vast array of desirable characteristics: including durability, advanced interaction with light, and environmental sustainability.

If the impacts of the team's proposed research goals are as transformative as they hope, researchers in many different fields of technological endeavor could soon be working with quantum technologies.

Such a clear shift away from traditional research practices could in turn create many new jobs, with required skillsets including the use of cutting-edge quantum software and algorithms. Therefore, a key element of the team's activity is to develop new strategies for training future generations of researchers. Members of the Q4Q team believe that this will present some of the clearest routes yet towards the widespread application of quantum computing in our everyday lives.

This article was authored by the Q4Q team, consisting of lead investigator Rosa Di Felice, Anna Krylov, Marco Fornari, Marco Buongiorno Nardelli, Itay Hen and Amir Kalev, in Scientia. Learn more about the team, and find the original article here.


When science mixes with politics, all we get is politics – Big Think

"To be ignorant of causes is to be frustrated in action." So wrote Francis Bacon, counsel to Queen Elizabeth I of England and key architect of the scientific method. In other words, do not base your actions on ignorance, or your actions will fail and cause damage. Bacon proposed that careful observation of natural phenomena, combined with experimentation and data collection and analysis, could be used to obtain knowledge of the mechanisms of nature. His method, known as the inductive method, looked at particulars (e.g., observations) to achieve the general (e.g., laws). When King Charles II founded the Royal Society in 1660, Bacon's ideas were taken as the guiding principles of natural philosophy (the old name for science).

The method is not foolproof. No scientific theory based on the inductive method can be equated with the final truth on a subject. But, and this is an enormous but, the method is incredibly efficient at gathering evidence that is then used to formulate general principles that describe the operations of the natural world. Once vetted by the scientific community, scientific knowledge is the only way to develop technological applications that will serve society, from antibiotics and vaccines to cell phones and electric cars.

The only reason you step into an airplane with confidence is because, knowing it or not, you trust science. You trust the hydrodynamics used to design wings, you trust the chemical physics of combustion, and you trust the guidance system, an incredibly complex system that involves radar, GPS, intricate electromagnetic circuitry, and even the Theory of Relativity, to achieve amazing levels of precision navigation. You trust the expert, the pilot, who has training in the operation of the airplane and its instrumentation.

The paradox of our age is that although we live in a world that depends in essential ways on science and its technological applications, the credibility of science and of scientists is being questioned by people with no expertise whatsoever in science or how it works. This is not just about silly attacks on social media. It is about questioning knowledge that is painstakingly obtained by years of hard work and study, only to superficially decide that this knowledge is wrong or, worse, manipulative. How did we get ourselves into this mess?

After the Second World War, scientists enjoyed an all-time high in public perception. The technological inventions that decided the outcome of the war depended heavily on cutting-edge science: quantum and nuclear physics, radar, computers and code-breaking, effective explosives, aeronautical technology, faster planes and ships, and deeper-diving submarines. The list goes on. There was an intensified alliance between science and the State, which has been present in Western history since Greek times: think of Archimedes and his catapults and fire-inducing mirrors, applied to protect Syracuse from Roman invaders.

The Cold War amplified this prestige, and defense support has sustained a large part of the scientific research budget. There was also an understanding that basic science is the cornerstone of technological innovation, so that even more abstract topics were worthy of funding.

As science advanced, it also became more technical, complicated, and arcane, moving farther away from general understanding. Quantum physics, genetics, biochemistry, AI, and machine learning are all part of our everyday life, even if few know much about any of these fields. Even the experts are siloed inside their research areas. Specialization is how new knowledge is produced, given the enormous amount of detail within each subfield. An astrophysicist who specialized in black holes knows practically nothing about the physics of graphene or quantum optics. Specialization has a dual role: It strengthens its own subfield but weakens the global understanding of a question. Specialization makes it harder for scientists to be a public voice for their fields in ways that are engaging to the general public.

To complicate things, the relationship between science and society changed. Beginning roughly in the 1960s, scientists started to use their findings to caution people and governments about the dangers of certain products or of unchecked industrialization and population growth. Cigarettes are bad for you. There will be a shortage of energy and water as more and more humans fill up the world. Climate change is going to create hell on Earth. Plastics are evil. Pollution of waterways, oceans, and the atmosphere will make people sick, kill animals, and destroy natural resources. Meanwhile, we, as a species (even if we claim to be the most intelligent on this planet), cannot act collectively to change what we are doing to our own environment.

These discoveries (some of them predating the 1960s by decades) were inconvenient to many. They were inconvenient to the tobacco industry, the auto industry, the fossil fuel industry, and the chemical industry. So, scientists, the darlings of the 1950s, became the harbingers of annoying news, threatening people's way of life and the profitability of large sectors of the economy. They had to be stopped!

Scientists sounded the alarm, denouncing how the tobacco and fossil fuel industries developed a corrosive strategy to undermine science's credibility, attacking scientists as opportunists and manipulators. Politicians aligned with these industries jumped in, and a campaign to politicize science took over the headlines. Scientific knowledge became a matter of opinion, something that Francis Bacon fought against almost 400 years ago. The media helped, often giving equal weight to the opinion of the vast majority of scientists and to the opinion of a small contrarian group, confusing the general public to no end. The growth of social media compounded the damage, as individuals with no or little scientific training jumped in ready to make a name for themselves as defenders of freedom and liberty, conflating lies with the American ideal of individual freedom.

The results, not surprisingly, have been catastrophic. From Flat-Earthers to antivaxxers to climate deniers, scientific authority and knowledge became a free-for-all, a matter of individual opinion aligned with political views, often sponsored by corporate interest groups and opportunist politicians.

To get out of this mess will take a tremendous amount of work, especially from the scientific community, the media, and educators. Science needs more popular voices, people that have a gift to explain to the general public how and why science works. Scientists need to visit more schools and talk to the children about what they do. Educators need to reenergize the science curriculum to reflect the realities of our world, inviting more scientists to visit classes and telling more stories about scientists that are engaging to students. This humanizes science in the process.

Historians often say that history swings back and forth like a pendulum. Let's make sure that we do not allow the pendulum of scientific knowledge to swing back to the obscurantism of centuries past, when the few with power and means controlled the vast majority of the population by keeping them in ignorance and manipulating them with fear.


I wrote the book on warp drive. No, we didn’t accidentally create a warp bubble. – Big Think

In perhaps his most famous quip of all time, celebrated physicist Richard Feynman once remarked, when speaking about new discoveries, "The first principle is that you must not fool yourself, and you are the easiest person to fool." When you do science yourself, engaging in the process of research and inquiry, there are many ways you can become your own worst enemy. If you're the one proposing a new idea, you must avoid falling into the trap of becoming enamored with it; if you do, you run the risk of choosing to emphasize only the results that support it, while discounting the evidence that contradicts or refutes it.

Similarly, if you're an experimenter or observer who's become enamored with a particular explanation or interpretation of the data, you have to fight against your own biases concerning what you expect (or, worse, hope) the outcome of your labors will indicate. As the more familiar refrain goes, "If the only tool you have is a hammer, you tend to see every problem as a nail." It's part of why we demand, as part of the scientific process, independent, robust confirmation of every result, as well as the scrutiny of our scientific peers to ensure we're all doing our research properly and interpreting our results correctly.

Recently, former NASA engineer Harold "Sonny" White, famous (or infamous) for his previous dubious claims about physics-violating engines, has made a big splash, claiming to have created a real-life warp bubble: an essential step toward creating an actual warp drive, as made famous by Star Trek. But is this claim correct? Let's take a look.

Warp drive started off as a speculative idea. Rather than being bound by the limits of special relativity (where massive objects can only approach, but can never reach or exceed, the speed of light), warp drive recognized the novel possibility brought about by general relativity: where the fabric of space is curved. In special relativity, we treat space as being indistinguishable from flat, which is an excellent approximation almost everywhere in the Universe. Only near extremely dense and massive objects do the effects of curved space typically become important. But if you can manipulate the matter and energy in the Universe properly, it's possible to cause space to curve in intricate, counterintuitive ways.

Just as you could take a flat sheet of paper and fold it, it should be possible, with enough matter and energy in the right configuration, to warp the fabric of space between any two points. If you warp space properly, the reasoning goes, you could potentially shorten the amount of space you need to traverse between any two points; all you'd need is the right amount of energy configured in the right way. For a long time, the theoretical solutions that shortened the journey from one point to another were limited to ideas like wormholes, Einstein-Rosen bridges, and black holes that connected to white holes at the other end. In all of these cases, however, there was an immediate problem: Any spacecraft traveling through these mechanisms would violently be torn apart by the irresistible gravitational forces.

But all of this changed in 1994, when physicist Miguel Alcubierre put forth a paper that showed how warp drive could be physically possible. Alcubierre recognized that the presence of matter and/or energy always led to positive spatial curvature, like the heavily curved space just outside a black hole's event horizon. However, negative spatial curvature would also be possible if, instead of matter and/or energy, we had some sort of negative-mass matter or negative energy. By playing around with these two ingredients, instead of just the usual one, Alcubierre stumbled upon an idea that was truly brilliant.

By manipulating large amounts of both positive and negative energy, Alcubierre showed how, without wormholes, a spaceship could travel through the fabric of space at an arbitrarily large speed: unbounded by the speed of light. The way this would work is that both types of energy, positive and negative, would be present in equal quantities, compressing the space in front of the spacecraft while simultaneously rarefying the space behind it by an equal amount. Meanwhile, the spacecraft itself would be encased in a warp bubble where space was indistinguishable from flat on the interior. This way, as the spacecraft and the bubble moved together, they would travel through the compressed space, shortening the journey.
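For reference, the geometry Alcubierre wrote down in 1994 takes roughly the following form (in units where c = 1); the shape function f(r_s) is close to 1 inside the bubble and falls to 0 far outside it, which is what keeps the interior flat while space is compressed ahead of the bubble and expanded behind it:

```latex
ds^2 = -dt^2 + \bigl[\,dx - v_s(t)\, f(r_s)\, dt\,\bigr]^2 + dy^2 + dz^2 ,
```

where x_s(t) is the trajectory of the bubble's center, v_s(t) = dx_s/dt is its speed, and r_s is the distance from that center.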

One way to envision this is to imagine we wanted to travel to the TRAPPIST-1 system: a stellar system with a red dwarf star, containing at least seven Earth-sized planets in orbit around it. While the innermost planets are likely to be too hot, akin to Mercury, and the outermost planets are likely frozen over like Pluto, Triton, or Enceladus, some of the intermediate planets might yet be just right for habitability, and may possibly even be inhabited. The TRAPPIST-1 system is approximately 40 light-years away.

Without warp drive, you'd be limited by special relativity, which describes your motion through the fabric of space. If you traveled quickly enough, at, say, 99.992% the speed of light, you could make the journey to TRAPPIST-1 in just six months, from your perspective. If you looked around, assessed the planet, and then turned around and came home at precisely the same speed, 99.992% the speed of light, it would take you another six months to return. Those individuals aboard the spacecraft would experience only one year of time's passage, but back here at home, everyone else would have experienced the passage of 81 years.
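As a sanity check on those figures, here is a short back-of-the-envelope calculation using only special relativity (a sketch; the article's numbers appear to be rounded, and any time spent at the destination is ignored):

```python
import math

v = 0.99992            # cruise speed as a fraction of the speed of light
distance_ly = 40.0     # Earth to TRAPPIST-1, in light-years

gamma = 1.0 / math.sqrt(1.0 - v ** 2)              # time-dilation factor, ~79

earth_years_one_way = distance_ly / v              # time elapsed on Earth
ship_years_one_way = earth_years_one_way / gamma   # time elapsed on board

print(f"gamma: {gamma:.1f}")
print(f"Earth time, round trip: {2 * earth_years_one_way:.0f} years")  # ~80 years
print(f"Ship time, round trip:  {2 * ship_years_one_way:.1f} years")   # ~1.0 year
```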

When youre limited by the speed of light, this problem cannot be avoided: Even if you could travel arbitrarily close to the speed of light, slowing your own aging through time dilation and shortening your journey through length contraction, everyone back home continues to age at the normal rate. When everyone meets up again, the effects are dramatic.

With warp drive, however, this problem goes away almost entirely. The way that relativity works dictates that your passage through space and time are related: that the faster you move through space, the slower time passes for you, while remaining completely stationary in space causes time to pass at the maximum possible rate. By warping space itself, you can actually change it so that what was previously a 40-light-year journey in front of you might now appear as though it were only a 0.5-light-year journey. If you travel that distance, now, at 80% the speed of light, it still might take about six months to get to TRAPPIST-1. When you stop, turn around, and come back, with space warped again in your forward direction of motion, it again will take six months. All told, you'll have aged one year on your journey.

But this time, because of how you undertook your journey, someone back on Earth would still be older, but not by very much. Instead of witnessing you traveling through space at nearly the speed of light, a terrestrial observer would witness the space in front of your spacecraft be continually shrunk, while the space behind you would continually be expanded. You'd be moving through space, but the warping of space itself would far and away be the dominant effect. Everyone back at home would have aged about 1 year and 8 months, but (almost) everyone you knew and loved would still be alive. If we want to undertake interstellar journeys and not say a permanent goodbye to everyone at home, warp drive is the way to do it.

In 2017, I authored the book Treknology: The Science of Star Trek from Tricorders to Warp Drive, where I presented nearly 30 different technological advances envisioned by the Star Trek franchise. For each technology, I evaluated which ones had already been brought to fruition, which ones were on their way, which ones were still a ways off but were physically possible, and which ones would require something novel and presently speculative, as far as science was concerned, in order to become possible. Although there were only four such technologies that were currently impossible with our present understanding of physics, warp drive was one of them, as it required some type of negative mass or negative energy, which at present is purely speculative.

Today, however, it's recognized that what's needed isn't necessarily negative mass or negative energy; that was simply the way that Alcubierre recognized one could induce the needed opposite type of curvature to space from what normal mass or energy causes. However, there's another possibility for this that stems from a realization that didn't yet exist back in 1994, when Alcubierre first put his work forth: that the default amount of energy in space isn't zero, but some positive, non-zero, finite value. It wasn't until 1998 that the effects of this energy were first robustly seen, manifesting itself in the accelerated expansion of the Universe. We know this today as dark energy, and it's a form of energy intrinsic to the fabric of space itself.

Now, keep that in mind: There's a finite amount of energy to the fabric of space itself. In addition to that, there's a famous calculation that was done back in the 1940s, in the early days of quantum field theory, by Hendrik Casimir, that has remarkable implications. Normally, the quantum fields that govern the Universe, including the electromagnetic field, exist everywhere in space; they're intrinsic to it, and they cannot be removed. But if you set up certain boundary conditions (Casimir first envisioned two parallel, conducting plates as an example), certain modes of that field would be excluded; they had the wrong wavelength to fit between the plates.

As a result, the energy inherent to the space outside of the plates would be slightly greater than the energy inside the plates, causing them to attract. The effect wasn't experimentally confirmed until almost 50 years after it was proposed, when Steve Lamoreaux successfully did it, and the Casimir effect has now been calculated and measured for many systems and many configurations. It may be possible, with the proper configuration, to use the Casimir effect in a controlled fashion to substitute for Alcubierre's original idea of exotic matter that possessed some type of negative energy.
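To give a sense of scale, the idealized textbook result for two perfectly conducting, parallel plates gives an attractive pressure of pi^2 * hbar * c / (240 d^4), where d is the plate separation. A short sketch (a simplification that ignores finite conductivity, surface roughness, and temperature corrections, all of which real measurements like Lamoreaux's must handle):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, in J*s
c = 2.99792458e8         # speed of light, in m/s

def casimir_pressure(separation_m: float) -> float:
    """Attractive pressure (Pa) between ideal parallel conducting plates."""
    return math.pi ** 2 * hbar * c / (240.0 * separation_m ** 4)

# At a one-micron separation the attraction is of order a millipascal:
# tiny, which is part of why experimental confirmation took decades.
print(f"{casimir_pressure(1e-6):.1e} Pa")   # ~1.3e-03 Pa
```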

However, one must be careful: as stated earlier, it's easy to fool yourself. The Casimir effect isn't equivalent to a warp bubble. But in principle, it could be used to warp space in the negative fashion that would be needed to create one.

The article, thankfully, published in the open access (but often dubious) European Physical Journal C, is publicly available to anyone wishing to download it. Using micron-scale electrical conductors in a variety of shapes, including pillars, plates, spheres and other cavities, teams of researchers were able to generate electric potentials (or changes in voltage) of a few hundred microvolts, completely in line with what previous experiments and theoretical predictions both indicate. That's what the DARPA-funded project was for, and that's what the experimental research surrounding this idea accomplished: in a custom Casimir cavity.

However, there's an enormous difference between what teams working on Casimir cavities do experimentally and the numerical calculations performed in this paper. That's right: this isn't an experimental paper, but rather a theoretical one, with a suspiciously low number (zero) of theoretical physicists on it. The paper relies on the dynamic vacuum model, a model typically applicable to single atoms, to model the energy density throughout space that would be generated by this cavity. They then use another technique, worldline numerics, to assess how the vacuum changes in response to the custom Casimir cavity.

And then it gets shady. Where's my warp bubble? They didn't make one. In fact, they didn't calculate one, either. All they did was show that the three-dimensional energy density generated by this cavity displays some qualitative correlations with the energy-density field required by the Alcubierre drive. The two don't match in a quantitative sense; they were not generated experimentally, but only calculated numerically; and, most importantly, they are restricted to microscopic scales and extremely low energy densities. There's a lot of speculation and conjecture, and all of it is unproven.

That isn't to say this might not be an interesting idea that could someday pan out. But the most generous thing I can say about it is this: it isn't fully baked. The most worrisome part, as a scientist familiar with Dr. White's grandiose claims surrounding physics-violating engines in the past, is that he's making new grand claims without adequate supporting evidence. He's going to be looking at tiny, low-power systems and attempting to make measurements right at the limit of what his equipment can detect. And, in the very recent past, he has fooled himself (and many others) into believing a novel effect was present when, in fact, it was not. What he wound up measuring was an error: his team had failed to account for the magnetic and electric fields generated by the wires powering his previous apparatus.

In science, the mindset made famous by The X-Files, "I want to believe," is frequently the most dangerous one we can have. Science is not about what you hope is true; it's not about the way you'd like reality to be; it's not about what your gut tells you; and it's not about the patterns you can almost see when you ignore the quantitative details. At its core, science is about what is true in our reality and what can be experimentally and/or observationally verified. Its predictions are reliable when you're using established theories within their established range of validity, and speculative the instant you venture beyond that.

As much as I'd love it if we had created a warp bubble in the lab, that simply isn't what happened here. A lack of appropriately healthy skepticism is how we wind up with scams and charlatans. As soon as you no longer bear the responsibility of rigorously testing and attempting to knock down your own hypotheses, you're committing the cardinal sin of any scientific investigation: engaging in motivated reasoning rather than letting nature guide you to your conclusions. Warp drive remains an interesting possibility, and one worthy of continued scientific investigation, but it is one you should remain tremendously skeptical about given the current state of affairs.

Remember: The more you want something to be true, the more skeptical you need to be of it. Otherwise, you are already violating the first principle about not fooling yourself. When you want to believe, you already are the easiest person to fool.

More here:

I wrote the book on warp drive. No, we didn't accidentally create a warp bubble. - Big Think


Global Network Attached Storage (NAS) Market Analytics 2021-2026 – Increasing Adoption of the Cloud is Hindering Market Growth -…

DUBLIN--(BUSINESS WIRE)--The "Global Network Attached Storage (NAS) Market - Growth, Trends, COVID-19 Impact, and Forecasts (2021-2026)" report has been added to ResearchAndMarkets.com's offering.

The Global Network Attached Storage (NAS) Market is estimated to grow at a CAGR of about 19.5% during the forecast period 2021-2026.

The COVID-19 pandemic has increased the demand for cloud-based solutions, owing to remote working models being adopted by enterprises. The pandemic has also acted as a catalyst for the growth of data usage. Enterprises across various end-user industries, especially telecom companies across emerging nations, have witnessed a surge in data usage due to lockdowns imposed by governments. In such times, organizations are seeking established and secure data storage solutions.

Key Highlights

Select Market Trends

Competitive Landscape

The network-attached storage market is highly fragmented. Earlier, the big players dominated the NAS market. However, growing demand from enterprises for data storage is attracting many new entrants, making the market increasingly competitive. Recent developments include:

Companies Profiled

For more information about this report visit https://www.researchandmarkets.com/r/xp0147

See the original post here:
Global Network Attached Storage (NAS) Market Analytics 2021-2026 - Increasing Adoption of the Cloud is Hindering Market Growth -...


WinZip SafeMedia Gives Organizations the Flexibility to Securely Store, Manage, and Transport Files on Removable Media and Cloud Storage Platforms -…

OTTAWA, Dec. 08, 2021 (GLOBE NEWSWIRE) -- Introducing WinZip SafeMedia, the latest version of WinZip's encryption and compression software for removable media and cloud storage platforms. Trusted by governments and regulated industries, WinZip SafeMedia quickly secures and compresses data stored on removable devices and features enhanced administrative controls, enabling organizations to easily customize and scale this flexible solution to meet and support their business needs and security initiatives.

WinZip SafeMedia (formerly WinZip Secure Burn Enterprise) combines the compression and encryption capabilities of the WinZip engine with unique data burning capabilities to simplify the process of securing data on external media. Plus, easy drag-and-drop tools, powerful data encryption and password protection, file copy auditing, and administrative tools empower IT departments to quickly enforce security policies across all levels of an organization.

With the WinZip engine, users can set controls to compress files so more data can be stored on removable devices. WinZip SafeMedia's data burning capability ensures users can reliably burn data on CDs, DVDs, Blu-ray Discs, and USB thumb and external drives.

"As companies continue to face the reality of remote and hybrid workforces, they need to empower employees to work anytime, from anywhere, in order to maintain productivity," said Henry Monteiro, Head of WinZip Product Management. "With WinZip SafeMedia, businesses can protect and control data anywhere it goes, so they can be confident that sensitive, confidential information is secure on removable media devices. IT admins can set controls to maximize storage capacity and secure data with FIPS AES 256 military-grade encryption."
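For readers wondering what AES-256 encryption looks like in practice, here is a minimal, generic sketch in Python using the third-party cryptography package. It only illustrates the cipher itself; it is in no way WinZip's actual implementation or API.

```python
# Generic illustration of AES-256 authenticated encryption using the
# third-party "cryptography" package (pip install cryptography).
# This is NOT WinZip's implementation; it only shows what the cipher does.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key, i.e. AES-256
nonce = os.urandom(12)                      # must be unique per encryption
plaintext = b"contents of a file bound for removable media"

aesgcm = AESGCM(key)
ciphertext = aesgcm.encrypt(nonce, plaintext, None)   # encrypt + authenticate
assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext
```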

WinZip SafeMedia provides IT admins with the tools they need to safeguard their business against online and offline threats. Key benefits of WinZip SafeMedia include:

WinZip SafeMedia enables users to store, manage, and transport data wherever they go, quickly and safely. Feature highlights include:

Enterprise organizations can benefit from licensing, upgrade, and maintenance programs designed to provide the greatest possible return on investment. Take advantage of enterprise-wide controls, a guided installation, setup, and deployment process, and access to 24/7 support, feature updates, and version flexibility (including access to previous versions).

For information about WinZip SafeMedia or to sign up for a free trial, visit www.winzip.com/safemedia.

About WinZip

WinZip products are trusted by millions of businesses and consumers to boost productivity, simplify file sharing, and keep information private. Offering apps for all of today's most popular platforms and devices, WinZip gives users a better way to manage and share files in the cloud, email, and instant messaging. The WinZip product line also includes powerful utilities to improve system performance and help keep Mac and Windows PCs secure. WinZip is a division of Corel Corporation. For more information about WinZip, please visit www.winzip.com.

Corel products enable millions of connected knowledge workers around the world to do great work faster. Offering some of the industry's best-known software brands, we give individuals and teams the power to create, collaborate and deliver impressive results. Our success is driven by an unwavering commitment to deliver a broad portfolio of innovative applications, including CorelDRAW, MindManager, Parallels and WinZip, to inspire users and help them achieve their goals. To learn more about Corel, please visit www.corel.com.

© 2021 Corel Corporation. All rights reserved. Corel, WinZip, the WinZip logo, SafeMedia, CorelDRAW, and MindManager are trademarks or registered trademarks of Corel Corporation in Canada, the U.S., and/or elsewhere. Parallels is a trademark or registered trademark of Parallels International GmbH in Canada, the U.S., and elsewhere. All other company, product and service names, logos, brands, and any registered or unregistered trademarks mentioned are used for identification purposes only and remain the exclusive property of their respective owners. Use of any brands, names, logos, or any other information, imagery, or materials pertaining to a third party does not imply endorsement. We disclaim any proprietary interest in such third-party information, imagery, materials, marks, and names of others.

Media Contact

Saeed Ismail Saeed, PR Manager, saeed.saeed@corel.com, http://www.winzip.com

Original post:
WinZip SafeMedia Gives Organizations the Flexibility to Securely Store, Manage, and Transport Files on Removable Media and Cloud Storage Platforms -...


Personal and Entry Level Storage Market Expected to Expand Up to USD 100 Billion by 2028 – GlobeNewswire

New York, US, Dec. 09, 2021 (GLOBE NEWSWIRE) -- Market Overview: According to a comprehensive research report by Market Research Future (MRFR), "Personal and Entry Level Storage Market information by Product, by Storage System, by Technology and Region - forecast to 2028", the market size is expected to reach USD 100 billion by 2028, growing at a compound annual growth rate of 30%.
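As a quick sanity check on those headline figures, here is a minimal sketch of the compound-growth arithmetic. Treating 2021 as the base year is an assumption on my part, since the release only states the forecast endpoint.

```python
# Back-of-the-envelope check of the headline figures: a market reaching
# USD 100 billion in 2028 at a 30% CAGR implies a much smaller starting value.
# Treating 2021 as the base year is an assumption; the release doesn't say.
target_busd = 100.0          # projected 2028 market size, USD billion
cagr = 0.30                  # compound annual growth rate
years = 2028 - 2021          # assumed forecast window

implied_base = target_busd / (1.0 + cagr) ** years
print(f"implied 2021 market size: about USD {implied_base:.1f} billion")  # ~16
```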

Market Scope: The global personal and entry level storage market is growing rapidly. Personal and entry level storage (PELS) systems transform and enhance business operations by offering a comprehensive storage solution that integrates and refreshes the existing IT infrastructure while reducing costs. Besides, rising uses of smart & connected devices and cloud-based technologies have allowed the market to garner significant prominence.

Moreover, emerging cloud-based storage technologies create significant market opportunities. The increasing need for additional storage is a major driving force. Additionally, the increasingly growing demand for cloud-based computing technologies for several applications is a major trend positively impacting the market rise.

Dominant Key Players on Personal and Entry Level Storage Market Covered are:

Get Free Sample PDF Brochure: https://www.marketresearchfuture.com/sample_request/7691

Market USP Exclusively Encompassed:

Market Drivers

Increasing use of personal and entry level storage across hosting types, provider hosting and user hosting, influences the market value. The growing adoption of PELS technologies and advances in personal devices increase the market share. On the other side, a lack of awareness of the benefits of PELS systems and credible security concerns impede market growth.

Also, the huge R&D investments required to develop PELS systems pose major challenges for the market players. Nevertheless, the rising demand for efficient storage would support the market growth over the next few years. Furthermore, increased demand for pre-installed storage services is expected to influence the market revenues. Simultaneously, advances in broadband connectivity would drive the PELS market.

AI and big data storage solutions simplify the artificial intelligence (AI) and big data infrastructure. Entry-level storage solutions offer global hybrid cloud data access and enterprise storage services, meeting mission-critical data requirements. These solutions also provide a streamlined way to discover, secure, protect and manage data from the edge to the public cloud.

Software-defined storage (SDS) solutions provide flexible storage options needed for hybrid cloud, digital transformation and more. Data protection and resiliency maximize backup storage efficiency, data security, and performance with maximum uptime at a lower cost.

Browse In-depth Market Research Report (100 Pages) on Personal and Entry Level Storage Market: https://www.marketresearchfuture.com/reports/personal-entry-level-storage-market-7691

Segmentation of Market Covered in the Research:

The market is segmented into product, storage system, technology, vertical, and regions. The product segment is bifurcated into cloud, non-cloud, solid-state drives, hard disk drives, flash drives, recordable disks, and others.

The storage system segment is bifurcated into unified storage, direct-attached storage, software-defined storage, network-attached storage, cloud storage, and others. The technology segment is bifurcated into solid-state storage, entry-level storage, and others.

The vertical segment is bifurcated into defense & government, IT & telecommunications, life science & healthcare, entertainment & media, consulting & business, research & education, manufacturing, utilities, BFSI, retail, consumer goods, and others. The region segment is bifurcated into the Americas, Asia Pacific, Middle East & Africa, Europe, and others.

Talk to Expert: https://www.marketresearchfuture.com/ask_for_schedule_call/7691

Regional Analysis

North America dominates the global personal and entry level storage market. Large advances in, and demand for, memory technologies boost the market size. Besides, growing demand, especially from the media and entertainment industry, substantiates the region's market share. The market is also led by the strong presence of notable players and a large user base in the region.

Also, the broad uptake of cloud-based storage technologies across major applications fosters market revenues. Furthermore, substantial R&D investments for the development of memory technologies support the growth of the regional market. With rising numbers of businesses, the US leads the regional market, followed by Canada.

COVID-19 Impact on the Global Personal and Entry Level Storage Market

The onset of the coronavirus significantly influenced entry-level storage use. The pandemic changed the way companies work and how they operate securely while remote. Moreover, the shift to work-from-home and remote-working mandates forced users to rely increasingly on PELS systems. Growing numbers of remote employees and rising network traffic propelled the PELS market size. As a result, the market garnered vast revenues throughout 2020.

Competitive Analysis

The PELS market witnesses strategic initiatives such as partnerships, expansions, mergers & acquisitions, collaborations, and product & technology launches. Key market players also make strategic investments in research and development activities and advance their expansion plans.

For instance, on Sept. 27, 2021, Toshiba America Electronic Components, Inc. (TAEC), a global technology leader, launched the OCZ TL100 SATA solid-state drive (SSD) series. The new OCZ TL100 series is designed for entry-level users with traditional hard disk drive (HDD) storage who are seeking an affordable upgrade, providing the performance of SSD technology at an attractive price point.

Share your Queries: https://www.marketresearchfuture.com/enquiry/7691

About Market Research Future:

Market Research Future (MRFR) is a global market research company that takes pride in its services, offering complete and accurate analysis of diverse markets and consumers worldwide. MRFR's distinguished objective is to provide optimal-quality, granular research to its clients. Our market research studies, organized by products, services, technologies, applications, end users, and market players for global, regional, and country-level market segments, enable our clients to see more, know more, and do more, and to answer their most important questions.

Follow Us: LinkedIn | Twitter

View post:
Personal and Entry Level Storage Market Expected to Expand Up to USD 100 Billion by 2028 - GlobeNewswire
