
Global Cloud Storage Service Market: Current Status, In-depth Analysis and Forecast Outlook 2026 By Red Hat, Inc., Zadara Storage, IBM Corporation,…

The report on the Global Cloud Storage Service Market is a holistic guide to understanding the various elements that play a crucial role in growth progression. Details pertaining to competitor strategies, the vendor landscape and trend assessment have all been discussed at length to derive logical deductions on the basis of which new market aspirants, as well as established vendors in the global Cloud Storage Service market, can maneuver and deliver growth-supportive business decisions.

Request a sample of Cloud Storage Service Market report @ https://www.orbismarketreports.com/sample-request/123056?utm_source=Maia

Also, considering the volatile nature of the market owing to immense technological disruption, this report is designed to serve as an intelligent investment guide for established market players eyeing sustainable revenue pools and market stability in the Cloud Storage Service market.

Key Players Analysis: Global Cloud Storage Service Market

Red Hat, Inc.
Zadara Storage
IBM Corporation
VMware, Inc.
Microsoft Corporation
EMC Corporation
Qumulo, Inc.
Amazon Web Services, Inc.
Rubrik
Oracle Corporation
Nasuni Corporation
Google Inc.
Rackspace Hosting, Inc.
Hewlett Packard Enterprise Development LP

Browse the complete report @ https://www.orbismarketreports.com/global-cloud-storage-service-market-size-share-growth-analysis-and-forecast-outlook-by-2026?utm_source=Maia

Segment-based Assessment: Global Cloud Storage Service Market

The end-use application segment is strongly influenced by fast-transitioning end-user inclinations and preferences. Product- and application-based segments clearly focus on the array of novel changes and new investments made by Cloud Storage Service market forerunners to improve product quality in line with end-use needs.

Cloud Storage Service Market Analysis by Types:

Primary storage
Disaster recovery and backup
Cloud storage gateway
Data archiving

Cloud Storage Service Market Analysis by Applications:

Large Enterprises
Small & Medium Enterprises

The subsequent sections of the report include a detailed assessment of the core vendors, manufacturers and stakeholders that play decisive roles in maintaining steady growth and revenue stability in the global Cloud Storage Service market. Details about aspirants eyeing seamless penetration, growth-stimulating strategies and expansion schemes deployed by established players have also been echoed in the report to mediate growth-proficient business decisions in the global Cloud Storage Service market.

Major Report Highlights:

1. The Cloud Storage Service market report documents high-end data concerning volume- and value-based developments, encompassing crucial elements such as regional, product-based and application-oriented insights that influence high revenue generation and growth.

2. The Cloud Storage Service market report also specifically outlines key parameters encapsulating market drivers and restraining factors that deflate growth.

3. Key challenges faced by market players have also been broadly discussed in this report section to locate untapped Cloud Storage Service market opportunities.

4. The Cloud Storage Service market report also mirrors the exact growth strategies and tactical business decisions embraced by frontline players.

5. A top-down assessment of the competition spectrum and regional growth hubs has also been presented in great detail to underpin lucrative business decisions.

Report Investments: Logical Guide

1. Investment in this Cloud Storage Service market report is a highly time-saving decision, as the report houses crucial market-relevant information that plays an integral role in the growth process.

2. The report features business priorities at length to draw logical relations with business strategies, needed to induce high growth in global Cloud Storage Service market.

3. The report allows readers to align their pipeline and ongoing investments in complete coordination with growth objectives.

4. The Cloud Storage Service market report also helps readers design and implement optimum decision-making aligned with the commercial viability of products and services that echo consumer interests.

Make an enquiry of this report @ https://www.orbismarketreports.com/enquiry-before-buying/123056?utm_source=Maia

Major Points from Table of Content:

1 Cloud Storage Service Market Overview
2 Global Cloud Storage Service Market Landscape by Player
3 Players Profiles
4 Global Cloud Storage Service Production, Revenue (Value), Price Trend by Type
5 Global Cloud Storage Service Market Analysis by Application
6 Global Cloud Storage Service Production, Consumption, Export, Import by Region (2014-2019)
7 Global Cloud Storage Service Production, Revenue (Value) by Region (2014-2019)
8 Cloud Storage Service Manufacturing Analysis
9 Industrial Chain, Sourcing Strategy and Downstream Buyers
10 Market Dynamics
11 Global Cloud Storage Service Market Forecast (2019-2026)
Continued

ABOUT US:

With unfailing market-gauging skills, Orbis Market Reports has been excelling in curating tailored business intelligence data across industry verticals. Constantly striving to expand our skills, our strength lies in dedicated intellectuals with dynamic problem-solving intent, ever willing to push boundaries and scale heights in market interpretation. We are equally backed by a long list of success stories and case studies that vouch for our extraordinary market research skills and milestones. Orbis Market Reports is a one-stop solution for all market queries.

CONTACT US:

Address: 6200 Savoy Drive, Suite 630, Houston, TX 77036
Phone: +1 210-667-2421
Mail us: [emailprotected]


Online Harms in the UK: Significant new obligations for online companies and fines of up to 10% of annual global turnover for breach – Lexology

Yesterday was a busy day for Europe in terms of tech regulation. In addition to the ground-breaking announcements made by the European Commission of the new Digital Services Act and the Digital Markets Act, the UK government also published its long-anticipated final response to the Online Harms White Paper. The Online Harms consultation ran from 8 April 2019 to 1 July 2019 and received over 2,400 responses from stakeholders across the technology industry, including online platforms, charities, think-tanks, publishers, individuals and small/medium sized enterprises.

The government's response contains a significant amount of detail and gives us a real insight into the stance the UK will adopt post-Brexit on digital regulation.

As the lie of the land becomes clearer, in future articles we will be breaking down the detail into practical points businesses should be aware of. But, for now, here are some of the headline points that jump out when reading the government's response:

What next?

Unsurprisingly, the devil will be in the detail. The draft Online Safety Bill is expected in the new year, and there are sure to be further twists and turns as it is developed and scrutinised by Parliament during 2021. Look out for further articles over the coming weeks in which we'll be analysing what the government's response means for businesses and how to prepare for what's to come.


10 Ways To Boost Cloud Performance – CRN

The coronavirus pandemic has proved an endorsement of the need for organizations to move to the cloud to maintain business continuity, service their customers and ensure their remote workforces remain connected and can access the right tools to stay productive.

The percentage of overall IT spending dedicated to cloud is expected to continue to increase at an accelerated pace post-pandemic, with Gartner forecasting that cloud spending will account for 14.2 percent of the total global enterprise IT spending market in 2024, up from 9.1 percent this year.

Cloud providers, independent software vendors and solution providers continue to churn out new products and services to meet the rising demand and expand organizations' cloud capabilities.

Here's a look at 10 recent offerings from Amazon Web Services, Cockroach Labs, CyberArk, Google Cloud, HashiCorp, LogDNA, Microsoft Azure, NetApp, SkyKick and TriggerMesh that fit that bill.

AWS Gateway Load Balancer is designed to make it easier and more cost-effective to deploy, scale and run third-party virtual appliances such as firewalls, intrusion detection/prevention systems and deep packet inspection systems on AWS. It provides a single gateway to distribute traffic across virtual appliances while scaling them based on demand to increase availability and eliminate potential points of network failure. It performs health checks on the virtual appliances and reroutes traffic flows when it discovers unhealthy ones.
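For orientation, here is a minimal sketch of how such a deployment might be wired up with boto3. It is not taken from the CRN piece, and the resource names, subnet ID and VPC ID are placeholders for illustration only:

```python
import boto3

# Placeholder identifiers -- substitute values from your own appliance VPC.
VPC_ID = "vpc-0123456789abcdef0"
APPLIANCE_SUBNETS = ["subnet-0aaa1111bbbb2222c"]

elbv2 = boto3.client("elbv2")

# Create the Gateway Load Balancer that sits in front of the appliance fleet.
gwlb = elbv2.create_load_balancer(
    Name="appliance-gwlb",
    Type="gateway",
    Subnets=APPLIANCE_SUBNETS,
)

# Gateway Load Balancer target groups use the GENEVE protocol on port 6081;
# the health check is what lets traffic be rerouted away from unhealthy appliances.
tg = elbv2.create_target_group(
    Name="appliance-fleet",
    Protocol="GENEVE",
    Port=6081,
    VpcId=VPC_ID,
    TargetType="instance",
    HealthCheckProtocol="TCP",
)

# A gateway listener takes no protocol or port; it simply forwards all traffic
# to the registered virtual appliances.
elbv2.create_listener(
    LoadBalancerArn=gwlb["LoadBalancers"][0]["LoadBalancerArn"],
    DefaultActions=[{
        "Type": "forward",
        "TargetGroupArn": tg["TargetGroups"][0]["TargetGroupArn"],
    }],
)
```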

CockroachDB 20.2 is the latest version of Cockroach Labs' cloud-native, distributed SQL database. It includes new capabilities for spatial data for net-new workloads for IoT, transportation and environmental applications; a new CockroachDB for Kubernetes option that packages the database with a custom operator optimized for orchestrated deployments; and the addition of enterprise backup-and-restore functionality to the free community option. The release extends CockroachDB's TPC-C Benchmark performance up to 140,000 warehouses with a maximum throughput of 1.7 million transactions per minute, a 40 percent improvement over the past year, according to the company.

The cloud-agnostic CyberArk Cloud Entitlements Manager is a privilege-based, artificial intelligence-powered service designed to strengthen the security of cloud environments by removing excessive cloud permissions. It continuously detects hidden, misconfigured and unused cloud permissions to improve security through a least-privilege, zero-trust approach. It features a centralized dashboard with a single view of permissions across Amazon Web Services (including Amazon Elastic Kubernetes Service), Google Cloud Platform and Microsoft Azure environments.

Google Cloud's serverless Document AI platform is a unified console for document processing that allows users to quickly access all of the cloud provider's form, table and invoice parsers, tools and offerings (including Google Cloud's Procurement DocAI and Lending DocAI) with a unified API. The document understanding offering uses artificial intelligence and machine learning to classify, extract and enrich data from scanned and digital documents at scale, including structured data from unstructured documents, making it easier to understand and analyze.

HashiCorp Waypoint is a new open-source project providing developers with a consistent workflow to build, deploy and release applications across any platform. It enables developers to get their applications from development to production in a single file and deploy them using a single command: waypoint up. Waypoint supports Kubernetes, HashiCorp Nomad, EC2, Google Cloud Run, Azure Container Instances, Docker and Buildpacks, among others. It can be extended with plugins to target any build/deploy/release logic.

LogDNA's new Kubernetes Enrichment for log management provides detailed insight into Kubernetes cluster environments. It's designed to enhance existing application logs in LogDNA with Kubernetes events and resource metrics, empowering developers to resolve deployment issues without deep specialized knowledge of Kubernetes tooling. It provides development teams with a single-pane-of-glass observability experience between a user's underlying Kubernetes infrastructure and deployed services.

Microsoft Cloud For Healthcare is designed to help health-care customers and partners deliver better patient experiences, insight and care while improving workflow efficiency, streamlining collaboration and connecting data across sources. It incorporates offerings from Microsoft Azure, Microsoft 365, Microsoft Dynamics 365 and Microsoft Power Platform in addition to partner health-care solutions. Its architecture has built-in governance and privacy capabilities supporting compliance with HIPAA, GDPR, HITRUST CSF and other regulatory requirements.

Spot Storage by NetApp is a fully managed, continuously optimized block-and-file storage offering for cloud workloads. It automatically matches storage volume size and performance to application requirements, using NetApp technology for volume shaping, thin provisioning, compression and deduplication to deliver storage volumes that reduce storage costs. It will be integrated with Spot optimization products including Ocean (for containers and Kubernetes) and Elastigroup (for virtual machines) to allow automated, application-driven storage and compute management.

The new Cloud Manager is SkyKick's next-generation no-code and low-code automation, workflow and management application designed to help IT service providers securely administer and manage their cloud customers across SaaS, IaaS, PaaS and devices. Features include Command Center, a help desk automation application to aid tier-one support desks in resolving cloud tickets faster; and WorkBench, a low-code workflow automation engine that enables admins to turn PowerShell into Command Center applications with a click.

TriggerMesh Cloud Native Integration Platform 1.0 automates Kubernetes, cloud-native and on-premises infrastructures. The architecture-agnostic, production-ready SaaS offering allows users to integrate services, automate workflows and build event-driven applications out of any on-premises application or cloud service. Features include a declarative API, event transformation to forward events from one format to another, and new bridges to automate infrastructure and bridge workflows between Salesforce, OracleDB, Zendesk, Datadog, Oracle Cloud Security Logs, Splunk and others.


Quantum Mechanics, the Mind-Body Problem and Negative Theology – Scientific American

Here's how I distinguish science from philosophy. Science addresses questions that can be answered, potentially, through empirical investigation. Examples: What's the best way to defeat COVID-19? What causes schizophrenia, and how should it be treated? Can nuclear power help us overcome climate change? What are the causes of war, and how can we end it?

Philosophy addresses questions that probably can't be solved, now or ever. Examples (and these are of course debatable, some philosophers and scientists insist that science can answer all questions worth asking): Why is there something rather than nothing? Does free will exist? How does matter make a mind? What does quantum mechanics mean?

This final question has absorbed me lately because of my ongoing effort to learn quantum mechanics. Quantum mechanics represents reality at its most fundamental level, that of particles darting through space. Supposedly. That's why science writer and astrophysicist Adam Becker calls his recent book about quantum mechanics What Is Real?

I suspect we'll never have final, definitive answers to what quantum mechanics means and what is real. My reasoning is primarily inductive. For more than a century, experts have sought to interpret quantum mechanics, to specify what it tells us about matter and energy, time and space, the infrastructure of existence.

Physicists and philosophers have come up with lots of possibilities, notably the Copenhagen interpretation, the many-worlds hypothesis and the Bohmian pilot-wave model. I've just become aware of a hypothesis called quantum Bayesianism, or QBism (pronounced "cubism"), which proposes... well, check it out yourself.

Unfortunately, most interpretations don't offer testable predictions to distinguish them from rivals. (An exception is a quantum model proposed by Nobel laureate Roger Penrose, certain versions of which are reportedly ruled out by a recent experiment.) Hence adherents favor one interpretation over others for largely subjective, aesthetic reasons.

You dig the austere minimalism of the Copenhagen interpretation. I favor the pilot-wave model, which insists that particles are particles, and not probabilistic blurs. If I'm feeling frisky, I might go with John Wheeler's metaphysically extravagant "it from bit" proposal, which fuses quantum mechanics and information theory. Arguments about which interpretation is true cannot be resolved, because our preferences are matters of taste, not truth.

When I say a problem is unsolvable, I don't mean we should abandon it. Far from it. I love reading, writing and arguing about intractable puzzles. For example, I don't believe in God, certainly not the God of my Catholic childhood. But I enjoy smart, imaginative theology (defined as the study of God) in the same way that I enjoy good science fiction. Two of my favorite theologians are physicist Freeman Dyson and psychedelic adventurer Terence McKenna.

I'm especially fond of what is known as negative theology. Negative theology assumes that God exists but insists that He/She/It/They transcends human language and concepts. Negative theologians try to say, over and over again, and sometimes with great eloquence, what they acknowledge cannot be said.

Negative theology is an outgrowth of mysticism. Mystical experiences, as defined by William James in The Varieties of Religious Experience, possess two seemingly contradictory properties. They are on the one hand noetic, that is, you feel you are gaining profound insight into and knowledge of reality. They are on the other hand ineffable, meaning you cannot convey your revelation in words.

Mystical aphorisms often emphasize ineffability. "He who knows, does not speak," the ancient Chinese sage Lao Tzu says, violating his own dictum. "He who speaks, does not know." Pseudo-Dionysius the Areopagite, a medieval monk, describes mystical knowledge as being "at one with Him Who is indescribable."

I suspect Wittgenstein had his own mystical experiences in mind when he wrote at the end of his cryptic prose-poem Tractatus Logico-Philosophicus, "Whereof one cannot speak, thereof one must remain silent." (After a friend, a philosopher, quoted that line to me, I replied: "Then why are you still speaking?" The friend hasn't spoken to me since.)

In 1999, while doing research for my book Rational Mysticism, I attended a symposium on mysticism at the University of Chicago. At a session on negative theology, a speaker said he'd arrived by mistake a day early. Upon entering the empty auditorium, he thought, "This is taking negative theology too far." Another speaker described mystical literature as "that which contests its own possibility."

Negative theology can serve as a model for scientists and philosophers trying to solve quantum mechanics and another enigma I posed above: How does matter make a mind? This is the mind-body problem, which investigates, as I argue in a recent book, what we really are, can be and should be, collectively and as individuals. Are we really matter, mind, some combination of the two or, perhaps, none of the above?

When we wrestle with quantum mechanics, we're also taking on the mind-body problem. Quantum paradoxes like Schrödinger's cat and the measurement problem raise questions about the connection between matter and mind, and their status relative to each other. Is matter self-sufficient, as materialists insist, or does reality require mind too?

Mind is essential, according to QBism, "it from bit" and other quantum hypotheses. I have disparaged these mind-centric frameworks as neo-geocentrism, throwbacks to the ancient assumption that the universe revolves around us. But I enjoy mulling them over, just as I enjoy thinking about theodicies, which seek to explain why a loving, all-powerful God would create such a painful, unfair world. (I've even come up with a drug-inspired theodicy of my own.)

Many, perhaps most, scientists and philosophers who dwell on quantum mechanics and the mind-body problem have faith that these conundrums can and will be solved, eventually. They crave answers; they want to know. If they cannot know during their lifetime, they want at least to feel that their efforts are taking us closer to the truth.

Philosopher David Chalmers, who has rejected strictly materialist solutions of what he calls the "hard problem" of consciousness, nonetheless insists that one day we'll crack it. So does another thinker I admire, philosopher-novelist Rebecca Goldstein. They and other seekers will probably dismiss negative theology as a model for inquiry, and I understand why. I share their craving for a revelation so profound that it dissipates the weirdness of the world.

But I've also become increasingly wary of our craving for absolute knowledge, and absolute certainty, especially when it comes to riddles like what is reality and what are we. People convinced that they possess ultimate knowledge can become self-righteous fanatics, capable of enslaving and exterminating others in the name of truth.

Negative theology helps us avoid fanaticism by keeping us humble. We acknowledge, as an axiom, that ultimate truth will always elude us. Those who have a hard time accepting this anti-truth, and hence the premise of negative theology, should keep two points in mind. First, if we cannot grasp ultimate truth, we can pursue it forever, never losing sight of the mystery at the heart of things.

Second, I'm not proposing negative theology as a model for science as a whole. Science has answered, conclusively, many questions, and it will answer many more, including, I hope, those listed at the beginning of this column. Problems related to infectious disease, mental illness, climate change and war will surely yield to dogged empirical inquiry. Although science will never entirely explain reality, it can make it more bearable.

Further Reading:

For more ruminations on quantum mechanics, the mind-body problem and mysticism, see my new book Pay Attention: Sex, Death, and Science.


Quantum Interference Phenomenon Identified That Occurs Through Time – SciTechDaily

Credit: ULB

Since the very beginning of quantum physics, a hundred years ago, it has been known that all particles in the universe fall into two categories: fermions and bosons. For instance, the protons found in atomic nuclei are fermions, while bosons include photons, which are particles of light, as well as the Brout-Englert-Higgs boson, for which François Englert, a professor at ULB, was awarded a Nobel Prize in Physics in 2013.

Bosons, especially photons, have a natural tendency to clump together. One of the most remarkable experiments that demonstrated photons' tendency to coalesce was conducted in 1987, when three physicists identified an effect that was since named after them: the Hong-Ou-Mandel effect. If two photons are sent simultaneously, each towards a different side of a beam splitter (a sort of semitransparent mirror), one could expect that each photon will be either reflected or transmitted.

Logically, photons should sometimes be detected on opposite sides of this mirror, which would happen if both are reflected or if both are transmitted. However, the experiment has shown that this never actually happens: the two photons always end up on the same side of the mirror, as though they preferred sticking together! In an article published recently in the US journal Proceedings of the National Academy of Sciences, Nicolas Cerf, a professor at the Centre for Quantum Information and Communication (École polytechnique de Bruxelles), and his former PhD student Michael Jabbour, now a postdoctoral researcher at the University of Cambridge, describe how they identified another way in which photons manifest their tendency to stay together. Instead of a semi-transparent mirror, the researchers used an optical amplifier, called an active component because it produces new photons. They were able to demonstrate the existence of an effect similar to the Hong-Ou-Mandel effect, but which in this case captures a new form of quantum interference.

Quantum physics tells us that the Hong-Ou-Mandel effect is a consequence of the interference phenomenon, coupled with the fact that both photons are absolutely identical. This means it is impossible to distinguish the trajectory in which both photons were reflected off the mirror on the one hand, and the trajectory in which both were transmitted through the mirror on the other hand; it is fundamentally impossible to tell the photons apart. The remarkable consequence of this is that both trajectories cancel each other out! As a result, the two photons are never observed on the two opposite sides of the mirror. This property of photons is quite elusive: if they were tiny balls, identical in every way, both of these trajectories could very well be observed. As is often the case, quantum physics is at odds with our classical intuition.
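The cancellation can be written out in a couple of lines. This is the standard textbook beam-splitter calculation rather than anything taken from the paper: write $\hat a^\dagger$ and $\hat b^\dagger$ for the photon creation operators at the two input ports of a balanced (50:50) beam splitter and $\hat c^\dagger$, $\hat d^\dagger$ for the two output ports, so that

$$\hat a^\dagger \rightarrow \tfrac{1}{\sqrt{2}}\left(\hat c^\dagger + \hat d^\dagger\right), \qquad \hat b^\dagger \rightarrow \tfrac{1}{\sqrt{2}}\left(\hat c^\dagger - \hat d^\dagger\right).$$

The two-photon input state then transforms as

$$\hat a^\dagger \hat b^\dagger \lvert 0\rangle \;\rightarrow\; \tfrac{1}{2}\left(\hat c^{\dagger\,2} - \hat d^{\dagger\,2}\right)\lvert 0\rangle,$$

and the cross terms $\hat c^\dagger \hat d^\dagger$ (one photon on each side of the mirror) cancel exactly because the photons are indistinguishable, leaving only outcomes in which both photons exit the same port.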

The two researchers from ULB and the University of Cambridge have demonstrated that the impossibility of differentiating the photons emitted by an optical amplifier produces an effect that may be even more surprising. Fundamentally, the interference that occurs on a semi-transparent mirror stems from the fact that if we imagine switching the two photons on either side of the mirror, the resulting configuration is exactly identical. With an optical amplifier, on the other hand, the effect identified by Cerf and Jabbour must be understood by looking at photon exchanges not through space, but through time.

When two photons are sent into an optical amplifier, they can simply pass through unaffected. However, an optical amplifier can also produce (or destroy) a pair of twin photons: so another possibility is that both photons are eliminated and a new pair is created. In principle, it should be possible to tell which scenario has occurred based on whether the two photons exiting the optical amplifier are identical to those that were sent in. If it were possible to tell the pairs of photons apart, then the trajectories would be different and there would be no quantum effect. However, the researchers have found that the fundamental impossibility of telling photons apart in time (in other words, it is impossible to know whether they have been replaced inside the optical amplifier) completely eliminates the possibility itself of observing a pair of photons exiting the amplifier. This means the researchers have indeed identified a quantum interference phenomenon that occurs through time. Hopefully, an experiment will eventually confirm this fascinating prediction!

Reference: "Two-boson quantum interference in time" by Nicolas J. Cerf and Michael G. Jabbour, 11 December 2020, Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.2010827117


9 Most Confusing Sci-Fi Movies That Feel Like You Need a PhD in Quantum Physics – FandomWire

Science-fiction is a genre that defies all human imagination. It forces the human mind to think and wander, unlike the other genres. It's an all-encompassing genre that covers everything from parallel universes to world-destroying viruses. And then there are movies that go above and beyond. They are so convoluted and conniving they leave our brains in knots by the time the credits roll.

Before we begin, this list is not going to pander to mainstream audiences. We will not be talking about Interstellar, Inception and Predestination here. So if you were expecting a typical list of sci-fi movies, this list is not for you.

Arthouse movies are not everyone's cup of tea. Alphaville sits atop that sci-fi arthouse throne. So it's a very, very niche movie. Legendary experimental filmmaker Jean-Luc Godard gives us this classic. Detective Lemmy Caution is sent to infiltrate a dystopian city named Alphaville. Alphaville is governed by a sentient A.I. called Alpha 60. Using mind control and brainwashing, Alpha 60 controls the entire city's population. The movie explores several tropes like surrealism, French New Wave cinema, and the concepts of individuality and creativity. It's a movie that relies on a solid story and incredible concepts to drive the plot forward. And it's damn interesting to watch.

From the outside, Ad Astra looks like any normal sci-fi flick. But there's a deeper narrative, something most people will miss on their first watch. Ad Astra is not about a distant son trying to get his estranged father home. It's about relationships across distance, the distance being in light-years. Does a father remain a father to a son if he is no longer within the boundaries of our solar system? Where do love and passion end and madness begin? The most important part of the story is its slow, unnerving pace. You may think you have figured out the movie, but re-watch it and you will find something interesting every time.

The movie begins like a typical exploration journey. A mysterious meteor strike leads to a region of the United States being enclosed in a quarantined space called the Shimmer. Literally nothing about the Shimmer is ever truly explained. A group of scientists enter the Shimmer in search of another search party. Each of them ends up dying in super mysterious ways. Add to that the movie's non-linear narrative and super-short foreshadowing and you have a movie as mysterious as it can get.

If you are not familiar with Kabbalah, a Jewish esoteric discipline and school of thought, do not even consider watching this film. The lead protagonist is also the narrator. He is unreliable, a paranoid schizophrenic, and talks in a language that only scientists would understand. The movie revolves around the search for a mysterious number that could explain literally everything happening in the universe. The lead protagonist's paranoia inserts itself into every scene, making an already hard-to-understand movie nigh impossible to decipher.

Upstream Color is the antithesis of convoluted sci-fi. It's beautiful, well-planned and linear. A group of criminals uses mysterious parasites to induce a hyper-hypnotic state in their victims. The victims are then vulnerable to suggestion. They will do anything the criminals ask them to. Things become complicated when a parasite is inserted into a pig. There's also a telepathic link between two people who believe they are in love. But later they realize they are both victims of the same parasite attack. The narrative is also extremely obscure and never gives us a definitive conclusion.

This movie is considered to be one of the greatest sci-fi masterpieces of all time. It's also one of the most confusing sci-fi movies of all time. A black monolith appears out of nowhere and prehistoric humanity is taught how to use tools. Flash forward to the future, and another monolith appears somewhere else in our solar system, so humans send a spaceship to investigate. Meanwhile, there's also a crazed A.I. aboard that the humans need to deal with. A star-gate scene further complicates matters. There's also a very disconcerting space-faring baby whose true purpose is as bewildering as it gets.

Darren Aronofsky's metaphysical drama went through a lot of hoops before it finally hit theaters. The movie is about the human acceptance of mortality and death. There are three different storylines in one movie, each running parallel to the other two. Hugh Jackman and Rachel Weisz play the lead characters in the three arcs. The movie is filled to the brim with powerful symbolism that might bounce off your heads. There are also various historical allegories that are force-fit into the movie's non-linear pace. This movie will leave your head spinning.

The best movies do not rely on CGI or star power. They rely on clever storytelling with maximum use of whatever resources they have at their disposal. Coherence is a movie that deals with alternate realities. A comet passes over the Earth and the power goes out. People attending a dinner party go to the only house in the neighborhood that still has power, leaving a note. When they come back, they find the same note in front of their home. It's not long before they realize that the comet has opened a portal into parallel worlds. The real question is who amongst the people that went out and came back is actually from an alternate universe.

The irony is that sci-fi has always been a genre of high-value projects. To make one, you need big pockets. But the most confusing movie of all time was made on a shoestring budget. It explores the most disturbingly difficult science fiction plot element: time travel. Two part-time inventors accidentally come up with time travel technology. What follows next is literal chaos. Multiple timelines exist simultaneously and there are so many parallel versions of the same time traveler that time itself could fragment into a zillion pieces out of sheer confusion.


Expanding the Scope of Electronic-Structure Theory – Physics

December 16, 2020 • Physics 13, 196

An efficient new approach makes density-functional simulations feasible over larger length scales.

R. Godby/University of York

Density-functional theory (DFT) has, since the 1970s, had a huge impact on our understanding of condensed-matter physics through its ability to describe the effect of the electrons' mutual interaction on the electronic structure of matter. However, in solids in which successive crystal unit cells are no longer exact repetitions of one another, the usual approach for implementing DFT can run out of steam. Now, Tristan Müller at the Max Planck Institute of Microstructure Physics, Germany, and colleagues have devised an efficient new way to implement DFT in the presence of such a long-wavelength variation [1]. Their technique potentially extends the scope of DFT to encompass phenomena of technological interest, such as skyrmions and magnetic domain walls.

The key that unlocks a material's electronic structure is an almost magical result known as Bloch's theorem, which greatly assists the solution of Schrödinger's equation [2]. Rather than having to take account of arbitrary mixing of the atomic wave functions of the solid's infinite number of atoms, Bloch showed that the atoms in each unit cell of the solid contribute equally to any wave function. The contribution of each unit cell differs from its neighbors only by a phase factor that is a fixed characteristic of a given wave function. This phase factor is normally described through a Bloch wave vector, the k point. In essence, this idea means that solving Schrödinger's equation for electrons in a periodic solid is little costlier than solving it for a single unit cell. The efficiency of this approach facilitated the development of electronic-structure calculations for solids in the early decades of quantum mechanics [3].
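In the usual notation (implicit in the description above, though not spelled out in the article), Bloch's theorem states that every eigenstate of a perfectly periodic crystal can be written as

$$\psi_{n\mathbf{k}}(\mathbf{r}) = e^{i\mathbf{k}\cdot\mathbf{r}}\, u_{n\mathbf{k}}(\mathbf{r}), \qquad u_{n\mathbf{k}}(\mathbf{r} + \mathbf{R}) = u_{n\mathbf{k}}(\mathbf{r}),$$

where $\mathbf{R}$ is any lattice vector and $n$ is a band index; the Bloch wave vector $\mathbf{k}$ supplies the phase factor $e^{i\mathbf{k}\cdot\mathbf{R}}$ by which the contribution of one unit cell differs from that of the next.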

The first such calculations did not take into account the effect of the electrostatic repulsion between electrons on the material's electronic structure. Correcting this shortcoming is where DFT comes in [4, 5]. In DFT, a Schrödinger equation is still solved for each electron in turn, but with the periodic potential felt by the electrons now modified by the periodic density of the electrons themselves within each unit cell. Crucially, the power of Bloch's theorem is preserved. The combination of quantitative accuracy and efficiency fueled the explosion in applications of DFT to crystalline solids from the 1970s onwards [6].

What if the system under study is not a periodic solid but is nevertheless infinite? Often, the concept of a supercell is useful: a larger unit cell within which a periodic atomic arrangement can still be assumed to a good approximation. (The simplest example would be an antiferromagnetic material, in which the alternating spins of neighboring atoms double the periodicity.) The power of Bloch's theorem is then regained, albeit with increased computational cost reflecting the presence of, perhaps, dozens of atoms in the new unit cell rather than just one or two. The cost typically scales with the cube of this number of atoms, so supercell calculations can be very (even prohibitively) costly. If, for example, the supercell is 10 times larger than the basic unit cell in each direction, it contains roughly 1,000 times as many atoms, and the reciprocal lattice becomes 10 times finer in each direction. This expansion greatly increases the number of coefficients that must be calculated for each electronic wave function and for the corresponding DFT potentials.

Tackling this scaling problem is the purpose of the new work by Müller and colleagues. To achieve their goal, the researchers developed a flexible approach that is closely related to the concept of satellite peaks in x-ray crystallography and electron diffraction. There, the observed image is a series of diffraction peaks that is essentially the Fourier transform of the structure of the solid under observation. If the solid is modulated by acquiring a new, longer periodicity, then each diffraction peak becomes surrounded by a finely spaced set of a few satellite peaks (Fig. 1).

Away from each original peak position, the intensity of the satellites falls off quickly, provided that the modulation has a long wavelength. In the language of a supercell DFT calculation, this behavior means that much information can be neglected to a good approximation: only a few satellite coefficients need be calculated in place of each original coefficient (that describe the electronic wave function or charge or magnetization density for the original periodic solid).
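Schematically (my notation, not the authors'), a density that is periodic over the reciprocal-lattice vectors $\mathbf{G}$ but carries an additional long-wavelength modulation with wave vector $\mathbf{q}$ can be expanded as

$$n(\mathbf{r}) = \sum_{\mathbf{G}} \sum_{m} n_{\mathbf{G},m}\, e^{i(\mathbf{G} + m\mathbf{q})\cdot\mathbf{r}},$$

and because the satellite coefficients $n_{\mathbf{G},m}$ fall off rapidly with $|m|$ for a long-wavelength modulation, only a few values of $m$ need to be retained around each original reciprocal-lattice vector.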

Müller and colleagues, then, address the situation of a periodic solid upon which an additional spatial variation on a long length scale is imposed: either an externally applied potential or a spontaneous internal adjustment of the electrons themselves, such as a charge-density wave. Their approach is equivalent to a DFT supercell calculation plus certain well-founded approximations arising from the retention of a limited set of satellite coefficients, which is the key to the efficiency. As is common in DFT, the local electronic structure is represented using a compact set of functions within each unit cell. Meanwhile, the satellite aspects of the wave functions and densities are naturally described using long-wavelength plane waves, which allows these parts of the calculation to benefit from the numerical efficiency of fast Fourier transforms.

As a demonstration of their technique, Müller and colleagues present three examples: the spin-spiral state of a phase of Fe; coupled spin and charge-density waves in Cr; and LiF with an externally applied potential. It is noteworthy that their method need not start from any rigid assumption about the modulation of the original solid, other than that it is on a length scale of many unit cells. When the electronic ground state, as given by DFT, is found, the nature of the modulation (spin spiral or charge-density wave, for example) emerges naturally from the calculation.

When the researchers compare their model's results with full supercell calculations, it is clear that the two methods are not yet in perfect alignment. However, given sufficient computer power, this mismatch should narrow. Looking beyond materials' electronic ground states, Müller and colleagues foresee the application of their approach to the time dependence of such modulated solids, making use of time-dependent DFT [7]. This ability should enable the ab initio simulation of the dynamic coupling between the electronic wave functions on an atomic scale with, say, electromagnetic waves on a longer length scale in a plasmonic optoelectronic device. For designers of such nanostructures, the electromagnetic waveforms emitted in response to some intense applied pulse could therefore take proper account of the quantum-mechanical motion of the electrons, without the limitations of perturbation theory.

Rex Godby is an emeritus professor of theoretical physics in the Department of Physics, University of York. His research focuses on the quantum dynamics of systems of interacting electrons and other complex systems, including studies of the exact Kohn-Sham potential of time-dependent density-functional theory (TDDFT), together with the development of improved approximate TDDFT functionals for dynamical problems. After completing his Ph.D. at the University of Cambridge in 1984, Godby was a postdoctoral researcher at Bell Labs, New Jersey, and then returned to Cambridge in 1986 as a research fellow, moving to York in 1995.



Physicists attempt to unify all forces of nature and rectify Einstein’s biggest failure – Livescience.com

In his waning years, Albert Einstein spent his time tilting at windmills, trying to unify all the forces of nature. He died disappointed, and his attempt would go down in history as his biggest failure.

But Einstein's failed dream could ultimately become his ultimate triumph, as a small group of theoretical physicists rework his old ideas. It won't necessarily bring all the forces of the universe together, but it could explain some of the most pressing issues facing modern science.

The most successful theory of gravity known to humanity is Einstein's famous theory of general relativity. Einstein spent more than seven years developing it, and it was worth the wait. On the surface, general relativity is deceptively simple. All of the drama of the universe takes place on the grand, four-dimensional stage called space-time. Matter and energy, the actors and actresses of the cosmos, run around doing their thing, saying their lines. Matter and energy deform space-time, causing it to warp and curve. That warping in turn tells the matter and energy how to move and behave.


And voila: general relativity! The constant dialogue between space-time stage and matter and energy is what we see as the force of gravity.
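That constant dialogue has a compact mathematical form. The article doesn't quote it, but the equations being described are Einstein's field equations,

$$G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu},$$

where the Einstein tensor $G_{\mu\nu}$ encodes the curvature of space-time (the stage), the stress-energy tensor $T_{\mu\nu}$ encodes the matter and energy (the actors), $g_{\mu\nu}$ is the metric, $\Lambda$ is the cosmological constant, $G$ is Newton's constant and $c$ is the speed of light.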

Einstein's theory has passed every observational test thrown at it, which is why it's survived for the century since its birth. It has predicted and explained strange phenomena across the universe, including the bending of light around massive objects and the formation of black holes.

And yet, we know that it's broken. While general relativity says that black holes should exist, it completely breaks down when it tries to describe their singular hearts. We have no description of gravity at such a subatomic scale where quantum mechanics holds sway. At this scale, when gravity gets both strong and short-range, general relativity can't even make predictions - the math just falls apart.

Those are places where we know that general relativity breaks down. But beyond that, astronomers have noticed two phenomena that also aren't completely explained by general relativity: Most of the matter in the universe (so-called dark matter) doesn't interact with light; and the expansion of the universe is accelerating every single day (which is thought to be caused by as-yet-unknown dark energy). In order to explain dark matter and dark energy, we have two choices. Either general relativity is perfectly correct, but our cosmos is filled with strange new substances, or general relativity is flat-out wrong.

Einstein himself tried to push past the limits of general relativity. But he wasn't motivated by the puzzles of black hole singularities or an accelerating universe nobody knew that those existed, let alone would be major theoretical challenges.

Instead, Einstein was motivated by a higher purpose: an attempt to unify all the (known) laws of physics in a single mathematical framework. In his case, he had gravity on one side, represented by his now-famous general relativity, and electromagnetism on the other, represented by Maxwell's equations that described everything from magnets and electrical currents to light itself.

In his attempts to make a super-theory of everything, Einstein introduced General Relativity 2.0. The basic version of relativity only cares about space-time's curvature. But Einstein's reboot also paid attention to space-time's twistiness, or torsion. There was no need to include torsion in his original theory, because it turned out that all you needed was curvature to explain gravity. But now that Einstein was trying to explain more than gravity, he had to include additional effects.
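For readers who want a mathematical handle on "twistiness" (my gloss, not the article's), torsion is the antisymmetric part of the connection,

$$T^{\lambda}{}_{\mu\nu} = \Gamma^{\lambda}{}_{\mu\nu} - \Gamma^{\lambda}{}_{\nu\mu},$$

which vanishes identically for the Levi-Civita connection used in ordinary general relativity. Teleparallel-style theories instead work with a connection whose curvature vanishes while its torsion does not, so that gravity is carried by torsion rather than curvature.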


Einstein had hoped that the twistiness of space-time would somehow be connected to electromagnetism (the same way that the curvature of space-time is connected to gravity) but alas, he couldn't find any solutions and his new theory died with him.

But other physicists never gave up the dream, and they have been attempting to unify physics ever since. One of the most well-developed concepts is called string theory, which claims that all particles are really tiny little vibrating strings. Oh, and our universe has extra spatial dimensions that are all tiny and curled up.

String theory was never based on Einstein's original idea of the twistiness of space-time, but now physicists are giving that old idea, which is called teleparallel gravity, a second look.

The name "teleparallel" comes from Einstein's original work that examined the nature of distant parallel lines in his geometric framework, exploring how both the curvature and twistiness of space-time affected the motion of matter and energy. Physicists nowadays don't think teleparallel gravity can unify physics (even Einstein himself eventually gave up on the idea), but it may be an interesting candidate for a new theory of gravity.

That's because theorists have been using teleparallel gravity to explain things like the accelerated expansion of the universe, the early period after the Big Bang when the universe ballooned, called "inflation," and more recent problems such as an observed conflict between different measurements of the expansion rate of the cosmos. In other words, teleparallel gravity has proven to be pretty predictive.

But what about those early dreams of a unified theory? Teleparallel gravity may be an interesting and useful new approach to gravity, but it doesn't get us any closer to understanding a more fundamental law of physics. Instead, physicists have been using the language of string theory to do that job, so naturally the question came up: Does string theory, which claims to be an ultimate theory of everything, in any way connect to teleparallel gravity? In other words, if teleparallel gravity can potentially solve all these nasty problems like dark matter and dark energy, does it flow as a natural consequence of string theory, or are these two separate lines that don't have any connection to each other?

Recently, theoretical physicists have begun to tie teleparallel gravity to string theory, providing a motivation for the theory within the stringy universe, as reported in a paper appearing on the preprint server arXiv in November. In their work, they showed how teleparallel gravity can be a consequence of string theory. This is an important insight, because string theory should be able to explain all laws of physics, and if teleparallel gravity is a better version of general relativity, and ultimately turns out to be correct, then you should be able to derive teleparallelism from the math of string theory.

Here's an analogy. Let's say police identify a murder weapon at a crime scene (general relativity). They have a prime suspect (string theory) that they want to connect to the murder weapon. But new analysis of the crime scene reveals that a different weapon (teleparallelism) actually caused the murder. Can the prime suspect still be connected to the new murder weapon?

The short answer is: yes.

There's a lot more work to be done. String theory isn't finished yet (and may never be finished, if we never figure out firm mathematical solutions), so any connection it can make to reality is useful. If teleparallel gravity turns out to be a useful way to explain some of the current shortcomings of general relativity, and we can derive teleparallelism from string theory, then that's one more step in achieving Einstein's ultimate dream of unification not the way he envisioned it, but it still counts.

Originally published on Live Science.


Black dwarf supernovae: The last explosions in the Universe – SYFY WIRE

Here's a happy thought: The Universe may end in a whimper and a bang. A lot of bangs.

Calculations done by an astrophysicist indicate that in the far future, the Universe will have sextillions of objects called black dwarfs, and that eventually they can explode like supernovae. In fact, they may represent the very last things the Universe can do.

But this won't happen for a long time. A very, very, very long time*. So long from now I'm having difficulty figuring out how to explain how long it'll be. I'll get to it (your brain will be stomped flat by it, I promise), but we need to talk a bit first about stars, and nuclear fusion, and matter.

Stars like the Sun release energy as they fuse hydrogen atoms into helium atoms in their cores. It's very much like the way a hydrogen bomb works, but on a massively larger scale; the Sun outputs about the equivalent energy of one hundred billion one-megaton bombs. Every second.
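That "hundred billion bombs per second" figure is easy to sanity-check. Here's a quick back-of-the-envelope sketch using round numbers of my own choosing (the article doesn't include one):

```python
# Rough check of "one hundred billion one-megaton bombs, every second".
SOLAR_LUMINOSITY_W = 3.8e26   # watts, i.e. joules emitted by the Sun per second
MEGATON_TNT_J = 4.2e15        # joules released by a one-megaton explosion

bombs_per_second = SOLAR_LUMINOSITY_W / MEGATON_TNT_J
print(f"{bombs_per_second:.1e}")  # ~9.0e+10, about a hundred billion
```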

Eventually the hydrogen runs out. A lot of complicated things can happen then depending on how massive the star is, what's in it, and more. But for stars up to about 8 to 10 times the mass of the Sun, the outer layers all blow away, exposing the core to space; a core that has become a ball of material so compressed that weird quantum mechanics rules come into play. It's still made up of atomic nuclei (like oxygen, magnesium, neon, and such) and electrons, but they're under incredible pressures, with the nuclei practically touching. We call such a material degenerate matter, and the object itself is called a white dwarf.

For stars like this, that's pretty much the end of the road. The kind of fusion process they enjoyed for billions of years, thermonuclear fusion, where (hugely simplified) the atomic nuclei are so hot they slam into each other and fuse, can't work any more. The white dwarf is born very hot, hundreds of thousands of degrees Celsius, but without an ongoing heat source it begins to cool.

That process takes billions of years. White dwarfs that formed in the early Universe are just now cool enough to be red hot, around 4,000 C.

But the Universe is young, only about 14 billion years old. Over very long periods of time, those white dwarfs will cool further. Eventually, they'll cool all the way down to just about absolute zero: -273C. That will take trillions of years, if not quadrillions. Much much longer than the Universe has already existed.

But at that point the degenerate matter objects won't emit any light. They'll be dark, which is why we call them black dwarfs.

So is that it? Just black dwarfs sitting out there, frozen, forever?

Well, maybe not, and this is where things start to get weird (yes, I know, they're already weird, but just you wait a few paragraphs). Currently, physicists think that protons, one of the most basic of subatomic particles, can decay spontaneously. On average this takes a very long time. Experimental evidence has shown that the proton half-life may be at least 10^34 years. That's a trillion trillion times longer than the current age of the Universe.

If true, that means that the protons inside the atomic nuclei in the black dwarfs will decay. If they do, then after some amount of time, 10^35 or more years, the black dwarfs will evaporate. Poof. Gone. At that point all that will be left are even denser neutron stars and black holes.

But proton decay, while predicted by current particle theory, hasn't yet been observed. What if protons don't decay? What happens to black dwarfs then?

That's where this new paper comes in. It turns out that there are other quantum mechanics effects that become important, like tunneling. Atomic nuclei are loaded with protons, which have a positive charge, so the nuclei repel each other. But they are very close together in the center of the black dwarf. Quantum mechanics says that particles can suddenly jump in space very small distances (that's the tunneling part, and of course it's far more complicated than my overly simple synopsis here), and if one nucleus jumps close enough to another, kablam! They fuse, form a heavier element nucleus, and release energy.

This is different than thermonuclear fusion, which needs lots of heat. This kind doesn't need heat at all, but it does need really high density, so it's called pycnonuclear fusion (pycno in ancient Greek means dense).

Over time, the nuclei inside the black dwarf fuse, very very slowly. The heat released is minimal, but the overall effect is that they get even denser. Also, like in normal stars, the nuclei that fuse create heavier nuclei, up to iron.

That's a problem. The effect holding the star up against its own intense gravity is degeneracy pressure between electrons. When you try to fuse iron, it eats up electrons. If enough iron fuses, the electrons go away, the support for the object goes with it, and it collapses.

This happens with normal stars too. They have to be pretty massive, more than 8 to 10 times the mass of the Sun (so the core is at least 1.5 or so times the Sun's mass). But for stars like those the core suddenly collapses, the nuclei smash together and form a ball of neutrons, what we call a neutron star. This also releases a lot of energy, creating a supernova.

This will happen with black dwarfs too! When enough iron builds up, they too will collapse and explode, leaving behind a neutron star.

But pycnonuclear fusion is an agonizingly slow process. How long will that take before the sudden collapse and kablooie?

Yeah, I promised earlier that I'd explain this number. For the highest mass black dwarfs, which will collapse first, the average amount of time it takes is, well, 10^1,100 years.

That's 10 to the 1,100th power. Written out, it's a 1 followed by eleven hundred zeroes.

I... I don't have any analogies for how long that is. It's too huge a number to even have any kind of rational meaning to the pathetic globs of meat in our skulls.

I mean, seriously, here it is written out:

That's a lot of zeroes. Feel free to make sure I got the number right.

I tried to break it down into smaller units that make sense, but c'mon. One of the largest numbers we named is a googol, which is 10^100, a one followed by 100 zeroes.

The number above is a googol^11, a googol to the 11th power.
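A one-liner sanity check of that exponent arithmetic (my sketch, just confirming the claim above):

```python
googol = 10**100

# A googol to the 11th power is 10^(100*11) = 10^1,100, the collapse timescale quoted above.
assert googol**11 == 10**1100

print(len(str(googol**11)) - 1)  # zeroes after the leading 1 -> 1100
```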

And that's the black dwarfs that go first. The lowest mass ones take much longer.

How much longer? I'm not terribly glad you asked. They collapse after about 10^32,000 years.

That's not a typo. It's ten to the thirty-two-thousandth power. A one with 32,000 zeroes after it.

OK then.

I'll note that this is for stars that start out more massive than the Sun. Stars like ours aren't massive enough to get the pycnonuclear fusion going (they don't have enough mass to squeeze the core into the density needed for it), so when they turn into black dwarfs, that's pretty much it. After that, nothing.

Assuming protons don't decay, I'll note again. They probably do, so perhaps this is all just playing with physics without an actual outcome we can see (not that we'll be around to anyway). Or maybe we're wrong about protons, and in that unimaginably distant future the Universe will consist of neutron stars, black holes, low mass black dwarfs like the Sun, and something like a sextillion black dwarfs that will one day collapse and explode.

Black holes, I'll note, evaporate as well, and the last of those should go in less than a googol years. If so, then black dwarf supernovae may be the last energetic events the Universe can muster. After that, nothing. Heat death. Infinite cold for infinite time.

Oh hey, it gets worse. The Universe is expanding, but the part of it that we can see, the observable Universe, is actually shrinking. This has to do with dark energy and the accelerated expansion of the Universe, which I have explained elsewhere. But by the time the black dwarfs start to explode, the Universe we can see will have shrunk to the size of our own galaxy. Well, what's left of it by then. Odds are the black dwarfs will be scattered so far by then that there won't even be one in our observable frame.

That's a rip-off. You'd think that waiting that long would have some payoff.

So why go through the motions to calculate all this? I actually think it's a good idea. For one thing, science is never wasted. It's possible this may all be right.

Also, the act of doing the calculation could yield interesting side results, things that have implications for the here-and-now that might be observable (like the decay of protons). There could be some tangible benefit.

But really, for my money, this act of spectacular imagination is what science is all about. Push the limits! Exceed the boundaries! Ask, "What's next? What happens after?" This expands our borders, pushes back at our limitations, and frees the brain, within the limits of the known physics and math, to pursue avenues otherwise undiscovered.

Seeking the truth can be a tough road, but it does lead to understanding, and there's beauty in that.

*That links to an article written by my SYFY WIRE colleague Jeff Spry about this topic when it first came out a while back. He gives a good summation of it, but after reading the paper myself I wanted to do a deeper dive. And, to be honest, I could write an article three times this long on this topic. There's a lot going on here.


Orford 17-year-old is among brightest young minds in north west – Warrington Guardian

A 17-year-old boy from Orford was invited to an online celebration of some of the brightest young minds in the region thanks to his competition essay on the theories of quantum mechanics.

Thomas Shaw, who studies A-Level Biology, Chemistry and Maths, entered his 2,000-word essay on the fundamental physics theory into the Pembroke North Essay competition.

The competition is run by Pembroke College in Oxford, aiming to support and inspire the next generation of undergraduates as they consider their university choices.

Thomas said: "I found the event interesting and I particularly enjoyed being able to ask questions to students about life on campus within the current COVID-19 restrictions.

"Although I didn't win the prize, I was able to further develop my writing style in anticipation of completing my Extended Project as well as gain a deeper understanding about quantum mechanics."

The former Cardinal Newman RC School pupil entered the competition over summer after hearing about it through Priestley College's Graduate programme, which is designed to improve students' chances of securing places at the UK's top universities.

Ian Hughes, who helps Priestley students who are aiming for the most competitive universities, said Thomas was on his way to achieving his goal of studying Chemistry at Oxford.

He said: "The dedication, knowledge and skills he has shown in producing this essay prove he has the calibre to achieve whatever he sets his heart on."

All participants received feedback from the postgraduates who marked their essays.

The online celebration covered Oxford admissions as well as a lecture from Peter Claus, who is the access fellow at Pembroke, discussing the history of Eugenics.

Priestley College is part of The Challenge Academy Trust in Warrington, offering a mix of A-Levels and vocational options and soon to be one of the first colleges in the country to offer T-Levels.

Priestley was also the first sixth form college in the country to be awarded STEM assured status, meaning it provides some of the best Science, Technology, Engineering and Maths education in the country.
