
The universe might have a fundamental clock that ticks very, very fast – Science News

Like a metronome that sets the tempo for a musician, a fundamental cosmic clock may be keeping time throughout the universe. But if such a clock exists, it ticks extremely fast.

In physics, time is typically thought of as a fourth dimension. But some physicists have speculated that time may be the result of a physical process, like the ticking of a built-in clock.

If the universe does have a fundamental clock, it must tick faster than a billion trillion trillion times per second, according to a theoretical study published June 19 in Physical Review Letters.

In particle physics, tiny fundamental particles can attain properties by interactions with other particles or fields. Particles acquire mass, for example, by interacting with the Higgs field, a sort of molasses that pervades all of space (SN: 7/4/12). Perhaps particles could experience time by interacting with a similar type of field, says physicist Martin Bojowald of Penn State. That field could oscillate, with each cycle serving as a regular tick. "It's really just like what we do with our clocks," says Bojowald, a coauthor of the study.


Time is a puzzling concept in physics: Two key physics theories clash on how they define it. In quantum mechanics, which describes tiny atoms and particles, time "is just there. It's fixed. It's a background," says physicist Flaminia Giacomini of the Perimeter Institute in Waterloo, Canada. But in the general theory of relativity, which describes gravity, time shifts in bizarre ways. A clock near a massive object ticks slower than one farther away, so a clock on the surface of the Earth lags behind one aboard an orbiting satellite, for example (SN: 12/10/18).
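
For context, that gravitational slowdown follows a standard general-relativity formula (shown here for reference; it is not quoted in the article): a clock held at rest at a distance r from a mass M advances by

$$ d\tau = dt\,\sqrt{1 - \frac{2GM}{rc^{2}}}, $$

so the smaller r is, the more the clock lags behind a faraway one.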

In attempts to combine these two theories into one theory of quantum gravity, "the problem of time is actually quite important," says Giacomini, who was not involved with the research. Studying different mechanisms for time, including fundamental clocks, could help physicists formulate that new theory.

The researchers considered the effect that a fundamental clock would have on the behavior of atomic clocks, the most precise clocks ever made (SN: 10/5/17). If the fundamental clock ticked too slowly, these atomic clocks would be unreliable because they would get out of sync with the fundamental clock. As a result, the atomic clocks would tick at irregular intervals, like a metronome that can't keep a steady beat. But so far, atomic clocks have been highly reliable, allowing Bojowald and colleagues to constrain how fast that fundamental clock must tick, if it exists.

Physicists suspect that there's an ultimate limit to how finely seconds can be divided. Quantum physics prohibits any slice of time smaller than about 10⁻⁴³ seconds, a period known as the Planck time. If a fundamental clock exists, the Planck time might be a reasonable pace for it to tick.

To test that idea, scientists would need to increase their current limit on the clock's ticking rate (that billion trillion trillion times per second number) by a factor of about 20 billion. That seems like a huge gap, but to some physicists, it's unexpectedly close. "This is already surprisingly near to the Planck regime," says Perimeter physicist Bianca Dittrich, who was not involved with the research. "Usually the Planck regime is really far away from what we do."
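
As a rough back-of-the-envelope check (assuming the quoted bound corresponds to 10³³ ticks per second, i.e. a billion trillion trillion), the gap to the Planck rate does come out near the factor of 20 billion cited above:

```python
# Back-of-the-envelope check of the "factor of about 20 billion" gap.
planck_time = 5.39e-44            # seconds, roughly sqrt(hbar * G / c**5)
planck_rate = 1 / planck_time     # ~1.9e43 ticks per second
current_bound = 1e33              # ticks per second, the atomic-clock-derived limit
print(f"{planck_rate / current_bound:.2e}")  # ~1.86e+10, i.e. roughly 20 billion
```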

However, Dittrich thinks that there's probably not one fundamental clock in the universe, but rather a variety of processes that could be used to measure time.

Still, the new result edges closer to the Planck regime than experiments at the world's largest particle accelerator, the Large Hadron Collider, Bojowald says. In the future, even more precise atomic clocks could provide further information about what makes the universe tick.

See the original post here:

The universe might have a fundamental clock that ticks very, very fast - Science News


Epigenetics and pandemics: How allopathy can turn into a curse from a cure – The Times of India Blog

As a Gen-X child, I am lucky to have watched the birth of modern genetics and its golden period, a phase (similar to the one classical physics enjoyed in the Newtonian era) when we felt we were on the verge of unveiling the ultimate secret of life.

When Watson and Crick discovered the structure of DNA, the code of life, it was a serendipitous shock, as scientists felt that they now had a key to understanding everything about life.

As genetics moved forward, it appeared as if each life-form was constructed using a set of instructions and nothing more, and the quest was all about reading that code.

As genetic expression was presumed to be based only on the code available in the DNA, it was felt that body construction, and the pathology it would lead to, was completely governed by the code, with no real way of altering it. For example, if your genome has a specific error, say an extra (third) copy of chromosome 21, you have no escape from developing Down syndrome.

As genetics offered a very clear cause-and-effect relationship model, it made allopathy feel very happy with itself, because it strengthened the belief that doctors already held thanks to the discovery of pathogens that cause diseases.

So the early days of genetics were also the golden age of allopathy, which was already empowered by antibiotics that killed pathogens and cured diseases, and which now believed that finding a way to correct genetic errors would cover the rest of the systemic maladies.

Unfortunately for us today, both these optimistic beliefs of allopathy have taken a severe beating, and allopathy is now on the verge of breakdown.

As evolution has started blunting the edge of antibiotics, allopathy is now desperately trying to find newer toxins to be a step ahead of the pathogens that are fast developing resistance, but it looks like a hopeless quest now.

While evolution is beating allopathy on a front where it had arguably won some great battles, on the genetic front the situation is not looking too good either, thanks to a newly discovered concept called epigenetics.

In the early years of genetics, DNA looked like an instruction manual written in a linear way to build a life-form. Each protein had its code and each process had a fixed assembly line, so there was a clear one-to-one relationship and hence the comfort of predictable cause-and-effect logic that science thrives on was available.

Unfortunately, scientists soon realised that the book of life was not as simple or linear. It was actually a book that you have to keep flipping through because it had multiple options for a given decoding.

The science of epigenetics is based on this new understanding that the DNA code is read by life depending on the given situation.

In simple terms, it is like a book where you read the instructions of what to do on page 32 if a man coming at you is wearing a white shirt; but, if he is wearing a blue shirt, you need to read the instructions on page 245.

Similarly, the book of genes gets read depending on external circumstances, and hence genetics now carries the prefix "epi", i.e. "outside of", to describe it more correctly.
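
A toy sketch of that conditional-reading idea (purely illustrative; the names and structure below are invented for the analogy and are not real biology or any published model):

```python
# Toy illustration of context-dependent "reading" of the same genetic book.
def express(genome: dict, signal: str) -> str:
    """Return the instruction that gets 'read' for a given external signal."""
    page = genome["epigenetic_index"].get(signal, "default")
    return genome["instructions"][page]

genome = {
    "instructions": {
        "page_32": "make protein A",
        "page_245": "make protein B",
        "default": "stay dormant",
    },
    # The 'epigenetic' layer decides which page of the same book gets read.
    "epigenetic_index": {"white_shirt": "page_32", "blue_shirt": "page_245"},
}

print(express(genome, "blue_shirt"))  # -> make protein B
```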

So, epigenetics did what quantum physics did to classical physics. It destroyed the hope of having a deterministic view of a life-form, and what Covid-19 has done today is tell that secret to the whole world.

While biology or genetics is not mainstream information that the masses are aware of, thanks to the coronavirus pandemic the whole world has seen how the great allopathy, which claimed to have the best cause-and-effect understanding of the human body and its diseases, actually failed completely to answer even the simplest questions.

It is about time allopathy recognises that the local cause-and-effect model it is pursuing is not the only way to look at health and healthcare.

There are deeper and bigger systems at play in each illness, and hence the brute-force cure of antibiotics or same-treatment-for-everyone can't be looked at as the future of healthcare.

We need a new allopathy that is ready to grow beyond the current idea of local cause-and-effect and widen its scope to understand the larger global systems that impact behaviour of the micro-systems it is focusing on.

If allopathy is not re-invented soon, it will cause far too many disruptions in larger systems (like what antibiotics have done to the web of life), and if those disruptions are agitated into cascading into a problem (as we can see with the HIV or coronavirus pandemics), they have the power to send our species down the path to extinction in a jiffy.

Allopathy may have cured a billion individuals in its golden age, but it is about to turn from a cure into a curse for the human species at large. It needs to grow into a holistic system that recognises the idea of optimisation in this chaotic, interwoven universe instead of struggling to find cause-and-effect relationships in local systems.

DISCLAIMER : Views expressed above are the author's own.

Continue reading here:

Epigenetics and pandemics: How allopathy can turn into a curse from a cure - The Times of India Blog


What is the most important phrase in all science according to the Nobel Prize for Physics Richard Feynman and why – Explica

What would be your message?

If, in some cataclysm, all scientific knowledge were destroyed but we had the opportunity to pass on a single sentence to the next generations of creatures, what should that sentence be?

That is the question that physicist Richard Feynman posed to some undergraduate students one day in 1961, in one of his legendary lectures given at the California Institute of Technology, or Caltech.

If you are taking pity on the poor students, put aside the pity.

Not only did Feynman himself answer the question immediately, but the students were fortunate enough to be sitting before the man widely regarded as the most influential physicist since Albert Einstein.

On top of that, he was the most charismatic, fun and irreverent teacher they could have had.

In short, he was one of the most extraordinary scientists of the 20th century, and someone it hurts to compare yourself to.

He was born in 1918 into a working-class family in New York, USA, grew up during the Depression, and at age 17 won a math contest in which his talent for the subject was clear.

That same year, he went to study at the Massachusetts Institute of Technology, MIT, and then moved to Princeton, achieving a top score on the mathematics and physics entrance exam, an unprecedented feat.

But soon after, he received sad news: Arline Greenbaum, his girlfriend, had tuberculosis, a disease for which there was no cure at the time. Feynman decided to marry her so that he could take care of her.

Richard and Arline married in 1942, when he was 24 years old and she was 22, under the shadow of a disease that was incurable at the time. (Credit: Science Photo Library)

Soon, another threat loomed over the couple: a few months before Richard and Arline were married, the United States had entered World War II, after the attack on Pearl Harbor.

Feynman was asked to join a top-secret project based at a government laboratory in Los Alamos, New Mexico. Code-named the Manhattan Project, its goal was to build an atomic bomb.

"Germany was the intellectual center of theoretical physics and we had to make sure that they did not rule the world. I felt like I should do it to protect civilization," Feynman said.

Extraordinary physicists of the stature of Julius Robert Oppenheimer, Niels Bohr, and Enrico Fermi combined their intellectual abilities, but the challenge of developing an atomic bomb so quickly was a titanic task.

A fundamental problem was the large volume of calculations required. Without computers, everything had to be done manually, greatly hampering progress.

As part of the Manhattan Project, Feynman made teams of human computers work at an inhuman pace. (Credit: Science Photo Library)

Feynman devised a way to do calculations in parallel, drastically reducing problem-solving time.

He became a key member of the team, but he also made a name for himself by playing tricks like cracking the locks behind which top-secret documents were kept, just to show that he could.

While he was at Los Alamos, he received the sad news that his wife, who was confined to a nearby sanatorium, had died.

She was 25 years old. He was 27, with a broken heart.

Shortly after, he was forced to face the reality of what he had helped create.

The devastation left behind by the bomb he had helped create.

The bomb exploded over the Japanese city of Hiroshima on August 6, 1945. It killed more than 80,000 people. Three days later, a second bomb was detonated, in Nagasaki.

Feynman was deeply disturbed to have contributed to the deaths of so many.

In the months after the double trauma, he plunged into a dark depression.

In the fall of 1945, Feynman was invited to become a professor in the Physics Department at Cornell University.

He was still shocked by the events of that summer, but he reflected and remembered that he "used to enjoy physics and mathematics because I played with them," so he decided that he was going to do things just for fun.

Having fun was a priority. (Credit: Science Photo Library)

While Feynman was rediscovering the fun in physics, science was in crisis.

New discoveries about atoms had caused confusion in physics.

The old assumptions about the world were wrong, and there was a troubling new field called quantum mechanics.

Quantum mechanics, in many ways, was the most profound psychological shock that physicists have had in all of history.

Isaac Newton was not right: even if you know everything there is to know about the world, you still cannot predict with perfect precision what will happen next.

Quantum mechanics had revealed the problems of anticipating the behavior of atoms and their electromagnetic forces.

And since they are the fundamental building blocks of nature, everything else was also in doubt.

Electromagnetism is one of the four fundamental forces of the known universe, and it is everywhere.

Think about it: everything that happens around you, apart from gravity, is due to electromagnetism.

When two atoms come together to form a molecule, that is electromagnetism, so all chemistry is electromagnetism. And if all chemistry is electromagnetism, then all biology is electromagnetism.

Literally everything around us is a manifestation of electromagnetism in one way or another.


To try to make sense of electromagnetism and subatomic matter, a new field emerged, called quantum electrodynamics, or QED.

The problem was that while sometimes it seemed to work, other times it didn't make any sense. It was confounding the smartest physicists on the planet, even QED's father, Paul Dirac.

English theoretical physicist Paul Dirac (left) conversing with Feynman in 1962 at the International Conference on Relativistic Theories of Gravitation in Warsaw, Poland. Dirac and Feynman won the Nobel Prize in Physics in 1933 and 1965, respectively. (Credit: Science Photo Library)

Feynman had read a book by Dirac, describing problems that no one knew how to solve.

"I didn't understand the book very well. But there, in the last paragraph of the book, it said: 'Some new ideas are needed here.' So I started thinking of new ideas," Feynman recalled in an interview.

Typically, Feynman approached the matter in an unconventional way: with drawings.

He found a pictorial way of thinking, inventing a brilliant way to bypass the complicated calculations necessary for QED.

The result was Feynman diagrams, which put the finishing touches on QED, the most numerically accurate physical theory ever devised.

The diagrams turned out to be so useful that today they are applied in fields completely different from particle physics, such as calculating the evolution of galaxies and large-scale structure in the Universe.

Drawing, in fact, would later become another of his hobbies, in addition to playing the bongos, which were for him what the violin was for Einstein and the piano for Werner Heisenberg.

He decided to learn to draw in his fourth decade of life, helped by an artist friend, and was so enthusiastic that he adopted a topless bar as his secondary office, where he sketched the girls and physics equations.

But it was the QED-related drawings that earned him the Nobel Prize in Physics, which he shared with Julian Schwinger and Shinichiro Tomonaga in 1965.

Although he accepted the Nobel Prize and had fun at the gala dancing with Gweneth Howarth, his third wife and mother of their two children, Feynman always said that the true award was the pleasure of discovery and seeing that it is useful to other people.

Among those who live in the quantum world, Feynman is also known for works that amaze us, such as the theory of quantum electrodynamics and the physics of the superfluidity of supercooled liquid helium.

Suffice it to say that he was one of the pioneers in the field of quantum computing and that he introduced the concept of nanotechnology.

And his involvement in 1986, when he was already fatally ill, in the Space Shuttle Challenger disaster investigation, in which he revealed what NASA was reluctant to accept about the cause of the ship's disintegration 73 seconds after launch, put him at the center of public attention.

The phrase with which he summarized his conclusions became famous: "For a successful technology, reality must prevail over public relations, since you can't fool nature."

But it was his solution to another physics-related problem, this time in university classrooms, that revealed the gift for communicating science that would make him famous in the outside world.

In the early 1960s, Caltech was struggling as it failed to attract students to physics classes. Looking for ways to get them excited about the subject, they asked Feynman to redo the curriculum.

His work was a series of lectures that were so engaging that they were edited and published under the title The Feynman Lectures on Physics, one of the most popular physics books in history.

It was in the first of those classes that, after confirming that if they wanted to be physicists they would have a lot to study (200 years of the fastest-developing field of knowledge that exists) and warning them that it would take many more years to learn it ("They'll have to go to graduate school!"), he wondered where to start and asked them that question.

But what, for Feynman, was the statement that would contain the most information in the fewest words?

Caltech made all of Feynman's legendary lectures available to the public on the website The Feynman Lectures on Physics: http://www.feynmanlectures.caltech.edu/. (Credit: BBC)

"I think it is the atomic hypothesis (or the atomic fact, or whatever you want to call it) that all things are made of atoms: small particles that move around in perpetual motion, attracting each other when they are a little distance apart, but repelling upon being squeezed into one another."

Why?

"In that single sentence there is an enormous amount of information about the world, if only a little imagination and thought is applied."

If you know that all matter is made of atoms that are constantly moving, you can start to understand phenomena like temperature, pressure and electricity.

They all have to do with the speed at which the atoms are moving, and with how many of them (or which parts of them) are doing so.
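
As a concrete illustration (these are standard kinetic-theory relations, not spelled out in the article), for a dilute gas of atoms of mass m and number density n,

$$ \tfrac{1}{2}m\langle v^{2}\rangle = \tfrac{3}{2}k_{B}T, \qquad P = \tfrac{1}{3}nm\langle v^{2}\rangle = nk_{B}T, $$

so both temperature and pressure are direct measures of how fast the atoms are moving.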

And it can lead you to discover, for example, the power of steam, the pressure of gases and weather patterns, and to invent things like motors, telephones and electric light.

With his lively and lucid explanations, Feynman made abstract concepts tangible, and his warm presence inspired (and continues to inspire, thanks to books and films) the interest and wonder of even the most science-averse. (Credit: Science Photo Library)

The final part of his sentence, which refers to the way atoms interact with each other (attracting and repelling one another), reveals chemistry to you.

Once you understand how atoms come together to form molecules, you can use that knowledge to create antibiotics, vaccines, explosive mixtures of gasoline and air (combustion engines), batteries, asphalt, steel and even the essence of life: amino acids, carbohydrates, DNA.

That is why Feynman chose that phrase as a legacy for creatures starting again after everything was lost (and to spark his students' interest in physics).

Of course, that is not the only answer.

In fact, there are those who criticize it, such as neuroscientist Daniel Toker, who pointed out in an article that, strictly speaking, the atomic hypothesis turns out to be false, because according to quantum field theory, a discipline in whose development Feynman played a key role, (...) subatomic particles are not actually particles, but simply local excitations of quantum fields.

Fortunately, science is not a dogma and as it develops it constantly throws up new possibilities.

Six decades later, the question remains intriguing. And the spirit of the second part of Feynman's answer, eternal.

It will always be urgent to bequeath to the new generations clues so that, with a little imagination and thought, they can discover the world.


Continue reading here:

What is the most important phrase in all science according to the Nobel Prize for Physics Richard Feynman and why - Explica


Pure math takes PhD across the world – University of Victoria News

Discovering pure mathematics

Nine years after transferring to UVic as an undergraduate student, Chris Bruce is leaving with a PhD in Mathematics and a prestigious NSERC Banting Postdoctoral Fellowship.

In that time, he has proven himself to be an exceptional mathematician.

Born and raised in Victoria, Bruce started his undergraduate degree at Camosun College, and had quite a different path in mind. "I thought I'd major in business or economics," he says. He had already started an online business, selling parts for mountain bikes. But after transferring to UVic and taking an introduction to abstract algebra course, he knew his path was changing.

He explores connections between two areas of mathematics: algebraic number theory, an ancient field which deals with prime numbers, rational numbers and their generalizations; and operator algebras, a relatively new field of math that was originally developed to model systems in quantum physics.

If you can come up with a strong enough connection between these two areas, you can open new approaches to solving problems and potentially solve some of the world's most famous unresolved problems in pure mathematics, such as the Riemann hypothesis or Hilbert's 12th problem.

While one might imagine mathematics as being a solitary activity at a desk, Bruce's experiences belie that.

"Working with people, discussing problems and having a back and forth of ideas, working on a proof: that is the most enjoyable part of mathematics for me," Bruce says. To that end, he started a graduate-level seminar in his department, giving graduate students, post-docs and visitors a chance to present to their peers.

During his undergraduate degree, he completed a semester in Moscow. Since then, he's taken courses at the University of Wollongong in Australia, attended workshops at the Hausdorff Research Institute for Mathematics in Germany, and developed collaborations in the United Kingdom and Japan.

Once Bruce is able to travel internationally, he'll be continuing on to Queen Mary, University of London, with a prestigious NSERC Banting Postdoctoral Fellowship, a prize which provides the best applicants with $70,000 per year in funding for two years.

Read more from the original source:

Pure math takes PhD across the world - University of Victoria News


Light From Inside the Tunnel: Advance in Steering and Monitoring the Light-Driven Motion of Electrons – SciTechDaily

Light emission (blue) from the current associated with light-induced electronic tunneling inside a transparent dielectric material due to excitation with a strong optical field (red). Credit: University of Rostock, B. Liewehr

Steering and monitoring the light-driven motion of electrons inside matter on the time-scale of a single optical cycle is a key challenge in ultrafast light wave electronics and laser-based material processing.

Physicists from the Max Born Institute in Berlin and the University of Rostock have now revealed a so-far overlooked nonlinear optical mechanism that emerges from the light-induced tunneling of electrons inside dielectrics. For intensities near the material damage threshold, the nonlinear current arising during tunneling becomes the dominant source of bright bursts of light, which are low-order harmonics of the incident radiation.

These findings, which have just been published in Nature Physics, significantly expand both the fundamental understanding of optical non-linearity in dielectric materials and its potential for applications in information processing and light-based material processing.

Our current understanding of non-linear optics at moderate light intensities is based on the so-called Kerr non-linearity, which describes the non-linear displacement of tightly bound electrons under the influence of an incident optical light field. This picture changes dramatically when the intensity of this light field is sufficiently high to eject bound electrons from their ground state. At long wavelengths of the incident light field, this scenario is associated with the phenomenon of tunneling, a quantum process where an electron performs a classically forbidden transit through a barrier formed by the combined action of the light force and the atomic potential.
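
For reference, the Kerr non-linearity mentioned here is usually written as an intensity-dependent refractive index (a standard textbook form, not taken from the paper itself):

$$ n(I) = n_{0} + n_{2}I, $$

which arises from the third-order term in the expansion of the material polarization, $P = \varepsilon_{0}\left(\chi^{(1)}E + \chi^{(2)}E^{2} + \chi^{(3)}E^{3} + \cdots\right)$.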

Already since the 1990s, and pioneered by studies from the Canadian scientist François Brunel, the motion of electrons that have emerged at the end of the tunnel, which happens with maximal probability at the crest of the light wave, has been considered as an important source for optical non-linearity. This picture has now changed fundamentally.

"In the new experiment on glass, we could show that the current associated with the quantum mechanical tunneling process itself creates an optical non-linearity that surpasses the traditional Brunel mechanism," explains Dr. Alexandre Mermillod-Blondin from the Max Born Institute for Nonlinear Optics and Short Pulse Spectroscopy, who supervised the experiment.

In the experiment, two ultrashort light pulses with different wavelengths and slightly different propagation directions were focused onto a thin slab of glass, and a time- and frequency-resolved analysis of the emerging light emission was performed.

Identification of the mechanism responsible for this emission was made possible by a theoretical analysis of the measurements that was performed by the group of Prof. Thomas Fennel, who works at the University of Rostock and at the Max Born Institute in the framework of a DFG Heisenberg Professorship. "The analysis of the measured signals in terms of a quantity that we termed the effective non-linearity was key to distinguish the new ionization current mechanism from other possible mechanisms and to demonstrate its dominance," explains Fennel.

Future studies using this knowledge and the novel metrology method that was developed in the course of this work may enable researchers to temporally resolve and steer strong-field ionization and avalanching in dielectric materials with unprecedented resolution, ultimately possibly on the time-scale of a single cycle of light.

Reference: "Origin of strong-field-induced low-order harmonic generation in amorphous quartz" by P. Jürgens, B. Liewehr, B. Kruse, C. Peltz, D. Engel, A. Husakou, T. Witting, M. Ivanov, M. J. J. Vrakking, T. Fennel and A. Mermillod-Blondin, 29 June 2020, Nature Physics. DOI: 10.1038/s41567-020-0943-4

Visit link:

Light From Inside the Tunnel: Advance in Steering and Monitoring the Light-Driven Motion of Electrons - SciTechDaily


Adoption of Cloud Computing in Municipalities Aids Public Health, Transportation and Safety – BroadbandBreakfast.com

July 17, 2020 – Adoption of cloud computing services by public entities impacts many civic sectors, said local officials on Wednesday.

In an Amazon Web Services webinar, local representatives from Louisville and Minneapolis detailed how cloud services helped spur innovation in their respective municipalities, benefitting health, transportation and overall safety.

Emily Ward, state planning director for emergency preparedness and response at the Minnesota Department of Health, detailed ways in which the healthcare sector leveraged and repurposed the city's cloud services in response to the pandemic.

The department's information technology sector developed two applications to assist in getting medical supplies to those in need, called POD PreCheck and POD Locator.

PODs, or points of dispensing, are community locations at which state and local agencies dispense medical materials and medications to the public.

POD PreCheck allowed clients to prescreen their conditions electronically, which assisted the Minnesota Department of Health in delivering the best medication to consumers with speed and efficiency, reducing wait times.

POD Locator is a dynamic mapping application that shows the locations of PODs on a searchable map and provides any site-specific instructions.

The scalability offered by the cloud was the most desirable feature, said Ward. "This app will still work if more than 5 million users try to access it."

"It is important that it remains stable," she added.

Meanwhile, across the country, the city of Louisville is leveraging data provided by its open source software and cloud technology to better understand the use of new transportation technologies in the city, such as the rise of electric scooters.

Louisville's IT department created an application that connects mobility companies with local government agencies, in an attempt to safely manage public space.

Data drawn from the application allowed employees of the public IT sector to measure the companies' operational compliance with a geofence the city enforced around a public downtown weekend event where no scooters were allowed to operate.
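
A hypothetical sketch of that kind of geofence compliance check is below. The field names, coordinates and rectangular fence are invented for illustration; this is not Louisville's actual data schema or application logic.

```python
# Flag scooter trips that report any GPS point inside a no-ride geofence.
GEOFENCE = {"min_lon": -85.76, "max_lon": -85.74, "min_lat": 38.25, "max_lat": 38.26}

def inside_fence(lon: float, lat: float) -> bool:
    return (GEOFENCE["min_lon"] <= lon <= GEOFENCE["max_lon"]
            and GEOFENCE["min_lat"] <= lat <= GEOFENCE["max_lat"])

def non_compliant_trips(trips: list[dict]) -> list[str]:
    """Return the trip IDs whose routes enter the no-ride zone."""
    return [t["trip_id"] for t in trips
            if any(inside_fence(lon, lat) for lon, lat in t["route"])]

trips = [
    {"trip_id": "a1", "route": [(-85.75, 38.255)]},  # inside the fence -> flagged
    {"trip_id": "b2", "route": [(-85.70, 38.200)]},  # outside -> compliant
]
print(non_compliant_trips(trips))  # ['a1']
```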

The public data not only revealed non-compliance, but further exposed that new transportation technologies are not distributed equitably.

Through the data, the city found that transportation services were not located in disadvantaged neighborhoods. The city responded by requiring more equitable distribution of services.

Michael Schnuerle, director of open source operations at the Open Mobility Foundation, said that the cities' cloud services play an important role in increasing capabilities to move data across different systems and automating certain data initiatives.

More here:
Adoption of Cloud Computing in Municipalities Aids Public Health, Transportation and Safety - BroadbandBreakfast.com


Cloud Computing Market Worth $765.6 Billion By 2027 | Grand View Research Inc. – MENAFN.COM

(MENAFN - GetNews) As per the World Economic Forum, the fourth industrial revolution will be characterized by a fusion of technologies such as artificial intelligence, internet of things, and cloud computing. AI and cloud computing will complement each other along with IoT to improve technology and catalyze growth. Organizations across various verticals are proactively integrating cloud computing with these evolving technologies.

The global cloud computing market size is expected to reach USD 765.6 billion by 2027, expanding at a CAGR of 14.9%, according to a new study conducted by Grand View Research, Inc. Cloud services are being increasingly adopted by businesses due to their cost-effectiveness, service-related flexibility and real-time service catering nature. They are also making their mark in medium and small enterprises and experiencing more demand due to the increasing number of these small-scale enterprises all over the world. Moreover, technologies such as artificial intelligence and machine learning will complement cloud services to boost organizational growth across industries.
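
For a sense of how that projection compounds (an illustrative calculation only; the base year and eight-year window below are assumptions, not figures from the report):

```python
# What a 14.9% CAGR to USD 765.6 billion in 2027 would imply, assuming an
# eight-year compounding window (2019 base). Purely illustrative.
target_2027 = 765.6   # USD billion
cagr = 0.149
years = 8
implied_base = target_2027 / (1 + cagr) ** years
print(round(implied_base, 1))  # ~252.0 (USD billion) under these assumptions
```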

COVID-19 Effect

COVID-19 has affected work culture in a big way. There has been a general shift towards a work-from-home culture due to lockdowns all around, and it has proved to be a novel boosting factor for the cloud computing market. Also, companies are nowadays looking at cloud technology as something that can boost their efficiency while lowering the cost of running the business. Although these two factors have provided a big push to the adoption of cloud computing, security issues are considered an obstacle. With the rising adoption of cloud services and the work-from-home culture, security issues have also increased, and companies are striving to reduce their occurrence and minimize the losses. New technologies and firewalls are coming up to make online services safer and more secure, which will ensure a healthy online environment for businesses.


At present, cloud services are utilized across various industries, and most organizations rely on IT resources to conduct their day-to-day work. In fact, governments are also making a move towards cloud services and helping generate growth for the market. Some of the important sectors the cloud computing market caters to are:

Infusion of Big Data is extremely important to foster market growth, as it will lead to the replacement of traditional data warehouses by cloud computing technology, owing to their inability to manage and analyze the volume, veracity and variety of Big Data. This will help create good demand in the market, leading to a higher growth rate.

Cloud computing market report highlights:


Major Market Players:

Some Recent Developments:

Thanks for reading this article; you can also get individual chapter wise section or region wise report versions like North America, Europe, or Asia.

If you need specific information that is not currently within the article, we will provide it to you as a part of customization. Explore the BI enabled intuitive market research database, Navigate with Grand View Compass, by Grand View Research, Inc.



See original here:
Cloud Computing Market Worth $765.6 Billion By 2027 | Grand View Research Inc. - MENAFN.COM


Managing the Impact of Cloud Computing – The CPA Journal

Cloud computing is in the vanguard of a global digital transformation. This article looks at how to identify cloud computing opportunities and operationalize cloud activities. It also defines the stakeholders involved in the enterprises risk management strategy and shared responsibility model. Finally, the article provides advice on how to manage the disruption caused by the adoption of cloud computing.

***

A fourth Industrial Revolution is underway globally: a digital revolution driven by the rapid, wide-scale deployment of digital technologies, such as high-speed mobile Internet capabilities, artificial intelligence (AI), and machine learning. Cloud computing is at the vanguard of this transformation. As a result, organizations of all sizes, sectors, and geographies have substantially and rapidly increased their use of cloud computing. According to Gartner (2019), more than one-third of organizations see cloud investments as a top-three priority. The public cloud services market is projected to reach a staggering $266 billion in 2020.

One driver in this proliferation and widespread use of cloud computing is the current digital transformation. In a 2016 address, Microsoft CEO Satya Nadella advanced this enduring description of digital transformation: "becoming more engaged with their customers, empowering their employees, optimizing how they run their business operations and transforming the products and services they offer using digital content." Such benefits from a cloud computing perspective include managing and outsourcing costly and difficult-to-update and -manage in-house IT infrastructure; streamlining and scaling storage, software, and application support; increasing speed and processing; and reducing costs. As a result, organizations of all sizes, geographies and sectors, including CPA firms and their clients, are developing their own private clouds or purchasing public cloud services from cloud service providers (CSP), such as Microsoft Azure and Amazon AWS.

While such potential benefits are compelling, market intelligence reveals that cloud computing exacerbates risks and creates new and unexpected risks. For example, a cloud security breach exposed the names, addresses, and account details of as many as 14 million U.S.-based Verizon customers. In this context, one can only imagine the potential cloud-related cybersecurity breaches and service failures that may emerge from the unexpected disruption and rapid transformation to remote working caused by the current coronavirus (COVID-19) pandemic. On the one hand, workers unexpectedly transitioning to remote working have been enabled in part by cloud computing to immediately, rapidly, and seamlessly access necessary data, software, and applications. On the other hand, such an unanticipated disruption and rapid transformation has exacerbated existing risks and created new risks as workers access data from remote locations; for example, breaches in data confidentiality, unauthorized access, and system availability failures.

This disruptive cloud paradigm raises questions from corporate boards, managers, regulators, and assurance providers concerning cloud strategy, performance, risks, and controls. Such questions include: the scope and location of cloud activities; the implications of dependency on a web of cloud service provider (CSP) vendors; reputation, intellectual property, financial statement and market trust vulnerabilities; global jurisdiction regulatory compliance; as well as the adequacy of risk management, cybersecurity, audit, and change management. This article looks at cloud computing opportunities, risks, and resiliency strategies, including enterprise risk management, CPA firm assurance, and change management.

The National Institute of Standards and Technology (NIST) defines cloud computing as a means for enabling on-demand access to shared pools of configurable computing resources (e.g., networks, servers, storage, applications, services) that can be rapidly provisioned and released. In simple terms, the cloud is a massive cluster of super-sized servers housed in locations scattered around the globe (i.e., cloud farms). Cloud farms are operated by CSP vendors such as Amazon AWS; these vendors provide a range of hosting services.

Some organizations are adopting a cloud-first strategy for new systems or when replacing systems. Popular cloud deployment models include private clouds, public clouds, hybrid clouds, and community clouds; Exhibit 1 defines each model. Popular CSP cloud services include Infrastructure as a Service (IaaS), Software as a Service (SaaS), and Platform as a Service (PaaS); Exhibit 2 defines each service. Pay-as-you-go (i.e., when customers are billed based on their levels of usage) is a popular pricing model.

Exhibit 1: Cloud Computing Services Deployment Models, per NIST

Exhibit 2: Three Primary Models of Cloud Services, per NIST
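
As a quick illustration of the pay-as-you-go pricing model mentioned above (a hypothetical sketch; the rates and usage figures are invented and do not reflect any CSP's actual pricing):

```python
# Hypothetical pay-as-you-go invoice: the bill is the sum of usage times unit rate.
usage = {"compute_hours": 720, "storage_gb_months": 500, "egress_gb": 120}
rates = {"compute_hours": 0.10, "storage_gb_months": 0.023, "egress_gb": 0.09}

monthly_bill = sum(quantity * rates[item] for item, quantity in usage.items())
print(f"${monthly_bill:.2f}")  # $94.30 under these made-up rates
```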

Cloud computing also changes organizations. According to Deloitte (2020), "Executives extend the enterprise every time they use a cloud service, outsource a business process, or otherwise spread operations beyond the traditional four walls of their organization." In a cloud computing context, this extended enterprise creates a complex web of distributed, interconnected, and interdependent shared-responsibility participants, including employees (i.e., first party), customers (i.e., second party), vendors, and their hired subcontractors (i.e., third, fourth, and fifth parties). Exhibit 3 depicts this web of extended relationships.

Exhibit 3: Extended Enterprise: Web of Data Sharing and Cloud Computing

The cloud also democratizes and decentralizes IT activities; that is, non-IT employees are capable of developing applications and given the authority to contract directly with CSPs outside of the centralized IT procurement process.

Cloud-driven changes, such as the following, also impact the CFO organization.

The cloud also exacerbates existing risks, creates new and unexpected risks, and stretches the limits of governance, risk management, cybersecurity, internal audit, assurance, and change management. For CPA firms and their clients, this cloud disruption requires a what-can-go-wrong analysis.

As far back as 2013, McKinsey warned, "Large institutions, which have many types of sensitive information to protect and many cloud solutions to choose from, must balance potential benefits against, for instance, risks of breaches of data confidentiality, identity and access integrity, and system availability." More recently, IDC (2018) reported that 50% of security professionals spend most of their time securing the cloud. In 2019, the Cloud Security Alliance (CSA) advanced their top-11 cloud security threats. Exhibit 4 presents the CSA's 11 threats.

Exhibit 4: Cloud Security Alliance (CSA) Top 11 Threats to Cloud Computing (2019)

In spite of such warnings, recent cloud-breaches such as the following continue to emerge:

In 2019, Gartner advanced the following predictions concerning cloud security:

The wave of breaches suggests cloud computing is risky: it exacerbates existing risks (i.e., known-knowns), creates new risks (unknown-knowns), and brings unforeseeable risks (unknown-unknowns). For example, consider the following service availability and cyber-risks associated with the geographic location of the cloud servers a company is relying on:

Sector-level regulations will play an important role in contributing to addressing such risks. For example, a customized set of standards has been developed under the umbrella of the U.S. Federal Risk and Authorization Management Program (FedRAMP) to authorize the use of cloud services. HIPAA regulations that focus on governing cloud resources offered by a CSP are another sector example. The HIPAA Privacy, Security, and Breach Notification Rules establish important protections for individually identifiable health information when created, received, maintained, or transmitted by a HIPAA-covered entity or business associate (e.g., a CSP). For example, CSP-related SLAs should include provisions that address HIPAA-related requirements, including system availability and reliability; backup and data recovery; the manner in which data will be returned to customers after service use termination and security responsibility; and use, retention, and disclosure limitations.

Regulatory compliance alone will not suffice. To mitigate risk, an organization should conduct a holistic, enterprise-wide what-can-go-wrong analysis, including an analysis of cyber-security risks and a single-point-of-failure risk analysis associated with their cloud ecosystem. A what-can-go-wrong analysis posits the question: Are CPA firms and their clients prepared to respond to cloud risks?

Cloud computing disrupts organizations, raising questions about its impact on governance, compliance, risk management, cybersecurity, audit and change management.

The KPMG Audit Committee Institute highlighted understanding technology's impact (with a reference to cloud computing) as one of its seven items to consider for the audit committee's 2020 agenda. In this context, an organization needs transparency into the nature, scope, and location of CSP vendors and the performance of their cloud activities. The board, senior management, and CPAs should ask the following questions:

While these questions may seem fundamental, market intelligence suggests that some organizations are unclear about the nature, scope, and locations of their cloud activities.

One reason for this is shadow IT activities. This refers to empowered employees scattered throughout the organization that are adopting cloud services under the radar of the IT department. According to Gartner, most organizations grossly understate the number of shadow IT applications already in use. A continuously updated inventory of the current state of organization-wide cloud activities is essential for conducting a holistic analysis of cloud performance and risk.

The linkage of objectives and risks is a foundational premise of enterprise risk management (ERM) frameworks. The International Organization for Standardization (ISO) defines risk as the "effect of uncertainty on objectives." For cloud computing, such objectives may include privacy, availability, productivity, reliability, compliance, cost transparency, and cost savings. The Committee of Sponsoring Organizations of the Treadway Commission (COSO) ERM framework, Enterprise Risk Management: Integrating Strategy with Performance (https://www.coso.org/Documents/2017-COSOERM-Integrating-with-Strategy-and-Performance-Executive-Summary.pdf), makes explicit the linkage of performance objectives and risk.

An ERM approach can also contribute to cyber-resiliency: the ability to rapidly and fully recover from system failures and security breaches. In a 2020 financial service industry report, Thomson Reuters identified cyber-resiliency as a key regulatory risk, asserting that "senior individuals need to ensure cyber-risks are expressly included in the range of risks considered, and the board is prepared to discuss the actions taken to ensure all possible has been done to embed cyber-resilience throughout the firm." The organization's incident response plan, including plans for incident-handling and information-spilling response, should be an integral part of cyber-security policy and an ERM analysis. In summary, an ERM analysis that integrates cloud computing can contribute to cloud performance; managing cloud risk; rapid, timely, and proper incident response; change management; and resiliency.

An ERM analysis will also assist CPA firms and other assurance providers with identifying and assessing risks and controls, as well as the nature, timing, and extent of audit and attestation procedures selected. Exhibit 5 presents an example of an ERM analysis.

Exhibit 5: Sample Enterprise Risk Management (ERM): Cloud Risk Analysis

Cloud computing is disrupting CPA firms, their clients, and the traditional norms of the external audit and quality control. In its 2020-2021 Strategy Plan, the AICPA Auditing Standards Board (ASB) addressed this issue: "Rapid developments in technologies are having a profound effect on audit and assurance engagements, including the use of automated tools and techniques and changes in how engagement teams are structured and interact." In Initiative D, "Keep our standards relevant in a changing environment," the ASB commits to monitoring the use of innovative technologies and determining whether the standards in place for the acceptance of clients and service performance are appropriate.

Cloud computing impacts CPA assurance providers in a range of ways: for example, obtaining an understanding of the audit client's cloud environment; identifying and assessing risks of material misstatement (RMM); defining the role to be served by System and Organization Controls (SOC) reports; and assessing the impact of the client's and the firm's cloud computing activities on the firm's compliance with GAAS Quality Control (QC) Standards.

Audit clients are increasingly moving some or all of their accounting systems and financial statement data to public clouds. This cloud transition introduces complexity, disruption, and risk.

For example, a cloud computing environment often integrates third-party CSPs and potentially fourth-party sub-contracted CSPs (Exhibit 3) into the client's accounting system and control environment. Such a complex web of CSPs results in shared responsibilities between the client and CSPs for financial accounting data, cybersecurity, internal controls over financial reporting (ICFR), service organization controls (SOC) reporting, and assurance services.

Such material changes to the control environment and accounting system require auditors to obtain an understanding of the company's environment and risks as a basis for assessing the risk of material misstatement (RMM) of the financial statements, as prescribed by PCAOB Auditing Standard (AS) 2110.

A prudent starting point for obtaining a preliminary understanding of a company's cloud environment and risks is the analysis of the inventory of audit client cloud activities, including the nature and extent of third- and fourth-party CSP vendors and any material changes in such arrangements during the period under audit. The audit client will be the primary source for obtaining an understanding of the current state of the cloud. Market intelligence suggests, however, that some organizations may not have an up-to-date current-state analysis of their cloud activities. If documentation does not exist, this will impact (i.e., increase) RMM and may require additional audit procedures (e.g., walkthroughs), specialized cloud audit skills, and higher audit fees.

SOC for Service Organizations reports are internal control reports on the third-party services provided by an outsourcing service organization (e.g., a CSP). AICPA SOC reports are subject to standards AT-C section 320 and SSAE 18. The following SOC reports are available in this category: SOC 1, SOC 2, SOC 3, and SOC for Cybersecurity. Exhibit 6 defines each report.

Exhibit 6: Types of AICPA SOC Reports

For audit clients with material cloud computing operations, the selection of report type, as well as the right to conduct such services, will be based upon a range of factors, including the type of assurance service, the audit client's cloud footprint, the web of third- and fourth-party CSP vendors, shared control responsibility agreements, and the terms of service-level agreements (SLA) with CSPs.

One of the six elements of the AICPA quality control (QC) standards deals with client acceptance and retention, requiring consideration of whether the CPA firm is competent to perform the engagement and has the capabilities, including time and resources, to do so. Another element is associated with human resources, requiring the CPA firm have sufficient personnel with the competence and capabilities to perform engagements in accordance with professional standards and applicable legal and regulatory requirements. To comply with these QC audit standards in a cloud computing assurance engagement, CPA firms will need to assess the demand for, and timely availability of, the necessary specialized skills.

Another important element of the AICPA QC standards covers new client acceptance and retention of existing clients. Such QC considerations include the following:

A CPA firm will need to make selective changes to accept cloud computing-related engagements, such as training staff, securing subject experts, and protecting the privacy of client data accessed through the client's and their CSPs' clouds and stored on the CPA firm's clouds.

The emergence of cloud computing and the incipient digital transformation of business is having a profound impact on the traditional techniques and services provided by CPA firms. Organizations adopting or leveraging cloud computing should obtain a continuous update of their inventory of cloud activities, including the nature, scope, and locations of their cloud activities; conduct a holistic, enterprise-wide, what-can-go-wrong analysis, including cybersecurity risks and single-point-of-failure risks associated with their cloud ecosystem; and perform an analysis of cloud computing resiliency, including an ERM analysis of cloud performance, security risk, and change management risk. CPA firms adapting to digital disruption and transformation must obtain an understanding of the implications of cloud computing on their clients business and control environment; analyze risks of material misstatement and cybersecurity risks; assess cloud controls; and manage cloud-informed changes to the CPA firms QC processes and compliance.

Meredith Stein, CPA, leads the NIH Risk Management Program at the National Institutes of Health (NIH), Bethesda, Md. The views expressed are her own and do not necessarily represent the views of the NIH or the United States Government. She began her career with KPMG.

Vincent Campitelli, CPA, is a consultant to the office of the president of the Cloud Security Alliance (CSA) Seattle, Wash., serving as an enterprise security specialist with a focus on cloud computing. He is formerly a partner of PricewaterhouseCoopers.

Steven Mezzio, PhD, CPA, CISA, CISSP, FSAI, is a professor of accounting and the executive director of the Center for Excellence in Financial Reporting for the Pace University Lubin School of Business. He is also a former partner with PricewaterhouseCoopers.

Read more from the original source:
Managing the Impact of Cloud Computing - The CPA Journal


Venture capital firms bank on a sustained shift to the cloud – Brisbane Times

"If you look at SafetyCulture, Culture Amp, and Canva is probably the best example of it, they are all built on the cloud," he said. "A lot of these companies are seeing (the shift to the cloud) accelerate as a result of the coronavirus pandemic."

Meanwhile, Airtree Ventures partner James Cameron pointed to the growth of online cloud computing training provider A Cloud Guru, which reached $116 million in revenue this year, as a good example of how tech startups can make the most of the rush to the cloud.

A Cloud Guru teaches people how to use cloud platforms like Amazon Web Services, Microsoft Azure and Google Cloud Platform and has taught two million users since launch in 2015.

"It really has been the right time and right place (for the business) to enjoy that explosive growth," Mr Cameron said.

"They are sitting at this mega trend and of the tailwinds they have got behind them one is online education and the second is the shift into digital reskilling, then there's also the shift in the general software development world to the cloud."

A Cloud Guru co-founder and chief executive Sam Kroonenburg said remote working has been a key catalyst for the shift to the cloud.

A Cloud Guru co-founder and chief executive Sam Kroonenburg. Credit: Eamon Gallagher

"I think the world has a mandate to move to the cloud and this is a major shift that is happening across the world globally," he said. "Companies are wanting to move away from managing their own infrastructure, they want to have the flexibility to send their workforce home and COVID-19 is just accelerating that trend."

Mr Kroonenburg said he expected this shift to continue even after the pandemic is over.

"It is a long term systemic change," he said.


Paul Bassat, co-founder of Square Peg, said migration to the cloud was one of the key themes the venture capital firm was focused on, pointing to its investments in Israeli data centre infrastructure startup Excelero and cloud storage infrastructure startup Lightfix.

"We are essentially seeing pretty much all businesses are going to move to the cloud, we are 20 to 30 per cent into that journey so there's still a long way to go here," he said.

Read more:
Venture capital firms bank on a sustained shift to the cloud - Brisbane Times


3 cloud computing stocks riding the hype wave – ForexLive

Mass gatherings were drastically reduced and working from home has become the practice for quite a while now. Due to the coronavirus emergency, a greater number of consumers rely on cloud computing services to get things done. Cloud-based solutions are ramping up to provide for people's needs in remote collaboration, video and audio conferencing, online classes, gaming, and e-commerce amid the pandemic. Big data and cloud computing play critical roles in the healthcare industry as the fight against the virus continues.

The immense demand in the cloud-computing space has driven robust gains in stock prices for this sector. Here are 3 cloud computing stocks to watch out for.

MICROSOFT

Microsoft is a key player in the cloud infrastructure sector, holding an 18% market share, up from 16% last year. According to Microsoft CEO Satya Nadella, the company moves ahead of other cloud providers by having more data center regions, with Mexico and Spain being the most recent additions. Last quarter, Microsoft grew intelligent cloud segment revenue by 29% to $12.3 billion. It generated $13.7 billion of free cash flow in the recent quarter, increasing 25% year over year.

MSFT.US is up by 35% YTD, SimpleFX WebTrader

MSFT.US has been moving above the 50-, 100-, and 200-day SMAs since April. It increased by 1.16% on Thursday, touching a fresh new high at $216. It is currently up by about 35% this year to date and has climbed by 9.7% over the past week.

NVIDIA

NVIDIA is not letting others get ahead easily. This multinational tech giant delivers GPUs to well-known cloud providers, resulting in massive sales growth. Its data center revenue blew up to $2.99 billion this year from only $339 million in 2016. The total sales in this segment increased by 80% from the year earlier and reached a total of $1.14 billion in Q1 FY2021. NVDA.US is trading at $415.09 as of writing and is up by over 70% this year to date.

NVDA.US is up by over 70% this year, SimpleFX WebTrader

The recent acquisition of Mellanox, a leading supplier of computer network products based on InfiniBand and Ethernet technology, will be a big boost for NVIDIA's data center scope. With this, NVIDIA's expected Q2 revenue climbs to about $3.65 billion.

ALIBABA

Alibaba Group Holding Ltd shares (BABA.US) reached a fresh 52-week high on Thursday at $268.00. This is after Needham's Vincent Yu, an Alibaba analyst, revealed a "buy" rating with a target of $275.

According to Yu, Alicloud gains from gigantic shifts to the cloud by enterprises and government agencies. Alicloud is a market leader in China with about 46% market share. The multinational tech company plans to allocate $28 billion over 3 years to cloud infrastructure.

Alibaba shares gain 20.62% YTD, SimpleFX WebTrader

Alibaba is also at the top of the e-commerce market, with its Juhuasuan and Taobao Deals attracting more consumers as a provider of competitively priced goods. BABA.US has gained 20.62% this year to date and is up by 10% from the past week.


Read more here:
3 cloud computing stocks riding the hype wave - ForexLive
