
What is quantum mechanics trying to tell us? – Big Think

Classical physics did not need any disclaimers. The kind of physics that was born with Isaac Newton and ruled until the early 1900s seemed pretty straightforward: Matter was like little billiard balls. It accelerated or decelerated when exposed to forces. None of this needed any special interpretations attached. The details could get messy, but there was nothing weird about it.

Then came quantum mechanics, and everything got weird really fast.

Quantum mechanics is the physics of atomic-scale phenomena, and it is the most successful theory we have ever developed. So why are there a thousand competing interpretations of the theory? Why does quantum mechanics need an interpretation at all?

What, fundamentally, is it trying to tell us?

There are many weirdnesses in quantum physics: many ways it differs from the classical worldview of perfectly knowable particles with perfectly describable properties. The weirdness you focus on will tend to be the one that shapes your favorite interpretation.

But the weirdness that has stood out most, the one that has shaped the most interpretations, is the nature of superpositions and of measurement in quantum mechanics.


Everything in physics comes down to the description of what we call the state. In classical physics, the state of a particle was just its position and momentum. (Momentum is related to velocity.) The position and velocity could be known with as much accuracy as your equipment allowed. Most important, the state was never connected to making a measurement: you never had to look at the particle. But quantum mechanics forces us to think about the state in a very different way.

In quantum physics, the state represents the possible outcomes of measurements. Imagine you have a particle in a box, and the box has two accessible chambers. Before a measurement is made, the quantum state is in a superposition, with one term for the particle being in the first chamber and another term for the particle being in the second chamber. Both terms exist at the same time in the quantum state. It is only after a measurement is made that the superposition is said to collapse, and the state has only one term: the one that corresponds to seeing the particle in the first or the second chamber.
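As a minimal sketch of that two-chamber example (the notation here is standard quantum mechanics, not drawn from the article itself):

```latex
% Before measurement: one term per possible outcome, both present at once
|\psi\rangle = \alpha\,|\text{chamber 1}\rangle + \beta\,|\text{chamber 2}\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
% After a measurement that finds the particle in chamber 1, the superposition
% is said to collapse to the single corresponding term
|\psi\rangle \;\longrightarrow\; |\text{chamber 1}\rangle
\quad\text{(an outcome that occurs with probability } |\alpha|^2\text{)}
```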

So, what is going on here? How can a particle be in two places at the same time? This is also akin to asking whether particles have properties in and of themselves. Why should making a measurement change anything? And what exactly is a measurement? Do you need a person to make a measurement, or can you say that any interaction at all with the rest of the world is a measurement?

These kinds of questions have spawned a library's worth of so-called quantum interpretations. Some of them try to preserve the classical worldview by finding some way to minimize the role of measurement and preserve the reality of the quantum state. Here, reality means that the state describes the world by itself, without any reference to us. At the extreme end of these is the Many Worlds Interpretation, which makes each possibility in the quantum state a parallel Universe that will be realized when a quantum event (a measurement) happens.

This kind of interpretation is, to me, a mistake. My reasons for saying this are simple.

When the inventors of quantum mechanics broke with classical physics in the first few decades of the 1900s, they were doing what creative physicists do best. They were finding new ways to predict the results of experiments by creatively building off the old physics while extending it in ways that embraced new behaviors seen in the laboratory. That took them in a direction where measurement began to play a central role in the description of physics as a whole. Again and again, quantum mechanics has shown that at the heart of its many weirdnesses is the role played by someone acting on the world to gain information. That to me is the central lesson quantum mechanics has been trying to teach us: that we are involved, in some way, in the description of the science we do.

Now to be clear, I am not arguing that the observer affects the observed, or that physics needs a place for some kind of Cosmic Mind, or that consciousness reaches into the apparatus and changes things. There are much more subtle and interesting ways of hearing what quantum mechanics is trying to say to us. This is one reason I find much to like in the interpretation called QBism.

What matters is trying to see into the heart of the issue. After all, when all is said and done, what is quantum mechanics pointing to? The answer is that it points to us. It is trying to tell us what it means to be a subject embedded in the Universe, doing this amazing thing called science. To me that is just as exciting as a story about a God's-eye view of the Universe.

See the article here:

What is quantum mechanics trying to tell us? - Big Think

Read More..

How the Multiverse could break the scientific method – Big Think

Today let's take a walk on the wild side and assume, for the sake of argument, that our Universe is not the only one that exists. Let's consider that there are many other universes, possibly infinitely many. The totality of these universes, including our own, is what cosmologists call the Multiverse. It sounds more like a myth than a scientific hypothesis, and this conceptual troublemaker inspires some while it outrages others.

The controversy started in the 1980s. Two physicists, Andrei Linde at Stanford University and Alex Vilenkin at Tufts University, independently proposed that if the Universe underwent a very fast expansion early on in its existence (we call this an inflationary expansion), then our Universe would not be the only one.

This inflationary phase of growth presumably happened a trillionth of a trillionth of a trillionth of one second after the beginning of time. That is about 10^-36 seconds after the bang, when the clock that describes the expansion of our universe started ticking. You may ask, "How come these scientists feel comfortable talking about times so ridiculously small? Wasn't the Universe also ridiculously dense at those times?"

Well, the truth is we do not yet have a theory that describes physics under these conditions. What we do have are extrapolations based on what we know today. This is not ideal, but given our lack of experimental data, it is the only place we can start from. Without data, we need to push our theories as far as we consider reasonable. Of course, what is reasonable for some theorists will not be for others. And this is where things get interesting.

The supposition here is that we can apply essentially the same physics at energies that are about one thousand trillion times higher than the ones we can probe at the Large Hadron Collider, the giant accelerator housed at the European Organization for Nuclear Research in Switzerland. And even if we cannot apply quite the same physics, we can at least apply physics with similar actors.

In high energy physics, all the characters are fields. Fields, here, mean disturbances that fill space and may or may not change in time. A crude picture of a field is that of water filling a pond. The water is everywhere in the pond, with certain properties that take on values at every point: temperature, pressure, and salinity, for example. Fields have excitations that we call particles. The electron field has the electron as an excitation. The Higgs field has the Higgs boson. In this simple picture, we could visualize the particles as ripples of water propagating along the surface of the pond. This is not a perfect image, but it helps the imagination.

The most popular protagonist driving inflationary expansion is a scalar field, an entity with properties inspired by the Higgs boson, which was discovered at the Large Hadron Collider in July 2012.


We do not know if there were scalar fields in the cosmic infancy, but it is reasonable to suppose there were. Without them, we would be horribly stuck trying to picture what happened. As mentioned above, when we do not have data, the best that we can do is to build reasonable hypotheses that future experiments will hopefully test.

To see how we use a scalar field to model inflation, picture a ball rolling downhill. As long as the ball is at a height above the bottom of the hill, it will roll down. It has stored energy. At the bottom, we set its energy to zero. We do the same with the scalar field. As long as it is displaced from its minimum, it will fill the Universe with its energy. In large enough regions, this energy prompts the fast expansion of space that is the signature of inflation.
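Here is a toy numerical sketch of that rolling-ball picture. The quadratic potential, the friction term tied to the expansion rate, and every number in it are illustrative assumptions meant only to build intuition; this is not the actual inflationary model cosmologists fit to data.

```python
# Toy "ball rolling downhill" picture of an inflaton-like scalar field.
# V(phi) = 0.5 * m**2 * phi**2 and all numbers are illustrative assumptions.
import math

m = 1.0                  # sets how steep the "hill" is (arbitrary units)
phi, dphi = 5.0, 0.0     # field displaced from its minimum, initially at rest
dt, steps = 1e-3, 20000

def V(phi):
    """Stored energy while the field is away from the bottom of the hill."""
    return 0.5 * m**2 * phi**2

for _ in range(steps):
    H = math.sqrt(V(phi) + 0.5 * dphi**2)   # toy expansion rate tied to the energy
    ddphi = -3.0 * H * dphi - m**2 * phi    # roll downhill, slowed by "Hubble friction"
    dphi += ddphi * dt
    phi += dphi * dt

print(f"field value after rolling: {phi:.3f}, remaining energy: {V(phi):.4f}")
```

While V(phi) stays large, the toy expansion rate H stays large; once the field settles at the bottom of the hill, the stored energy, and with it the inflationary push, goes away.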

Linde and Vilenkin added quantum physics to this picture. In the world of the quantum, everything is jittery; everything vibrates endlessly. This is at the root of quantum uncertainty, a notion that defies common sense. So as the field is rolling downhill, it is also experiencing these quantum jumps, which can kick it further down or further up. It's as if the waves in the pond were erratically creating crests and valleys. Choppy waters, these quantum fields.

Here comes the twist: When a sufficiently large region of space is filled with the field of a certain energy, it will expand at a rate related to that energy. Think of the temperature of the water in the pond. Different regions of space will have the field at different heights, just as different regions of the pond could have water at different temperatures. The result for cosmology is a plethora of madly inflating regions of space, each expanding at its own rate. Very quickly, the Universe would consist of myriad inflating regions that grow, unaware of their surroundings. The Universe morphs into a Multiverse. Even within each region, quantum fluctuations may drive a sub-region to inflate. The picture, then, is one of an eternally replicating cosmos, filled with bubbles within bubbles. Ours would be but one of them: a single bubble in a frothing Multiverse.

This is wildly inspiring. But is it science? To be scientific, a hypothesis needs to be testable. Can you test the Multiverse? The answer, in a strict sense, is no. Each of these inflating regions (or contracting ones, as there could also be failed universes) is outside our cosmic horizon, the region that delimits how far light has traveled since the beginning of time. As such, we cannot see these cosmoids, nor receive any signals from them. The best that we can hope for is to find a sign that one of our neighboring universes bruised our own space in the past. If this had happened, we would see some specific patterns in the sky, more precisely in the radiation left over after hydrogen atoms formed some 400,000 years after the Big Bang. So far, no such signal has been found. The chances of finding one are, quite frankly, remote.

We are thus stuck with a plausible scientific idea that seems untestable. Even if we were to find evidence for inflation, that would not necessarily support the inflationary Multiverse. What are we to do?

The Multiverse suggests another ingredient: the possibility that physics is different in different universes. Things get pretty nebulous here, because there are two kinds of "different" to describe. The first is different values for the constants of nature (such as the electron charge or the strength of gravity), while the second raises the possibility that there are different laws of nature altogether.

In order to harbor life as we know it, our Universe has to obey a series of very strict requirements. Small deviations are not tolerated in the values of natures constants. But the Multiverse brings forth the question of naturalness, or of how common our Universe and its laws are among the myriad universes belonging to the Multiverse. Are we the exception, or do we follow the rule?

The problem is that we have no way to tell. To know whether we are common, we need to know something about the other universes and the kinds of physics they have. But we don't. Nor do we know how many universes there are, and this makes it very hard to estimate how common we are. To make things worse, if there are infinitely many cosmoids, we cannot say anything at all. Inductive thinking is useless here. Infinity gets us tangled up in knots. When everything is possible, nothing stands out, and nothing is learned.

That is why some physicists worry about the Multiverse to the point of loathing it. There is nothing more important to science than its ability to prove ideas wrong. If we lose that, we undermine the very structure of the scientific method.

Follow this link:

How the Multiverse could break the scientific method - Big Think

Read More..

No, particle physics on Earth won’t ever destroy the Universe – Big Think

Anytime you reach deeper into the unknown than ever before, you should not only wonder about what you're going to find, but also worry about what sort of demons you might unearth. In the realm of particle physics, that double-edged sword arises the farther we probe into the high-energy Universe. The better we can explore the previously inaccessible energy frontier, the better we can reveal the high-energy processes that shaped the Universe in its early stages.

Many of the mysteries of how our Universe began and evolved from the earliest times can be best investigated by this exact method: colliding particles at higher and higher energies. New particles and rare processes can be revealed through accelerator physics at or beyond the current energy frontiers, but this is not without risk. If we can reach energies that:

certain consequences, not all of which are desirable, could be in store for us all. And yet, just as was the case with the notion that the LHC could create black holes that destroy the Earth, we know that any experiment we perform on Earth won't give rise to any dire consequences at all. The Universe is safe from any current or planned particle accelerators. This is how we know.

The idea of a linear lepton collider has been bandied about in the particle physics community as the ideal machine to explore post-LHC physics for many decades, but only if the LHC makes a beyond-the-Standard-Model discovery. Direct confirmation of what new particles could be causing CDF's observed discrepancy in the W-boson's mass might be a task best suited to a future circular collider, which can reach higher energies than a linear collider ever could.

There are a few different approaches to making particle accelerators on Earth, with the biggest differences arising from the types of particles we're choosing to collide and the energies we're able to achieve when we're colliding them. The options for which particles to collide are:


In the future, it may be possible to collide muons with anti-muons, getting the best of both the electron-positron and the proton-antiproton world, but that technology isn't quite there yet.

A candidate Higgs event in the ATLAS detector at the Large Hadron Collider at CERN. Note how even with the clear signatures and transverse tracks, there is a shower of other particles; this is due to the fact that protons are composite particles, and due to the fact that dozens of proton-proton collisions occur with every bunch crossing. Examining how the Higgs decays to very high precision is one of the key goals of the HL-LHC.

Regardless, the thing that poses the most danger to us is whatever's up there at the highest energy-per-particle-collision that we get. On Earth, that record is held by the Large Hadron Collider, where the overwhelming majority of proton-proton collisions actually result in the gluons inside each proton colliding. When they smash together, because the proton's total energy is split among its constituent particles, only a fraction of the total energy belongs to each gluon, so it takes a large number of collisions to find one where a large portion of that energy, say 50% or more, belongs to the relevant, colliding gluons.

When that occurs, however, that's when the most energy is available to either create new particles (via E = mc²) or to perform other actions that energy can perform. One of the ways we measure energies, in physics, is in terms of electron-volts (eV): the amount of energy an electron at rest gains when it is accelerated through an electric potential difference of one volt. At the Large Hadron Collider, the current record-holder for laboratory energies on Earth, the most energetic particle-particle collision possible is 14 TeV, or 14,000,000,000,000 eV.
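For scale, here is that same number converted into everyday energy units; the conversion factor is the standard value of the elementary charge, and the rest is arithmetic.

```python
# Convert the LHC's 14 TeV collision energy into joules for scale.
EV_TO_JOULES = 1.602176634e-19       # joules per electron-volt (defined value)

collision_energy_eV = 14e12          # 14 TeV expressed in eV
collision_energy_J = collision_energy_eV * EV_TO_JOULES

print(f"14 TeV = {collision_energy_J:.2e} J")   # roughly 2.2e-6 joules
```

That is a few microjoules: tiny on human scales, but enormous when packed into a single particle-particle collision.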

Although no light can escape from inside a black hole's event horizon, the curved space outside of it results in a difference between the vacuum state at different points near the event horizon, leading to the emission of radiation via quantum processes. This is where Hawking radiation comes from, and for the tiniest-mass black holes, Hawking radiation will lead to their complete decay in under a fraction of a second.

There are things we can worry will happen at these highest-of-energies, each with their own potential consequence for either Earth or even for the Universe as a whole. A non-exhaustive list includes:

If you draw out any potential, it will have a profile where at least one point corresponds to the lowest-energy, or true vacuum, state. If there is a false minimum at any point, that can be considered a false vacuum, and it will always be possible, assuming this is a quantum field, to quantum tunnel from the false vacuum to the true vacuum state. The greater the kick you apply to a false vacuum state, the more likely it is that the state will exit the false vacuum state and wind up in a different, more stable, truer minimum.

Although these scenarios are all bad in some sense, some are worse than others. The creation of a tiny black hole would lead to its immediate decay. If you didn't want it to decay, you'd have to impose some sort of new symmetry (for which there is neither evidence nor motivation) to prevent its decay, and even then, you'd just have a tiny-mass black hole that behaved similarly to a new, massive, uncharged particle. The worst it could do is begin absorbing the matter particles it collided with, and then sink to the center of whatever gravitational object it was a part of. Even if you made it on Earth, it would take trillions of years to absorb enough matter to rise to a mass of 1 kg; it's not threatening at all.

The restoration of whatever symmetry was in place before the Universe's matter-antimatter asymmetry arose is also interesting, because it could lead to the destruction of matter and the creation of antimatter in its place. As we all know, matter and antimatter annihilate upon contact, which creates bad news for any matter that exists close to this point. Fortunately, however, the absolute energy of any particle-particle collision is tiny, corresponding to tiny fractions of a microgram in terms of mass. Even if we created a net amount of antimatter from such a collision, it would only be capable of destroying a small amount of matter, and the Universe would be fine overall.

The simplest model of inflation is that we started off at the top of a proverbial hill, where inflation persisted, and rolled into a valley, where inflation came to an end and resulted in the hot Big Bang. If that valley isn't at a value of zero, but instead at some positive, non-zero value, it may be possible to quantum-tunnel into a lower-energy state, which would have severe consequences for the Universe we know today. It's also possible that a kick of the right energy could restore the inflationary potential, leading to a new state of rapid, relentless, exponential expansion.

But if we instead were able to recreate the conditions under which inflation occurred, things would be far worse. If it happened out in space somewhere, we'd create in just a tiny fraction of a second the greatest cosmic void we could imagine. Whereas today, there's only a tiny amount of energy inherent to the fabric of empty space, something on the order of the rest-mass-energy of only a few protons per cubic meter, during inflation, it was more like a googol protons (10^100) per cubic meter.
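A back-of-the-envelope check of that contrast, taking the proton rest-mass energy as the yardstick; the "few protons" and "googol protons" figures come from the paragraph above, and the snippet simply multiplies them out.

```python
# Compare today's vacuum energy density with the inflation-era figure quoted above,
# both expressed as proton rest-mass energies per cubic meter.
PROTON_REST_ENERGY_J = 1.503e-10       # m_p * c^2, about 938 MeV, in joules

today_density = 4 * PROTON_REST_ENERGY_J          # "a few protons" per cubic meter
inflation_density = 1e100 * PROTON_REST_ENERGY_J  # "a googol protons" per cubic meter

print(f"today:     ~{today_density:.1e} J/m^3")      # ~6e-10 J/m^3
print(f"inflation: ~{inflation_density:.1e} J/m^3")  # ~1.5e90 J/m^3
print(f"ratio:     ~{inflation_density / today_density:.1e}")
```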

If we could achieve those same energy densities anywhere in space, they could potentially restore the inflationary state, and that would lead to the same Universe-emptying exponential expansion that occurred more than 13.8 billion years ago. It wouldn't destroy anything in our Universe, but it would lead to an exponential, rapid, relentless expansion of space in the region where those conditions occur again.

That expansion would push the space that our Universe occupies outward, in all three dimensions, as it expands, creating a large cosmic bubble of emptiness that would lead to unmistakable signatures that such an event had occurred. It clearly has not, at least, not yet, but in theory, this is possible.

Visualization of a quantum field theory calculation showing virtual particles in the quantum vacuum. (Specifically, for the strong interactions.) Even in empty space, this vacuum energy is non-zero, and what appears to be the ground state in one region of curved space will look different from the perspective of an observer where the spatial curvature differs. As long as quantum fields are present, this vacuum energy (or a cosmological constant) must be present, too.

And finally, the Universe today exists in a state where the quantum vacuum, the zero-point energy of empty space, is non-zero. This is inextricably, although we don't know how to perform the calculation that underlies it, linked to the fundamental physical fields and couplings and interactions that govern our Universe: the physical laws of nature. At some level, the quantum fluctuations in those fields that cannot be extricated from space itself, including the fields that govern all of the fundamental forces, dictate what the energy of empty space itself is.

But it's possible that this isn't the only configuration for the quantum vacuum; it's plausible that other energy states exist. Whether they're higher or lower doesn't matter; whether our vacuum state is the lowest-possible one (i.e., the true vacuum) or whether another is lower doesn't matter either. What matters is whether there are any other minima, any other stable configurations, that the Universe could possibly exist in. If there are, then reaching high-enough energies could kick the vacuum state in a particular region of space into a different configuration, where we'd then have at least one of:

Any of these would, if it was a more-stable configuration than the one that our Universe currently occupies, cause that new vacuum state to expand at the speed of light, destroying all of the bound states in its path, down to atomic nuclei themselves. This catastrophe, over time, would destroy billions of light-years worth of cosmic structure; if it happened within about 18 billion light-years of Earth, that would eventually include us, too.

The size of our visible Universe (yellow), along with the amount we can reach (magenta). The limit of the visible Universe is 46.1 billion light-years, as that's the limit of how far away an object that emitted light that would just be reaching us today would be after expanding away from us for 13.8 billion years. However, beyond about 18 billion light-years, we can never access a galaxy even if we traveled towards it at the speed of light. Any catastrophe that occurred within 18 billion light-years of us would eventually reach us; ones that occur today at distances farther away never will.

There are tremendous uncertainties connected to these events. Quantum black holes could be just out of reach of our current energy frontier. It's possible that the matter-antimatter asymmetry was only generated during electroweak symmetry breaking, potentially putting it within current collider reach. Inflation must have occurred at higher energies than we've ever reached, as must the processes that determine the quantum vacuum, but we don't know how low those energies could have been. We only know, from observations, that such an event hasn't yet happened within our observable Universe.

But, despite all of this, we don't have to worry about any of our particle accelerators, past, present, or even into the far future, causing any of these catastrophes here on Earth. The reason is simple: the Universe itself is filled with natural particle accelerators that are far, far more powerful than anything we've ever built or even proposed here on Earth. From collapsed stellar objects that spin rapidly, such as white dwarfs, neutron stars, and black holes, very strong electric and magnetic fields can be generated by charged, moving matter under extreme conditions. It's suspected that these are the sources of the highest-energy particles we've ever seen: the ultra-high-energy cosmic rays, which have been observed to achieve energies many millions of times greater than any accelerator on Earth ever has.

The energy spectrum of the highest energy cosmic rays, by the collaborations that detected them. The results are all highly consistent from experiment to experiment, and reveal a significant drop-off at the GZK threshold of ~5 x 10^19 eV. Still, many such cosmic rays exceed this energy threshold, indicating that either this picture is not complete or that many of the highest-energy particles are heavier nuclei, rather than individual protons.

Whereas we've reached up above the ten TeV threshold for accelerators on Earth, or 10^13 eV in scientific notation, the Universe routinely creates cosmic rays that rise up above the 10^20 eV threshold, with the record set more than 30 years ago by an event known, appropriately, as the Oh-My-God particle. Even though the highest energy cosmic rays are thought to be heavy atomic nuclei, like iron, rather than individual protons, that still means that when two of them collide with one another (a near-certainty within our Universe, given the vastness of space, the fact that galaxies were closer together in the past, and the long lifetime of the Universe) there are many events producing center-of-mass collision energies in excess of 10^18 or even 10^19 eV.
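To see how those center-of-mass figures compare with the LHC, here is a rough calculation using the standard ultra-relativistic formulas; the specific input energies are illustrative round numbers, not measured values.

```python
# Rough center-of-mass (CoM) collision energies, in eV.
import math

PROTON_REST_ENERGY_eV = 0.938e9      # ~938 MeV

def com_fixed_target(beam_eV, target_rest_eV=PROTON_REST_ENERGY_eV):
    """Cosmic ray striking a particle at rest: sqrt(s) ~ sqrt(2 * E_beam * m * c^2)."""
    return math.sqrt(2.0 * beam_eV * target_rest_eV)

def com_head_on(e1_eV, e2_eV):
    """Two ultra-relativistic particles meeting head-on: sqrt(s) ~ 2 * sqrt(E1 * E2)."""
    return 2.0 * math.sqrt(e1_eV * e2_eV)

print(f"LHC (already a CoM energy):  {14e12:.1e} eV")
print(f"10^20 eV ray on a proton:    {com_fixed_target(1e20):.1e} eV")   # ~4e14 eV
print(f"two cosmic rays head-on:     {com_head_on(1e20, 1e17):.1e} eV")  # ~6e18 eV
```

Even a single ultra-high-energy cosmic ray hitting a proton at rest yields a center-of-mass energy tens of times beyond the LHC, and two such rays meeting head-on land in the 10^18 to 10^19 eV range quoted above.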

This tells us that any catastrophic, cosmic effect that we could worry about is already tightly constrained by the physics of what has happened over the cosmic history of the Universe up until the present day.

When a high-energy particle strikes another one, it can lead to the creation of new particles or new quantum states, constrained only by how much energy is available in the center-of-mass of the collision. Although particle accelerators on Earth can reach very high energies, the natural particle accelerators of the Universe can exceed those energies by a factor of many millions.

None of the cosmic catastrophes that we can imagine have occurred, and that means two things. The first thing is that we can place likely lower limits on where various cosmic transitions occurred. The inflationary state hasn't been restored anywhere in our Universe, and that places a lower limit on the energy scale of inflation of no less than ~10^19 eV. This is about a factor of 100,000 lower, perhaps, than where we anticipate inflation occurred: a reassuring consistency. It also teaches us that it's very hard to kick the zero-point energy of the Universe into a different configuration, giving us confidence in the stability of the quantum vacuum and disfavoring the vacuum decay catastrophe scenario.

But it also means we can continue to explore the Universe with confidence in our safety. Based on how safe the Universe has already shown itself to be, we can confidently conclude that no such catastrophes will arise up to the combined energy-and-collision-total threshold that has already taken place within our observable Universe. Only if we begin to collide particles at energies around 10^20 eV or greater, a factor of 10 million greater than the present energy frontier, will we need to begin to worry about such events. That would require an accelerator significantly larger than the entire planet, and therefore, we can reach the conclusion promised in the article's title: no, particle physics on Earth won't ever destroy the Universe.

Link:

No, particle physics on Earth won't ever destroy the Universe - Big Think

Read More..

Anti Virus Spyware Malware Root kit | Silent Firewall | Internet Security

Anti Virus Spyware Malware Root kit | Silent Firewall | Internet Security


Bank, chat, email, and browse online with round-the-clock security.

Real-time multi-layer ransomware protection with smart data backup and restore features.

Multi-layered protection against zero-day attacks, virus, phishing, and malware.

Detect and block unknown threats with behavioral and characteristic inspection.

Enjoy safe browsing experience by blocking risky sites from advanced attacks.

Analyze your network for signatures that match known cyberattacks and take action to block them.

Block malware that may infiltrate through external drives and infect your system.

Scans thoroughly to detect and clean malware and other potential threats in your computer.

Restrict unauthorized USB port access and prevent data theft.

Restrict inappropriate websites/apps for your children and give them a safe browsing experience.

Advanced feature to protect your online banking and shopping activities.

Scan for security/vulnerability holes in your system and get the best fix.

Prevent hackers from stealing your data without your consent.

Scans files and folders in a quick manner without affecting system performance.

Assess the security of your Wi-Fi network and router, no matter where you connect.

Clean file and document tracks that you work on to prevent privacy breaches.

Easily restore the browser default settings modified by malware or spyware.

Protect your network from the latest threats with features that secure your unique environment.

Get alerts and manage your remote devices with just a few clicks.

Reliable way of tracking your lost or stolen laptop. Get yourself registered today with Quick Heal.

For more details, please refer to the product datasheet of Quick Heal Total Security

To use Quick Heal Internet Security, you must meet the following requirements.

Note:

Windows 11

Windows 10

Windows 8.1 / Windows 8

Windows 7 SP 1 and later

Make sure you have installed Microsoft patches KB4474419 and KB4490628 also.

How to check if the required patches are installed?

(1) Open Control Panel. (2) Go to Windows Update. (3) From the Windows Update page, click View Update History.

Windows XP(Service Pack 2 and later)

Quick Heal Internet Security supports the following email clients.

Note: The Email Protection feature does not support encrypted email connections that use Secure Sockets Layer (SSL).

Application Control

Anti-Keylogger

Browser Sandbox

Emergency Disk

Firewall

Safe Banking

Self-Protection

Anti-Rootkit

Remotely Manage Quick Heal

Dear Quick Heal Community:

We are pleased to introduce Seqrite, a new name and identity for Quick Heal Enterprise Security products.




The rest is here:
Anti Virus Spyware Malware Root kit | Silent Firewall | Internet Security

Read More..

Alarming Cyber Statistics For Mid-Year 2022 That You Need To Know – Forbes


A couple of times per year, I take a deep dive on writing about the newly reported cybersecurity statistics and trends that are impacting the digital landscape. Unfortunately, despite global efforts, every subsequent year the numbers get worse and show that we are far from being able to mitigate and contain the numerous cyber-threats targeting both industry and government.

Below is a synopsis with links on some of the recent cyber developments and threats that CISOs need to keep a close watch on (and that you need to know) for the remaining part of 2022 and beyond.

While many of the statistics seem dire, there are some positive aspects on the trends side, as the cybersecurity community has been taking several initiatives to create both cyber awareness and action. And for those attending the 2022 RSA Conference in San Francisco, hopefully the backdrop of the following statistics and trends from mid-year 2022 can also be useful to analyze and match with product and services roadmaps for cybersecurity.

"Caution cyber attacks ahead" road sign.

Despite another record year of breaches, including SolarWinds, Colonial Pipeline, and others, half of U.S. businesses still have not put a cybersecurity risk plan in place. The list of the 50 Biggest Data Breaches 2004-2021 below is illustrative of the problem of protecting data in both industry and government.

The 50 Biggest Data Breaches (2004-2021) (visualcapitalist.com)


Cybercriminals can penetrate 93 percent of company networks

Link: Cybercriminals can penetrate 93 percent of company networks (betanews.com)

In 93 percent of cases, an external attacker can breach an organization's network perimeter and gain access to local network resources.

This is among the findings of a new study of pen testing projects from Positive Technologies, conducted among financial organizations, fuel and energy organizations, government bodies, industrial businesses, IT companies and other sectors.

An attacker's path from external networks to target systems begins with breaching the network perimeter. According to the research, on average, it takes two days to penetrate a company's internal network. Credential compromise is the main route in (71 percent of companies), primarily because of simple passwords being used, including for accounts used for system administration.

Many security executives say they're unprepared for the threats that lie ahead

Link: Many security executives say they're unprepared for the threats that lie ahead | TechRepublic

As cyberattacks grow in both number and sophistication, organizations are increasingly under the gun to protect themselves from compromise. Though companies have responded by upping their security budgets and adopting more advanced defenses, keeping up with the threats that will surface over the next few years will be a challenge.

For its report titled Cybersecurity Solutions for a Riskier World, ThoughtLab studied the security practices and performance of 1,200 companies in 13 industries and the public sector across 16 countries.

In 2021, the average number of cyberattacks and data breaches increased by 15.1% from the previous year. Over the next two years, the security executives polled by ThoughtLab see a rise in attacks from social engineering and ransomware as nation-states and cybercriminals grow more sophisticated. The main causes of these attacks will come from misconfigurations, human error, poor maintenance, and unknown assets.

Despite the increased efforts to combat security threats, many of those interviewed by ThoughtLab see several reasons for alarm. A full 44% of the executives surveyed said that their growing use of partners and suppliers exposes them to significant security risks. Some 30% said their budgets aren't sufficient to ensure proper cybersecurity, while several pointed out that the criminals are better funded. A quarter of all the respondents said the convergence of digital and physical systems, such as Internet of Things devices, has increased their security risks.

Further, 41% of the executives don't think their security initiatives have kept up with digital transformation. More than a quarter said that new technologies are their biggest security concern. And just under a quarter cited a shortage of skilled workers as their largest cybersecurity challenge.

2022 Study: 50% Of SMBs Have A Cybersecurity Plan In Place

Link: 2022 Study: 50% of SMBs Have a Cybersecurity Plan in Place | UpCity

UpCity, a small business intelligence firm that has matched over 2 million businesses to providers they can trust since its inception in 2009, surveyed 600 business owners and IT professionals on their 2022 cybersecurity plans, priorities, and budgets. Findings include:

Only 50% of U.S. businesses have a cybersecurity plan in place

Of those, 32% haven't changed their cybersecurity plan since the pandemic forced remote and hybrid operations

The most common causes of cyber-attacks are malware (22%) and phishing (20%)

Cybercrime cost U.S. businesses more than $6.9 billion in 2021, and only 43% of businesses feel financially prepared to face a cyber-attack in 2022

Software supply chain attacks hit three out of five companies in 2021

Link: Software supply chain attacks hit three out of five companies in 2021 | CSO Online

Survey finds significant jump in software supply chain attacks after Log4j exposed.

More than three in five companies were targeted by software supply chain attacks in 2021, according to a recent survey by Anchore. The survey of 428 executives, directors, and managers in IT, security, development, and DevOps found that the organizations of nearly a third of the respondents (30%) were either significantly or moderately impacted by a software supply chain attack in 2021. Only 6% said the attacks had a minor impact on their software supply chain.

82 percent of CIOs believe their software supply chains are vulnerable

Link: 82 percent of CIOs believe their software supply chains are vulnerable (betanews.com)

A new global study of 1,000 CIOs finds that 82 percent say their organizations are vulnerable to cyberattacks targeting software supply chains.

The research from machine identity specialist Venafi suggests the shift to cloud native development, along with the increased speed brought about by the adoption of DevOps processes, has made the challenges connected with securing software supply chains infinitely more complex.

The increase in the number and sophistication of supply chain attacks, like SolarWinds and Kaseya, over the last 12 months has brought this issue into sharp focus, gaining the attention of CEOs and boards.

Report: Increase in socially engineered, sophisticated cybersecurity attacks plagues organizations

A new report that showed a sharp increase in cybersecurity attacks in 2021 urged organizations to consider when, not if, they too will be under attack. Attacks are becoming more sophisticated and socially engineered, making them harder to detect.

Link: Report: Increase in socially engineered, sophisticated cybersecurity attacks plagues organizations - MedCity News

A new cybersecurity report from San Francisco-based Abnormal Security found that medical industries and insurance companies had a 45-60% chance of being the target of a phone fraud attack via email: a sophisticated scam where the scammer sends an email to the target, asking the target to call them. In the second half of 2021, those attacks increased by 10 percent.

Additionally, healthcare systems are seeing a rise in more legitimate-looking yet problematic business email compromise (BEC) attacks. This occurs when the scammer accesses the target's business email and impersonates the target, and then uses that identity to create rapport with victims and get them to pay money.

Businesses Suffered 50% More Cyberattack Attempts per Week in 2021

Link: Businesses Suffered 50% More Cyberattack Attempts per Week in 2021 (darkreading.com)

Check Point Research on Monday reported that it found 50% more attack attempts per week on corporate networks globally in calendar year 2021 compared with 2020.

The researchers define a cyberattack attempt as a single isolated cyber occurrence that could be at any point in the attack chain: scanning/exploiting vulnerabilities, sending phishing emails, malicious website access, malicious file downloads (from Web/email), second-stage downloads, and command-and-control communications. All of the attack attempts Check Point cites in the research were detected and stopped by its team.

Cyber-attacks per organization by Industry in 2021

The education/research sector sustained the most attacks in 2021, followed by government/military and communications.

Social engineering and phishing are easy means to corporate jewels that can include sensitive and proprietary emails, and business email compromise is a favorite of hackers.


$43 billion stolen through Business Email Compromise since 2016, reports FBI

Link: $43 billion stolen through Business Email Compromise since 2016 (tripwire.com)

Over US $43 billion has been lost through Business Email Compromise attacks since 2016, according to data released this week by the FBI.

The FBI's Internet Crime Complaint Center (IC3) issued a public service announcement on May 4, 2022, sharing updated statistics on Business Email Compromise (BEC) attacks, which use a variety of social engineering and phishing techniques to break into accounts and trick companies into transferring large amounts of money into the hands of criminals.

The report looked at 241,206 incidents reported to law enforcement and banking institutions between June 2016 and December 2021 and says that the combined domestic and international losses incurred amounted to US $43.31 billion.

Worryingly, there has been a 65% increase recorded in identified global losses between July 2019 and December 2021.
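One more piece of arithmetic on the same report's figures helps put them in perspective; this is simply the quoted total divided by the quoted incident count, not a number stated by the FBI.

```python
# Average reported loss per BEC incident, from the IC3 totals quoted above.
total_losses_usd = 43.31e9   # combined domestic and international losses
incidents = 241_206          # incidents reported June 2016 - December 2021

print(f"average loss per reported incident: ${total_losses_usd / incidents:,.0f}")
# roughly $180,000 per incident
```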

And how to better protect:

The FBI offers a number of tips to companies wishing to better protect themselves from Business Email Compromise attacks:


What Should Businesses Do to Mitigate Cyber-threats?


The aforementioned links highlight many serious vulnerabilities that industry experts have attested to. But the C-Suite does not have to remain idle in response to those threats and stats. My suggestion for all businesses, especially small and medium ones that are often at risk of being put out of business by a cyber-attack, is to seriously look at cyber-risk and plan accordingly as part of a corporate operational strategy. NIST and MITRE offer great resources for cyber-risk management planning and are continually updated. Also, some potential actions to take are excerpted from my recent article in Homeland Security Today, A Cybersecurity Risk Management Strategy for the C-Suite.


A Cybersecurity Risk Management Strategy for the C-Suite.

Link: A Cybersecurity Risk Management Strategy for the C-Suite - HS Today

Create a corporate risk management strategy and vulnerability framework that identifies digital assets and data to be protected. A risk assessment can quickly identify and prioritize cyber vulnerabilities so that you can immediately deploy solutions to protect critical assets from malicious cyber actors while immediately improving overall operational cybersecurity.

Risk management strategies should include people, processes, and technologies. This includes protecting and backing up business enterprise systems such as financial systems, email exchange servers, HR, and procurement systems with new security tools (encryption, threat intel and detection, firewalls, etc.) and policies. That risk management approach must include knowing your inventory and gaps, integrating cybersecurity hygiene practices, and procuring and orchestrating an appropriate cyber-tool stack. It should also include having an incident response plan in place if you do get breached.

Also see my recent article from the Donald Allen Cybersecurity blog (his blog is a great resource and I suggest you subscribe for free!):

The Risk Management Imperative For Cybersecurity

Link: Cybersecurity Risk Management An Imperative for The Digital Age The Donald Allen Cybersecurity Blog (dacybersecurity.com)

Because of the new digital cyber risk environment, a security strategy for risk management is imperative.

A security strategy of risk management to meet these growing cyber-threat challenges needs to be both comprehensive and adaptive. It involves people, processes, and technologies.

Securing your data is key.

Because of digital transformation and a pandemic that transferred many from working at the office to home, data is at greater risk for a breach.

Securing data necessitates a hyper-security focus. At its core, the practice is one of vigilance and encompasses identifying gaps, assessing vulnerabilities, and mitigating threats. Data security and cyber risk management are an integral part of the overall enterprise risk management (ERM) framework needed to stay ahead of the threats.

Defined by the most basic elements in informed risk management, cybersecurity is composed of:

Successful cybersecurity will also require the integration of emerging technologies for identity management, authentication, horizon monitoring, malware mitigation, resilience, and forensics. Automation and artificial intelligence are already impacting the capabilities in those areas.

Cybersecurity capabilities in information sharing, hardware, software, encryption, analytics, training, and protocols must keep pace to protect and preempt the increasingly sophisticated threats in both the public and private sectors.

The Infographic I created below provides a pathway for exploring risk management frameworks:


Infographic: Strategic Paths to Cybersecurity, by Chuck Brooks

The Three Pillars of Cybersecurity Strategy

The growth and sophistication of cyber-attacks over the last couple of years, many of them sponsored by state actors, has caused both government and industry to reevaluate and bolster their risk management approaches to cyber-defense.

There are three strong pillars of risk management that can be integrated into a successful cybersecurity strategy: Security by Design, Defense in Depth, and Zero Trust.

For more details, please see my article in FORBES, Combining Three Pillars Of Cybersecurity.

Link: Combining Three Pillars Of Cybersecurity (forbes.com)

I mentioned that there are some positive cybersecurity trends earlier. One such initiative is a new government focus on a Zero Trust Management strategy. That topic is subject matter for another article.

Please see "GovCon Expert Chuck Brooks Authors New Zero Trust White Paper; Anacomp CEO Tom Cunningham Quoted" for a quick overview of the benefits and need for Zero Trust in cybersecurity.

Link: GovCon Expert Chuck Brooks Authors New Zero Trust White Paper; Anacomp CEO Tom Cunningham Quoted (executivegov.com)

Ransomware: The Scourge Continues, and It Is Still a Preferred Method of Cyber-attack in 2022


The Colonial Pipeline attack showed how a ransomware attack against an industrial target can have very real consequences for people, as gasoline supplies to much of the north-eastern United States were limited because of the attack.

Ransomware attacks, and ransom payments, are rampant among critical infrastructure organizations

Link: Ransomware attacks, and ransom payments, are rampant among critical infrastructure organizations - Help Net Security

80% of critical infrastructure organizations experienced a ransomware attack in the last year, with an equal number reporting that their security budgets have risen since 2020, a Claroty report reveals.

Ransomware Trends, Statistics and Facts in 2022

Read the original:
Alarming Cyber Statistics For Mid-Year 2022 That You Need To Know - Forbes

Read More..

These Are the Cyber Dangers Still Faced by SA’s SMEs – IT News Africa

Internet security provider Kaspersky says that small to medium-sized enterprises (SMEs) and other small businesses in South Africa are still facing many threats from cyber criminals and threat actors. What's worse is that many small business owners do not use, or do not believe it important to use, cybersecurity services to secure their businesses.

As commerce continues to move online, this disregard for IT security continues to be exploited by cybercriminals.

Kaspersky researchers assessed the dynamics of attacks on small and medium-sized businesses between January and April 2022 and the same period in 2021, to identify which threats pose an increasing danger to entrepreneurs.

Cyber Threats Still Being Faced by SMEs in South Africa:

In 2022, the number of Trojan-PSW (Password Stealing Ware) detections in South Africa increased by 69% when compared to the same period in 2021: 20,922 detections in 2022 compared to 12,344 in 2021.

Trojan-PSW is a malware that steals passwords, along with other account information, which then allows attackers to gain access to the corporate network and steal sensitive information.

Another popular attack tool used on small businesses is Internet attacks, specifically, web pages with redirects to exploits, sites containing exploits and other malicious programs, botnet C&C centers, etc.

While the number of these attacks decreased by 13% in the first four months of 2022 in South Africa (419,506 infections in 2022 compared to 483,846 infections in 2021), the number of Internet attacks remains high.
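As a quick sanity check, the two percentages above follow directly from the raw detection counts Kaspersky reported; the snippet below simply redoes that arithmetic.

```python
# Recompute the year-over-year changes from the detection counts quoted above.
def pct_change(new, old):
    return (new - old) / old * 100.0

print(f"Trojan-PSW detections:  {pct_change(20_922, 12_344):+.0f}%")    # about +69%
print(f"Internet attack count:  {pct_change(419_506, 483_846):+.0f}%")  # about -13%
```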

"With the shift to remote working and the introduction of numerous advanced technologies in the daily operations of even small companies, security measures need to evolve to support these sophisticated setups," comments Denis Parinov, security researcher at Kaspersky.

Many companies have introduced the Remote Desktop Protocol (RDP) as their workforces shift to remote, a technology that enables computers on the same corporate network to be linked together and accessed remotely, even when the employees are at home.

While the overall number of attacks on RDP has decreased in South Africa, globally this threat is still a challenge. For example, in the first trimester of 2021 there were about 47.5 million attacks in the U.S., whereas for the same period in 2022 the number had risen to 51 million.

How Small Businesses Can Protect Themselves

Kaspersky says that having a special security solution enables attack visualisation and provides IT administrators with a convenient tool for incident analysis.

The faster they can analyse where and how a leak occurred, the better they will be able to solve any negative consequences.

Even small businesses with limited IT resources still need to protect all their working devices, including computers and mobile phones, from cyber threats.

The updated Kaspersky Small Office Security is a key tool for startups, small online stores, and local businesses to keep all of their work devices protected, safely transfer any valuable business-related files and avoid falling victim to ransomware.

"Cybercriminals are already way ahead of the curve, so much so that virtually every organisation will experience a breach attempt at some point. For small companies today, it's not a matter of whether a cybersecurity incident will happen but when. Having trained staff and an educated IT-specialist is no longer a luxury but a must-have part of your business development," concludes Parinov.

Continued here:
These Are the Cyber Dangers Still Faced by SA's SMEs - IT News Africa

Read More..

Cloud computing dominates. But security is now the biggest challenge – ZDNet

Cloud computing security is complicated, but now a top priority for business.

It's clear that cloud computing is rapidly becoming the dominant model used by business to host data and applications, and to develop new services.

Adoption of cloud computing has been growing rapidly over the past decade, and soon a tipping point will be reached, with use of cloud computing for application software, infrastructure software, business process services and system infrastructure overtaking traditional on-premises technology options within the next two or three years.

Recent events such as the enforced shift to hybrid working have generated further momentum behind cloud services, and as cloud offerings continue to mature and evolve, it's likely that adoption will continue to expand.

That's because cloud computing has some obvious advantages. Those include the ability to scale services almost infinitely based on demand without the need to buy or maintain expensive hardware, and the ability to take advantage of new applications without having teams of engineers on the payroll to deploy and manage them.

But the switch to cloud computing also brings new challenges. And the biggest worry for many is security.

It's true that one of the key advantages of the cloud for businesses is the opportunity to turn their systems and data over to a cloud company with dedicated experts working to keep their systems secure. That's certainly the case with software as a service (SaaS), which for many businesses takes away the worries and headaches around patching and maintaining software on their own servers.

But that doesn't mean businesses can forget about security after moving to the cloud.

Reaping the full benefits of cloud computing means using more than one cloud company, with data and workloads moving between a company's own data centre and the cloud, or between different clouds.

While the move to cloud computing may have removed some basic security worries, the emergence of the hybrid cloud has introduced a whole new set. Those range from securing staff access to services and ensuring that data is encrypted and not left accidentally exposed to other cloud users, to making sure that data stays safe when moving between applications and cloud services. No two cloud services are exactly the same, and the risks increase as the use of cloud computing expands to new areas.
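As one small illustration of the encryption point, here is a minimal sketch of encrypting data client-side before it is handed to any cloud provider. It uses the third-party Python "cryptography" package; the payload is a placeholder, and a real deployment would keep the key in a key-management service rather than in code.

```python
# Minimal sketch: encrypt a payload before it leaves for a cloud service,
# so it stays protected while moving between applications and providers.
from cryptography.fernet import Fernet   # pip install cryptography

key = Fernet.generate_key()              # in practice, fetch from a key-management service
fernet = Fernet(key)

payload = b"customer record destined for cloud storage"
token = fernet.encrypt(payload)          # ciphertext safe to hand to a cloud provider

# On the receiving side (another application or cloud), decrypt with the same key.
assert fernet.decrypt(token) == payload
```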

This cloud computing security special report from ZDNet looks at some of the key issues and shows how cloud security is evolving. Moving to the cloud creates opportunities, but don't ignore the security challenges.

Originally posted here:
Cloud computing dominates. But security is now the biggest challenge - ZDNet

Read More..

Cloud Computing: Are Share Prices Heading Toward Zero, Or Is It An Opportunity To Buy? – Seeking Alpha


By Christopher Gannatti

The drawdown in many stocks focused on cloud computing software has been, in a word, unbelievable. In basically one month's time, from April 11 through May 11, the BVP Nasdaq Emerging Cloud Index (EMCLOUD), a group of cloud-oriented companies, has lost roughly 30% of its value.

In figure 1, we see:

Figure 1: The Drawdown in Cloud Computing Share Prices Has Been INTENSE

Knowing this, the primary question comes back to the following, which we can simplify into two outcomes:

While we are never able to view the future with certainty, the evidence that we can interpret today would tend to indicate that outcome #2 has a higher probability of becoming true.

The big players are still growing, and fast.

One of the risks we monitor in cloud computing is the biggest players shifting from engines of growth to something more like utilities: the concept being that everyone able to adopt cloud computing has already done so, so future growth stabilizes.

While it is true that not every cloud-focused company is involved in M&A, companies are still active on that front even amid the share-price turmoil of 2022.

Cloud Computing Stocks Are Still Delivering Elevated Growth Rates

Conclusion: The Cloud Business Model Is Still Robust Amid Substantially Lower Equity Valuations

Some of us might have thought that there has been so much discussion about Western central banks shifting policy, from extremely easy to extremely focused on mitigating the risk of runaway inflation, that this must already have been priced into equity markets. The recent behavior of software-oriented cloud computing companies tells us something different: adjustments are clearly still being made. Our bottom line is this: these subscription-oriented businesses are still largely growing their revenues, even if that growth is nowhere near what was seen during the pandemic period in 2020. Those with a time horizon of the next few months may face an extremely uncertain outcome. Those with a time horizon in the range of 5, 7 or 10 years, as long as the cloud business model continues to find favor, may see this downdraft as an interesting opportunity.

There are risks involved with investing, including possible loss of principal. Foreign investing involves currency, political and economic risk. Funds focusing on a single country, sector and/or funds that emphasize investments in smaller companies may experience greater price volatility. Investments in emerging markets, currency, fixed income and alternative investments include additional risks. Please see the prospectus for a discussion of risks.

Past performance is not indicative of future results. This material contains the opinions of the author, which are subject to change, and should not be considered or interpreted as a recommendation to participate in any particular trading strategy, or deemed to be an offer or sale of any investment product, and it should not be relied on as such. There is no guarantee that any strategies discussed will work under all market conditions. This material represents an assessment of the market environment at a specific time and is not intended to be a forecast of future events or a guarantee of future results. This material should not be relied upon as research or investment advice regarding any security in particular. The user of this information assumes the entire risk of any use made of the information provided herein. Neither WisdomTree nor its affiliates, nor Foreside Fund Services, LLC, or its affiliates provide tax or legal advice. Investors seeking tax or legal advice should consult their tax or legal advisor. Unless expressly stated otherwise, the opinions, interpretations or findings expressed herein do not necessarily represent the views of WisdomTree or any of its affiliates.

The MSCI information may only be used for your internal use, may not be reproduced or re-disseminated in any form, and may not be used as a basis for or a component of any financial instruments or products or indexes. None of the MSCI information is intended to constitute investment advice or a recommendation to make (or refrain from making) any kind of investment decision and may not be relied on as such. Historical data and analysis should not be taken as an indication or guarantee of any future performance analysis, forecast or prediction. The MSCI information is provided on an "as is" basis and the user of this information assumes the entire risk of any use made of this information. MSCI, each of its affiliates and each entity involved in compiling, computing or creating any MSCI information (collectively, the "MSCI Parties") expressly disclaims all warranties. With respect to this information, in no event shall any MSCI Party have any liability for any direct, indirect, special, incidental, punitive, consequential (including lost profits) or any other damages (www.msci.com).

Christopher Gannatti began at WisdomTree as a Research Analyst in December 2010, working directly with Jeremy Schwartz, CFA, Director of Research. In January 2014, he was promoted to Associate Director of Research, where he was responsible for leading different groups of analysts and strategists within the broader Research team at WisdomTree. In February 2018, Christopher was promoted to Head of Research, Europe, based out of WisdomTree's London office and responsible for the full WisdomTree research effort within the European market, as well as supporting the UCITS platform globally. Christopher came to WisdomTree from Lord Abbett, where he worked for four and a half years as a Regional Consultant. He received his MBA in Quantitative Finance, Accounting, and Economics from NYU's Stern School of Business in 2010, and his bachelor's degree in Economics from Colgate University in 2006. Christopher is a holder of the Chartered Financial Analyst designation.

Original Post


Here is the original post:
Cloud Computing: Are Share Prices Heading Toward Zero, Or Is It An Opportunity To Buy? - Seeking Alpha


How is cloud hosting better than other traditional hosting techniques? – Express Computer

By Manoj Dhanda, Founder and CTO, Microhost Cloud

In today's world, where data is the most valued resource, there are numerous practical problems around data storage and safety. Over time, traditional servers and data centers were unable to meet customers' booming hosting needs, which resulted in the emergence of the defining computing paradigm of modern times: cloud computing.

Cloud hosting is the gateway to scalable and secure web space used for data storage and management. As the name suggests, the cloud is a virtual space that hosts data and information on the internet, where it can be accessed from anywhere. Unlike traditional hosting, cloud-based hosting does not store data on a single server. Instead, it relies on a system of interconnected servers, bringing better organisation and more flexibility. But what else makes cloud computing so different from traditional hosting techniques? To understand, let us look at why data storage is shifting towards cloud computing.

The emergence of cloud hosting over traditional hosting

The evolution of cloud hosting began in the 1950s with the onset of mainframe computing, where multiple users shared a single central computer through dumb terminals. With the emergence of virtualization in the 1970s, running multiple virtual computers on a single physical machine became possible. The introduction of the virtual private network (VPN) brought the concept of shared access, which paved the way for various forms of cloud computing such as grid computing, utility computing and software as a service. The arrival of Salesforce.com in 1999 and Amazon Web Services in 2002 further signalled the era of the cloud ahead of us.

Why is cloud hosting the future?

We now know that cloud computing is an architectural development in computing services that is here to stay. Let us delve deeper to understand how cloud hosting triumphs over traditional hosting in managing data efficiently in the virtual space.

Enhanced server uptime: Higher server uptime improves your website's performance, and the absence of any single point of failure protects against system failure. If one server cannot perform, the workload shifts to another, which keeps your website running.

Secure network: Unlike traditional hosting, the cloud allows you to store your data, share resources, and secure them at various levels. Cloud hosting also supports data safety through customer identity management, data isolation and segregation, backups and firewalls, and secure, encrypted solutions (a minimal client-side encryption sketch follows this list).

Scalability of resources: Adding or removing resources in cloud hosting is simple. You can set up and allocate resources on your server immediately, and resources such as RAM, storage and bandwidth can be scaled up or down from the pool maintained across multiple servers. Additionally, you have full control over the server (unlike traditional hosting), making the cloud a flexible, multi-purpose tool for managing your business needs.

Cost-effective: Cloud hosting offers better value for money than traditional hosting systems. In traditional hosting, you pay a fixed amount for services whether or not you use them. With the cloud, you do not have to make a capital investment in the infrastructure itself; in simple terms, you only pay for the resources and services you actually use.

Global accessibility: Applications and information on the cloud can be accessed from anywhere, as the servers are reachable globally from a PC or mobile device with an internet connection. Moreover, the cloud allows you to collaborate in groups and work in teams to create and share tasks.

Latest technology: The latest and upcoming technologies are easier to leverage through the cloud. You can customize your software applications and keep them integrated with the latest software versions.
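
To illustrate the "secure & encrypted solutions" point above, here is a minimal sketch, not taken from the article, of encrypting data client-side with the Python cryptography library before it is handed to any cloud storage API. The file names and the choice to keep the key outside the cloud are assumptions for the example.

```python
# Illustrative sketch: encrypt data client-side before uploading it to any
# cloud storage service, so the provider only ever sees ciphertext.
# File names and key handling here are assumptions for the example.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, store this in a key manager, not alongside the data
cipher = Fernet(key)

with open("contracts.csv", "rb") as f:        # hypothetical local file
    ciphertext = cipher.encrypt(f.read())

with open("contracts.csv.enc", "wb") as f:    # this encrypted copy is what gets uploaded
    f.write(ciphertext)

# Later, after downloading the object back from the cloud:
with open("contracts.csv.enc", "rb") as f:
    plaintext = cipher.decrypt(f.read())
```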

You can discover an array of benefits when using the cloud to match your business needs. Therefore, it's safe to say that the cloud is the most promising of the many hosting solutions and is here to stay.


Continued here:
How is cloud hosting better than other traditional hosting techniques? - Express Computer


Cloud-Native Benefits Reduce the Cost and Effort of Litigation (Yay?) – PaymentsJournal

Microservices in containers are eliminating capital expenditures on computers and the need to keep machines on standby for peak loads, making specialized computing services such as AI, large-scale storage and analytics readily available, and lowering the cost of collecting data across multiple sources and languages. Cloud-native litigation software leverages these same benefits:

Software solutions developed in the serverless cloud are distinct from their monolithic predecessors. They can leverage serverless containers to scale their functions depending on load, without requiring intervention, allowing software developers to focus on solving business problems using the appliances provided by companies like Microsoft, Amazon or Google. If your company wanted to develop and introduce language translation into a product, for instance, you could implement it in six weeks instead of six months by taking advantage of existing tools.
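
As a hedged sketch of what "taking advantage of existing tools" can look like, the following AWS Lambda-style handler, offered as an illustration rather than any vendor's actual code, delegates translation to a managed service (Amazon Translate via boto3). The event fields and default target language are assumptions.

```python
# Illustrative sketch: a serverless (AWS Lambda-style) handler that adds
# language translation to a product by calling a managed translation
# service instead of building one. The event shape is an assumption.
import boto3

translate = boto3.client("translate")

def handler(event, context):
    text = event["text"]                      # hypothetical input field
    result = translate.translate_text(
        Text=text,
        SourceLanguageCode="auto",            # let the service detect the source language
        TargetLanguageCode=event.get("target", "en"),
    )
    return {"translated": result["TranslatedText"]}
```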

If my company had wanted to run deep-learning statistical algorithms twenty years ago, we would have had to invest in cost-prohibitive hardware. Now we use Azure Databricks for predictive algorithms in our litigation tech software, which is far more affordable but equally effective.
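
As a rough sketch of that kind of workload, and not the author's actual code, a predictive model can be trained on a managed Spark environment such as Azure Databricks in a few lines of PySpark. The table name, feature columns and label are assumptions for the example.

```python
# Illustrative sketch: training a simple predictive model on a managed
# Spark cluster (e.g., Azure Databricks). Table and column names are
# assumptions, not any vendor's actual schema.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("litigation-predictions").getOrCreate()

df = spark.read.table("documents_features")   # hypothetical feature table with a 0/1 "responsive" label

assembler = VectorAssembler(
    inputCols=["num_recipients", "attachment_count", "keyword_score"],
    outputCol="features",
)
train = assembler.transform(df)

model = LogisticRegression(featuresCol="features", labelCol="responsive").fit(train)
predictions = model.transform(train).select("doc_id", "prediction", "probability")
```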

Most of these serverless appliances have a pay-per-use model instead of requiring you to pay for provisioning, resulting in significant operational cost efficiencies. Serverless cloud computing gives a software product the ability to be agile, affordable and innovative.

The Democratization Of Discovery

Cloud-native software products can therefore solve cost and adaptability challenges for law firms. The same product can be used by both a small firm and a large firm without compromising feature robustness. Traditionally, cheaper products in this space would lack features and wouldn't be able to scale. Serverless cloud computing and cloud-native solutions shift that paradigm.

Without a large capital investment, small or medium-sized law firms can now take on a case with terabytes of data, use an unsupervised machine learning-driven early case assessment module and cull the data to focus only on responsive documents.
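
A minimal sketch of what an unsupervised early case assessment step might look like, offered as an illustration rather than any product's actual pipeline: cluster documents by TF-IDF similarity so reviewers can cull whole clusters that are clearly non-responsive. The load_documents helper and the cluster count are hypothetical.

```python
# Illustrative sketch: unsupervised early case assessment by clustering
# documents on TF-IDF features, so obviously non-responsive clusters can
# be culled before detailed review. Inputs and cluster count are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

documents = load_documents()    # hypothetical helper returning a list of document texts

tfidf = TfidfVectorizer(stop_words="english", max_features=50_000)
X = tfidf.fit_transform(documents)

kmeans = KMeans(n_clusters=20, random_state=0, n_init=10).fit(X)

# Reviewers sample each cluster; clusters judged non-responsive are culled wholesale.
for cluster_id in range(kmeans.n_clusters):
    members = [i for i, label in enumerate(kmeans.labels_) if label == cluster_id]
    print(f"cluster {cluster_id}: {len(members)} documents")
```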

Overview by Tim Sloane, VP, Payments Innovation at Mercator Advisory Group

The rest is here:
Cloud-Native Benefits Reduce the Cost and Effort of Litigation (Yay?) - PaymentsJournal
