SUPERNAP Thailand partners with Thai government agencies for leading Tier IV upgrades – DataCenterNews Asia

Thailand's national data center and cloud services have been upgraded to the international Tier IV standard through a partnership between National Telecom, the newly merged entity of state enterprises CAT Telecom and TOT, and provider SUPERNAP (Thailand).

The collaboration combines data center and communication technology to support the government's ongoing digital modernisation program (the Data Center and Cloud Services Partnership Program) by helping the national data center and cloud services reach the international Tier IV standard.

The partnership will allow National Telecom to provide 100% uptime for cloud services to governmental agencies in high-security data center premises under the certified international Tier IV standards.

Under the Memorandum of Understanding (MoU), the rapid deployment of the government's cloud will ensure all ministries, departments, and agencies have secure and efficient access to the information needed.

The MoU also says the technology will be implemented sustainably and be ready to scale up as required.

An increase in hybrid work and the need for more secure and extensive data management have been driving factors in the partnership, with the project described as a crucial national step towards improving the governmental cloud's quality, safety, and reliability.

This also aligns with the digital expansion accelerated by the pandemic in the past two years.

National Telecom acting president Group Captain Somsak Khaosuwan says that the cooperation with SUPERNAP (Thailand) will give NT more options and convenience in data center and cloud services.

"NT is a state-owned enterprise with the goal of driving and upgrading the telecommunication and digital services for the government, having the telecom infrastructure covering the whole country. Therefore, this cooperation brings together the strengths of both organisations to develop NT's service capabilities. We are confident that it will be able to meet the diverse needs and bring more benefits to customers," he says.

SUPERNAP will also be utilising the first carrier-neutral hyperscale data center in the country, which it says will also help strengthen government systems.

SUPERNAP (Thailand) CEO Sunita Bottse says that the company is excited and proud to deliver this partnership and technology and believes it will help the government focus on crucial challenges they may face when dealing with data management.

"Being the most advanced data center and cloud hosting provider, our facility has been built to support governments and corporates, while our services are made to facilitate the storage and process the mission-critical data assets with the highest security and no downtime," she says when alluding to their flagship center.

"It is an honor to work with NT, and to be part of their innovation program while supporting their sustainability goals by providing our digital infrastructure solutions, with international experience and local knowledge to support the government to innovate faster and scale on-demand."

To build the quantum internet, UChicago engineer teaches atoms how to remember – UChicago News

When the quantum internet arrives, researchers predict it will shift the computing landscape on a scale unseen in decades. In their estimation, it will make hacking a thing of the past. It will secure global power grids and voting systems. It will enable nearly limitless computing power and allow users to securely send information across vast distances.

But for Tian Zhong, assistant professor at the Pritzker School of Molecular Engineering (PME) at the University of Chicago, the most tantalizing benefits of the quantum internet have yet to be imagined.

Zhong is a quantum engineer working to create this new global network. In his mind, the full impact of the quantum internet may only be realized after it's been built. To understand his work and why the United States is spending $625 million on the new technology, it helps to consider the science behind it: quantum mechanics.

Quantum mechanics is a theory created to explain fundamental properties of matter, particularly on the subatomic scale. Its roots trace back to the late 19th and early 20th century, when scientists tried to explain the unusual nature of light, which behaves as both a wave and a particle. In the hundred years since then, physicists have learned a great deal, particularly concerning the strange behavior of subatomic particles.

They've learned, for example, that some subatomic particles have the ability to be in two states at the same time, a principle called superposition. Another such principle is entanglement, which is the ability of two particles to communicate instantaneously despite being separated by hundreds of miles.

Over time, scientists have found ways to manipulate those principles, entangling particles at will or controlling an electron's spin. That new control allows researchers to encode, send, and process information using subatomic particles, laying the foundations of quantum computing and the quantum internet.
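
The encoding step described here can be illustrated with a couple of lines of circuit-building code. This is a minimal sketch, assuming the open-source Qiskit library is installed; it is not drawn from Zhong's own hardware work, just a generic example of putting two qubits into an entangled (Bell) state.

```python
# Minimal illustration of superposition and entanglement, assuming Qiskit is installed.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)           # Hadamard gate: puts qubit 0 into a superposition of 0 and 1
qc.cx(0, 1)       # CNOT gate: entangles qubit 1 with qubit 0 (a Bell pair)
qc.measure([0, 1], [0, 1])  # measuring either qubit now fixes the other's outcome

print(qc.draw())  # prints an ASCII diagram of the circuit
```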

At the moment, both technologies are still hampered by certain physical limitations (quantum computers, for example, need to be kept in giant sub-zero freezers), but researchers like Zhong are optimistic those limitations will be resolved in the near future.

"We're at a juncture where this is no longer science fiction," Zhong said. "More and more, it's looking like this technology will emerge from laboratories any day, ready to be adopted by society."

Zhong's research focuses on the hardware needed to make the quantum internet a reality, things like quantum chips that encrypt and decrypt quantum information, and quantum repeaters that relay information across network lines. To create that hardware, Zhong and his team work on the subatomic scale, using individual atoms to hold information and single photons to transmit it through optic cables.

Zhong's current work centers on finding ways to fight against quantum decoherence, which is when information stored on a quantum system degrades to the point that it's no longer retrievable. Decoherence is an especially difficult obstacle to overcome because quantum states are extremely sensitive, and any outside force, be it heat, light, radiation, or vibration, can easily destroy them.

Most researchers address decoherence by keeping quantum computers at a temperature near absolute zero. But the instant any quantum state is transmitted outside the freezer, say on a network line, it begins to break down within a few microseconds, severely limiting the potential for expansive interconnectivity.

We asked a physicist whether The Witcher’s multiverse could really exist – The Verge

Those who plan on watching the second season of Netflix's The Witcher can look forward to plenty of epic monster battles, character development, and Henry Cavill staring broodingly into the middle distance. But season 2 also reveals a lot about the broader world that The Witcher takes place in, or more accurately, the many worlds.

Specifically, this darker and more serious chapter in the epic fantasy saga zooms in on a seminal event in the Witcher lore known as the conjunction of the spheres. During the conjunction, which took place approximately 1,500 years before the events of the show, a bunch of different spheres of reality collided with one another, causing elves, dwarves, humans, and monsters to all get mixed up together on the same continent, much to their mutual discontent.

While this cosmic collision is pure fantasy, there is a potentially scientific idea at its core: some physicists have proposed that our universe may really be just one in a much grander multiverse of realities. If that's true, it may even be possible for different universes to interact to some extent. These ideas are wildly controversial, with one camp of physicists arguing that the multiverse is more a matter of philosophy or religion than a fruitful terrain for scientific inquiry. Others say that since we can't rule out the existence of a multiverse, there's no harm in speculating about its nature.

With season 2 of The Witcher dropping on Netflix today, it felt like an apt time for some rampant speculation. To keep things as scientifically grounded as possible, The Verge chatted with University of California, San Diego cosmologist Brian Keating about some of the most mind-bending multiverse ideas physicists have proposed, where pop culture stretches these ideas beyond recognition, and the cosmic horizons we may never see past.

This conversation has been edited for length and clarity.

The Witcher is not alone in popularizing the idea of the multiverse. It's a big theme in the Marvel Cinematic Universe now; it's in Star Trek. Some physicists would say the multiverse is nothing but science fiction. But could you tell us a bit about why others think it might really exist?

Yeah. So the multiverse is kind of a natural extrapolation of what we call the Copernican principle, which is that where we are and who we are and what we are shouldn't be significant or aberrant. It should be kind of representative of the properties throughout all of reality. And just as there's many planets, there are many stars, there are many galaxies, and there are many clusters of galaxies, so, too, the logic would have one believe there is no reason to suspect that there should be only one universe. In fact, one of the foremost proponents of the multiverse paradigm, Andrei Linde, who's a professor at Stanford, claims that we shouldn't be biased in favor of a universe. That we should, in fact, start from this [idea] that there probably is a multiverse. And the notion has been extended by other people to really encompass all the different types of possibilities for the existence of more than one universe: a universe that is characterized by laws of physics, constants of nature, intelligent or conscious beings, and so forth.

The multiverse comes directly as a consequence of two very different branches of physics. One is cosmology, in particular what's called inflation, the theory of the ultra-high energy origin of the expansion of space and time that would later become our observable universe. And also from string theory, which predicts sort of a landscape of possible values for different fundamental constants and forces. So these two different fields, which aren't really associated with one another, both imply that there is the possibility for a multiverse. And as yet, there is a vast disagreement as to whether or not the multiverse actually is part of physics, or if it's pure philosophy. And if it's part of physics, how could one go about testing it or even falsifying its existence?

So to be clear, we have no direct evidence for the existence of a multiverse.

So the question is whether or not it's even in principle possible to provide evidence that supports a multiverse. And if such evidence can't be found, is it possible to rule out the existence of the multiverse? Because you might be living in a multiverse, but then you might not be able to detect that you're living in a multiverse the same way that bacteria in a petri dish can't detect that they live inside of a laboratory inside of a building inside of a planet. It's too remote from the sphere of reality that they have access to.

Now, there are people who propose there are ways to measure the possibility that we are in a multiverse. The one particular signature would be looking for an impact, or a collision with another universe, that would produce an observational pattern in the oldest light in the universe called the cosmic microwave background radiation, which is what I study. And that theory, or conjecture, is pretty wildly contested. It's not at all clear if you could categorically detect and therefore motivate the existence of the multiverse.

But there's no doubt in any physicists' minds that there are regions of our universe which are unobservable to us. And in that sense, you know, we already believe in a sort of multiverse. But then, extending and adding new features onto that multiverse, from inflation or from string theory, that's where things get very controversial.

As I said a moment ago, we've seen a lot of depictions of the idea of the multiverse in pop culture. I think what makes The Witcher stand out a little bit is it's not just positing there are all these different universes out there in their own separate bubbles, but that universes have collided with each other. There's a historical, cataclysmic collision that sort of sets the stage for the events of the series. In the context of a multiverse, do physicists have any ideas as to whether, or how, different universes might interact?

First of all, physicists aren't in agreement that the multiverse is a serious scientific paradigm worthy of discussion. A lot of people believe it's not. On the other hand, if you do take it seriously, then you can ask questions about it. But then it's not clear whether or not there's any evidence, or set of evidence, that could prove it wrong. Because you could say, well, we thought this was evidence for the multiverse. But actually, in the multiverse, since anything is possible to happen, you can get any range of predictions that you want. And so it's kind of unsatisfying. It's like eating cosmic Wonder Bread.

In the context where these universes collide, it could be, just as light travels at a finite speed, the Sun could disappear right now, and we wouldn't see it for eight minutes. So it's not possible to say something is ruled out just by not seeing it.

So there could be a multiverse. It could be one light-year away from us, in a certain sense, in which case next year we'll see it. It could be 10^50 light-years away from us, in which case we'll never see it. So it could turn out, yes, tomorrow we impact a universe that's one light-year away from us. But the thing I would gently push back on is that the notion of a collision nucleating some vast explosion is not at all clear.

For example, we know for sure we will eventually collide with the Andromeda galaxy, which is our nearest neighbor galaxy. It's almost like a twin sister of ours, and it has almost the same number of stars, hundreds of billions of stars. It's even more massive than ours, and it's one of the few galaxies we're moving towards rather than expanding away from, according to Hubble. That galaxy will someday crash into our galaxy, but it's not like every single point will collide and each star will hit another star. In fact, they'll mostly pass right through each other. So if a galaxy, which is billions of times more dense than the universe on average, can pass right through another galaxy, all the more so a universe could pass through another universe in a certain sense.

So I think it's artistic license to suggest that that could nucleate some fireworks. But I admit it's pretty cute.

Could there be any sorts of interactions between different universes?

Yeah, in fact, one of the ways you might see the impact of a universe adjacent to ours is that it might have a gravitational force that deflects the light traveling in our universe. But all of this would be taking place at the boundary of what we can see just today. In other words, it wouldn't be happening to us. It would be happening 45 billion light-years away from us and we would just be seeing it now [Editor's note: 45 billion light years is the approximate radius of the observable universe]. Unless you're talking about some interdimensional wormhole between different universes, and that's incredibly speculative.

And some of the problems with these physical phenomena when applied to science fiction, like wormholes and other things, is that they're barely at the level of speculation. They're completely removed from testability in laboratory settings. They're mathematical possibilities. But as I always say, mathematics allows the possibility for infinity. You know, just divide one by zero. But there is nothing that we know about in the universe that's infinite. Nothing that has infinite temperature, density, pressure, energy, etc. So just because something is mathematically possible doesn't mean it has any physical relevance. So I don't want to be a downer. But the reality is, yes, it is possible to witness the effects of another universe interacting with ours. But it would be occurring not here, but a very, very distant there.

So it sounds like a physically plausible story about the multiverse would not have a lot of cool stuff to look at.

Well, yeah, it's like saying, you know, a black hole or a wormhole is possible. Of course, we measure black holes, but we don't measure any near us, right? There's not one that we can kind of play with and jump into and then pop out, you know, in the Andromeda galaxy, even, let alone in another universe.

And by the way, if the laws of physics change from universe to universe, it's not at all clear that the laws of mathematics, or the laws of logic, would be forbidden from changing. In other words, you get into a wormhole in our universe. You pop out in another universe. Well, the laws of wormholes are based on the laws of black holes, which are the consequence of general relativity, which is a consequence of partial differential equations, which is a consequence of calculus, which is a consequence of real numbers. And who knows if there's such a thing as real numbers in another universe? Just as the old joke goes, an old fish swims by two young fish and says to them, "How's the water?" And they say, "What's water?" They have no concept of it. It's so alien to their existence that they can't even contemplate it. And there's no reason to be chauvinistic, to think it would be like our universe.

I feel like we see that idea represented at least a little bit allegorically in science fiction, and this is true in The Witcher as well, in how when beings move from universe to universe, they often can't survive in the other universe for an extended period of time because it's fundamentally so different.

In my first book, Losing the Nobel Prize, I made this kind of analogy which I called the petriverse. So imagine there's some bacterium and it's in a petri dish and it starts making a colony. That bacterium, if it was very smart, could realize that there's a possibility for another colony really far away from it to exist because it has the agar gel and it has gravity and sunlight and whatever. It could deduce that there is a possibility for another universe in the petriverse, and actually some of these other colonies, when they do form, even though they are only a couple of centimeters away, they produce toxins that prevent other bacteria from invading their space. So it's like a barrier that makes it inhospitable and hostile to the existence of hopping between universes, just like what you described.

What sorts of advances in physics could we make in the coming decades that might shed light on this question of whether the multiverse is real?

I think the field that I'm studying, which is the cosmic microwave background, the key observable, and what we're trying to discover, is unequivocal evidence that inflation took place. And if inflation took place, that would come concomitantly with the multiverse in most physicists' anticipation. They go as a direct consequence. If you discover these waves of gravity embedded in the cosmic microwave background, then you would get a very strong piece of evidence that would seem to mandate the multiverse exists. [Editor's note: Keating later clarified that this would be perhaps the strongest circumstantial evidence possible for a multiverse.] On the other hand, it may be that inflation took place, but it's too weak to produce observable gravitational waves, in which case you might need to wait till a future version of the LIGO [the Laser Interferometer Gravitational-Wave Observatory] experiment in space called LISA. And that could potentially take us back and show us evidence of the fundamental origination from, perhaps, the surrounding multiverse.

And I should point out there's other ways that multiverses [could exist]. There's a quantum mechanical version of the multiverse called the many worlds interpretation. And that's that at every possible moment of time, every possible choice, every possible observable, is instantiated. But we only observe one particular outcome for each observation because we're sort of coherently oscillating with those quantum mechanical wave functions, and therefore we can observe them. Those are kind of parallel universes going on right now. So if I turn my head to the right or the left, there's a whole universe where Brian turned his head to the left. So that's a version of the multiverse. There's also a version of the multiverse where the universe is cyclical in a certain sense; it's coming into existence, it's coming out of existence in a collapse. It's reemerging, and it's kind of growing, and then that universe collapses. So that's kind of a temporal multiverse. And those kinds of models have been around since antiquity.

I would say it's hard to find a model of cosmology that doesn't have some version of a multiverse in it, whether that's temporal or spatial, or spatial and temporal, or quantum mechanical. So there are hopes that one could get some confidence from measuring aspects of quantum mechanics. And then there's the cosmic and gravitational wave experiments that I do. And then, perhaps, if string theory were to make much more concrete predictions. So I think there's a lot more theoretical advances that need to be made, a lot more experimental [advances]. But fundamentally, we may never be able to prove it wrong. In other words, you ruled out 10^499 different universes but you didn't rule out this one. And these observations therefore become what's called unfalsifiable. In which case you can't prove that inflation's wrong, but you also can't prove that the alternatives are right. And in that case, all hope would be lost. You can't prove it using an experiment or evidence, you can only prove it on Twitter or something.

Sounds like the multiverse is going to continue to fuel physics beef for many years to come.

Yes. I always say, inflation for economists means one thing. But for us, the multiverse ensures full employment for cosmologists for years to come. And for science fiction.

Einstein's theory holds up after 16-year test – Cosmos Magazine

An international team have put Einstein's theory through the wringer with the help of a double pulsar system and 16 years of rigorous testing.

Using telescopes from around the world, including CSIRO's Murriyang radio telescope at Parkes in Australia, they found that Einstein's general theory of relativity, originally published back in 1915, still holds true today.

But what is general relativity anyway? In short, it's the description of gravity used in modern physics: gravity as a geometric property of space and time, also known as four-dimensional spacetime. However, while it is the simplest theory consistent with experimental data at large, astronomical scales, general relativity cannot be reconciled with the laws of quantum physics at the very smallest scales of our universe.

According to research team member Dr Dick Manchester, a fellow at Australia's national science agency, CSIRO, this result helps to better refine our understanding of the universe.

"The theory of general relativity describes how gravity works at large scales in the universe, but it breaks down at the atomic scale where quantum mechanics reigns supreme," says Manchester.

"We needed to find ways of testing Einstein's theory at an intermediate scale, to see if it still holds true. Fortunately, just the right cosmic laboratory, known as the double pulsar, was found using the Parkes telescope in 2003."

"Our observations of the double pulsar over the past 16 years proved to be amazingly consistent with Einstein's general theory of relativity, within 99.99 per cent to be precise," he says.

But how did they do it?

The first binary pulsar system, PSR B1913+16, was identified in 1975, but both it and those found subsequently comprised a pulsar and a star orbiting each other. The double pulsar system PSR J0737-3039A/B was spotted in 2003 and remains the only system yet found that contains two pulsars in a binary orbit, which offers a rare opportunity to test general relativity.

A double pulsar system acts kind of like a clock with a ticking second hand. The two orbiting pulsars, dense neutron stars, create very strong gravitational fields and emit radio waves at a regular time interval (the second hand, in this analogy). They also have very stable and very speedy rotation times.

The stars in the double pulsar system complete an orbit every 2.5 hours, with one pulsar rotating 45 times each second while the other spins just 2.8 times per second.

The two pulsars are predicted to collide in 85 million years' time as, according to general relativity, the extreme accelerations in the system strain the fabric of space-time and send out ripples that will slow it down. But with such a long time scale, this energy loss is difficult to detect.

Fortunately, the clock-like ticks of the radio waves coming from the spinning pulsars are perfect tools to trace these tiny changes. Pulsars with a stable rotation enable the measurement of minuscule variations in the arrival times of those ticks, which can be used to test gravitational theories.

Research team member Associate Professor Adam Deller, from Swinburne University of Technology and the ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav), explains that these ticks take around 2,400 years to reach Earth.

"We modelled the precise arrival times of more than 20 billion of these clock ticks over 16 years," says Deller.
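
As a rough plausibility check on that figure (our arithmetic, not the team's analysis), the quoted spin rates of roughly 45 and 2.8 rotations per second, accumulated over 16 years, do add up to a few tens of billions of pulses:

```python
# Back-of-the-envelope check of the "more than 20 billion clock ticks" figure,
# using the spin rates quoted earlier in the article.
SECONDS_PER_YEAR = 365.25 * 24 * 3600
years = 16

pulses_a = 45.0 * SECONDS_PER_YEAR * years  # pulsar A: ~45 rotations per second
pulses_b = 2.8 * SECONDS_PER_YEAR * years   # pulsar B: ~2.8 rotations per second

print(f"total pulses over {years} years: {pulses_a + pulses_b:.2e}")  # ~2.4e10
```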

"That still wasn't enough to tell us how far away the stars are, and we needed to know that to test general relativity."

By adding in data from the Very Long Baseline Array, a network of telescopes spread across the globe, the research team was able to spot a tiny wobble in the stars' positions every year, which could be used to determine their distance from Earth.

Perhaps to the disappointment of the researchers, the end result showed that Einstein's theory held: results were 99.99% in accordance with the predictions of general relativity.

The double pulsar system remains a unique tool for testing gravitational theories, and the team plans to continue to use it to poke at Einstein's theory.

"We'll be back in the future using new radio telescopes and new data analysis, hoping to spot a weakness in general relativity that will lead us to an even better gravitational theory," says Deller.

Quantum computing use cases are getting real–what you need to know – McKinsey

Accelerating advances in quantum computing are serving as powerful reminders that the technology is rapidly advancing toward commercial viability. In just the past few months, for example, a research center in Japan announced a breakthrough in entangling qubits (the basic unit of information in quantum computing, akin to bits in conventional computers) that could improve error correction in quantum systems and potentially make large-scale quantum computers possible. And one company in Australia has developed software that has been shown in experiments to improve the performance of any quantum-computing hardware.

As breakthroughs accelerate, investment dollars are pouring in, and quantum-computing start-ups are proliferating. Major technology companies continue to develop their quantum capabilities as well: companies such as Alibaba, Amazon, IBM, Google, and Microsoft have already launched commercial quantum-computing cloud services.

Of course, all this activity does not necessarily translate into commercial results. While quantum computing promises to help businesses solve problems that are beyond the reach and speed of conventional high-performance computers, use cases are largely experimental and hypothetical at this early stage. Indeed, experts are still debating the most foundational topics for the field (for more on these open questions, see the sidebar "Debates in quantum computing").

Still, the activity suggests that chief information officers and other leaders who have been keeping an eye out for quantum-computing news can no longer be mere bystanders. Leaders should start to formulate their quantum-computing strategies, especially in industries, such as pharmaceuticals, that may reap the early benefits of commercial quantum computing. Change may come as early as 2030, as several companies predict they will launch usable quantum systems by that time.

To help leaders start planning, we conducted extensive research and interviewed 47 experts around the globe about quantum hardware, software, and applications; the emerging quantum-computing ecosystem; possible business use cases; and the most important drivers of the quantum-computing market. In the report "Quantum computing: An emerging ecosystem and industry use cases," we discuss the evolution of the quantum-computing industry and dive into the technology's possible commercial uses in pharmaceuticals, chemicals, automotive, and finance, fields that may derive significant value from quantum computing in the near term. We then outline a path forward and how industry decision makers can start their efforts in quantum computing.

An ecosystem that can sustain a quantum-computing industry has begun to unfold. Our research indicates that the value at stake for quantum-computing players is nearly $80 billion (not to be confused with the value that quantum-computing use cases could generate).

Because quantum computing is still a young field, the majority of funding for basic research in the area still comes from public sources (Exhibit 1).

However, private funding is increasing rapidly. In 2021 alone, announced investments in quantum-computing start-ups have surpassed $1.7 billion, more than double the amount raised in 2020 (Exhibit 2). We expect private funding to continue increasing significantly as quantum-computing commercialization gains traction.

Hardware is a significant bottleneck in the ecosystem. The challenge is both technical and structural. First, there is the matter of scaling the number of qubits in a quantum computer while achieving a sufficient level of qubit quality. Hardware also has a high barrier to entry because it requires a rare combination of capital, experience in experimental and theoretical quantum physics, and deep knowledge, especially domain knowledge of the relevant options for implementation.

Multiple quantum-computing hardware platforms are under development. The most important milestone will be the achievement of fully error-corrected, fault-tolerant quantum computing, without which a quantum computer cannot provide exact, mathematically accurate results (Exhibit 3).

Experts disagree on whether quantum computers can create significant business value before they are fully fault tolerant. However, many say that imperfect fault tolerance does not necessarily make quantum-computing systems unusable.

When might we reach fault tolerance? Most hardware players are hesitant to reveal their development road maps, but a few have publicly shared their plans. Five manufacturers have announced plans to have fault-tolerant quantum-computing hardware by 2030. If this timeline holds, the industry will likely establish a clear quantum advantage for many use cases by then.

The number of software-focused start-ups is increasing faster than any other segment of the quantum-computing value chain. In software, industry participants currently offer customized services and aim to develop turnkey services when the industry is more mature. As quantum-computing software continues to develop, organizations will be able to upgrade their software tools and eventually use fully quantum tools. In the meantime, quantum computing requires a new programming paradigm and software stack. To build communities of developers around their offerings, the larger industry participants often provide their software-development kits free of charge.

In the end, cloud-based quantum-computing services may become the most valuable part of the ecosystem and can create outsize rewards for those who control them. Most providers of cloud-computing services now offer access to quantum computers on their platforms, which allows potential users to experiment with the technology. Since personal or mobile quantum computing is unlikely this decade, the cloud may be the main way for early users to experience the technology until the larger ecosystem matures.

Most known use cases fit into four archetypes: quantum simulation, quantum linear algebra for AI and machine learning, quantum optimization and search, and quantum factorization. We describe these fully in the report, as well as outline questions leaders should consider as they evaluate potential use cases.

We focus on potential use cases in a few industries that research suggests could reap the greatest short-term benefits from the technology: pharmaceuticals, chemicals, automotive, and finance. Collectively (and conservatively), the value at stake for these industries could be between roughly $300 billion and $700 billion (Exhibit 4).

Quantum computing has the potential to revolutionize the research and development of molecular structures in the biopharmaceuticals industry as well as provide value in production and further down the value chain. In R&D, for example, new drugs take an average of $2 billion and more than ten years to reach the market after discovery. Quantum computing could make R&D dramatically faster and more targeted and precise by making target identification, drug design, and toxicity testing less dependent on trial and error and therefore more efficient. A faster R&D timeline could get products to the right patients more quickly and more efficiently; in short, it would improve more patients' quality of life. Production, logistics, and supply chain could also benefit from quantum computing. While it is difficult to estimate how much revenue or patient impact such advances could create, in a $1.5 trillion industry with average margins in earnings before interest and taxes (EBIT) of 16 percent (by our calculations), even a 1 to 5 percent revenue increase would result in $15 billion to $75 billion of additional revenues and $2 billion to $12 billion in EBIT.

Quantum computing can improve R&D, production, and supply-chain optimization in chemicals. Consider that quantum computing can be used in production to improve catalyst designs. New and improved catalysts, for example, could enable energy savings on existing production processes (a single catalyst can produce up to 15 percent in efficiency gains), and innovative catalysts may enable the replacement of petrochemicals by more sustainable feedstock or the breakdown of carbon for CO2 usage. In the context of the chemicals industry, which spends $800 billion on production every year (half of which relies on catalysis), a realistic 5 to 10 percent efficiency gain would mean a gain of $20 billion to $40 billion in value.

The automotive industry can benefit from quantum computing in its R&D, product design, supply-chain management, production, and mobility and traffic management. The technology could, for example, be applied to decrease manufacturing process-related costs and shorten cycle times by optimizing elements such as path planning in complex multirobot processes (the path a robot follows to complete a task), including welding, gluing, and painting. Even a 2 to 5 percent productivity gain, in the context of an industry that spends $500 billion per year on manufacturing costs, would create $10 billion to $25 billion of value per year.
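
The value-at-stake estimates in the preceding paragraphs are simple percentage calculations on industry-level figures. The sketch below only reproduces that arithmetic from the numbers quoted in the report summary; it adds no data of its own.

```python
# Reproduces the back-of-the-envelope value estimates quoted above (all figures
# are from the report summary; this only redoes the arithmetic, in $ billions).
def value_range(base_bn, low_pct, high_pct):
    return base_bn * low_pct / 100, base_bn * high_pct / 100

pharma_revenue = value_range(1_500, 1, 5)                        # $1.5T industry, 1-5% uplift -> ($15B, $75B)
pharma_ebit = tuple(round(r * 0.16, 1) for r in pharma_revenue)  # 16% EBIT margin -> (~$2B, ~$12B)
chemicals = value_range(800 * 0.5, 5, 10)                        # half of $800B spend relies on catalysis, 5-10% gain -> ($20B, $40B)
automotive = value_range(500, 2, 5)                              # $500B manufacturing spend, 2-5% gain -> ($10B, $25B)

print(pharma_revenue, pharma_ebit, chemicals, automotive)
```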

Finally, quantum-computing use cases in finance are a bit further in the future, and the advantages of possible short-term uses are speculative. However, we believe that the most promising use cases of quantum computing in finance are in portfolio and risk management. For example, efficiently quantum-optimized loan portfolios that focus on collateral could allow lenders to improve their offerings, possibly lowering interest rates and freeing up capital. It is early, and complicated, to estimate the value potential of quantum computing-enhanced collateral management, but as of 2021, the global lending market stands at $6.9 trillion, which suggests significant potential impact from quantum optimization.

In the meantime, business leaders in every sector should prepare for the maturation of quantum computing.

Until about 2030, we believe that quantum-computing use cases will have a hybrid operating model that is a cross between quantum and conventional high-performance computing. For example, conventional high-performance computers may benefit from quantum-inspired algorithms.

Beyond 2030, intense ongoing research by private companies and public institutions will remain vital to improve quantum hardware and enable more, and more complex, use cases. Six key factors (funding, accessibility, standardization, industry consortia, talent, and digital infrastructure) will determine the technology's path to commercialization.

Leaders outside the quantum-computing industry can take five concrete steps to prepare for the maturation of quantum computing:

Leaders in every industry have an uncommon opportunity to stay alert to a generation-defining technology. Strategic insights and soaring business value could be the prize.

Ten ways Fermilab advanced science and technology in 2021 – Fermi National Accelerator Laboratory

Researchers from more than 50 countries collaborate with the U.S. Department of Energy's Fermi National Accelerator Laboratory to develop state-of-the-art technologies and solve the mysteries of matter, energy, space and time.

Here is a look at 10 ways they advanced science and technology in 2021. Fermilab and its partners:

The long-awaited first results from the Muon g-2 experiment at Fermilab show fundamental particles called muons behaving in a way that is not predicted by scientists' best theory, the Standard Model of particle physics. The landmark result, made with unprecedented precision, confirms a discrepancy that has been gnawing at researchers for decades. It indicates that muons could be interacting with yet undiscovered particles or forces.

The Deep Underground Neutrino Experiment is an international flagship experiment to unlock the mysteries of neutrinos. In May, construction crews started lowering equipment a mile underground and began the excavation of space for the South Dakota portion of the Long-Baseline Neutrino Facility. Scientists and engineers in the UK began the production and testing of components for the first large neutrino detector module to be installed in the facility. The DUNE collaboration also published the design of the near detector. A prototype component of the near detector traveled from the University of Bern to Fermilab for testing with the lab's neutrino beam.

The PIP-II project team and its national and international partners are getting ready for the construction of the new, highly anticipated particle accelerator at Fermilab. Earlier this year, testing wrapped up at the PIP-II Injector Test Facility. The successful outcome paves the way for the construction of the 700-foot-long PIP-II accelerator, which will power record-breaking neutrino beams and drive a broad physics research program at Fermilab. Construction of the PIP-II Cryogenic Plant Building began in August 2020, and the structure of the building now is largely complete. The building will house utilities as well as cryogenic equipment for the new machine. Efforts to use machine learning for the operation of PIP-II and other Fermilab accelerators are underway as well.

In 2021, Fermilab scientists were co-authors of more than 600 scientific articles, advancing our understanding of energy, matter, space and time and the technologies that drive these discoveries. Top achievements include results from MicroBooNE (a neutrino experiment that looks for evidence for sterile neutrinos) and NOvA (which aims to decipher the neutrino mass ordering); the search for stealthy supersymmetry with the CMS experiment at the Large Hadron Collider; dozens of papers on the Dark Energy Survey, including the most precise measurements of the universe's composition and growth to date; the discovery of performance-limiting nanohydrides in superconducting qubits; and the world's fastest magnetic ramping rates for particle accelerator magnets, made with energy-efficient, high-temperature superconducting material.

In Dec. 2020, the U.S. Department of Energy formally approved the full U.S. contribution to the high-luminosity upgrade of the Large Hadron Collider, or HL-LHC, at the European laboratory CERN. Led by Fermilab, collaborators will contribute 16 magnets to focus the LHC's near-light-speed particle beams to a tiny volume before colliding. They will also deliver eight superconducting cavities, radio-frequency devices designed to manipulate the powerful beams. (They will also provide four spare magnets and two spare cavities.) The new instruments will enable a 10-fold increase in the number of particle collisions at the future HL-LHC compared to the current LHC. Together with upgrades to the CMS detector, the accelerator upgrade will enable physicists to study particles such as the Higgs boson in greater detail. And the increase in the number of collisions could also uncover rare physics phenomena or signs of new physics.

Advances in particle physics and quantum information science are tightly connected. Together with their partners in industry and collaborating institutions, Fermilab scientists and engineers used their expertise to advance quantum research. Examples include: taking the first steps toward building a quantum computer, designing quantum sensors for dark matter research, developing quantum algorithms, implementing artificial intelligence on a microchip, and building and testing the components of a quantum internet. They are all part of the many facets of quantum science at Fermilab. This also includes the construction of the MAGIS-100 experiment, which will use free-falling atoms to probe dark matter, gravity and quantum science.

In addition to the construction projects mentioned above, Fermilab and its collaborators worked on the assembly of the ICARUS detector and the Short-Baseline Near Detector, both part of the Short-Baseline Neutrino Program to investigate neutrino oscillations and look for new physics. ICARUS scientists saw the first particles in their neutrino detector earlier this year, and SBND scientists are working on the assembly of their liquid-argon detector. Scientists are also working on the magnets and detectors for the Mu2e experiment, which will look for the direct conversion of a muon into an electron. If observed, it would signal the existence of new particles or new forces of nature.

In collaboration with partners at other DOE national labs, Fermilab researchers designed, built and delivered superconducting accelerator cryomodules for upgrading the world's most powerful X-ray laser, the Linac Coherent Light Source at SLAC National Accelerator Laboratory. With these LCLS-II upgrades, biologists, chemists and physicists will be able to probe the evolution of systems on a molecular level, forming flipbooks of molecular processes using one million X-ray pulses every second. Another upgrade to provide higher-energy particle beams, called LCLS-II-HE, is underway to enable even more precise atomic X-ray mapping. Tests of a verification cryomodule built at Fermilab achieved records far beyond current cryomodule specifications and should result in a 30% improvement compared to LCLS-II.

Fermilab is committed to attracting, developing and retaining diverse talent and cultivating an inclusive work environment that supports scientific, technological and operational excellence. The new Carolyn B. Parker Fellowship, named for the first African-American woman to earn a postgraduate degree in physics, presents an opportunity for Black and African-American postdoctoral scholars to work for up to five years in the Superconducting Quantum Materials and Systems Center at Fermilab. The new Sylvester James Gates, Jr. Fellowship offers a five-year appointment in the Theory Division at Fermilab. It prioritizes the inclusion of first-generation college graduates and the representation of historically and contemporarily minoritized individuals underrepresented in theoretical physics. The new Accelerator Science Program to Increase Representation in Engineering, or ASPIRE, fellowship provides students with immersive learning experiences on world leading particle accelerator projects at Fermilab. The fellowship is for undergraduate and graduate engineering students in underrepresented groups. To learn about other programs at Fermilab to increase diversity and inclusion in science, technology, engineering and math, visit our website.

Fermilab offered many online STEM education and outreach programs in 2021. Live programs included the virtual Family Open House, Ask a Scientist, the STEM career fair, the Saturday Morning Physics program and the Arts and Lecture at Home series, including virtual art exhibits. They attracted viewers from different communities and backgrounds from around the world. Tens of thousands also have watched lectures, physics slams and other programs on the Fermilab YouTube channel. Almost 600,000 people are now subscribed to this channel, which is known for the popular science explainer videos featuring Fermilab scientist Don Lincoln. This year, the channel also launched the new Even Bananas series with Kirsty Duffy and other scientists, who explain the mysteries of the neutrino.

Fermi National Accelerator Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

Five Undeniable Scientific Proofs That Santa Is Definitely Real – IFLScience

With Christmas comes a most welcome visitor: Santa Claus, Father Christmas, Saint Nicholas, call him what you will, he only works one night a year but boy does he deliver. And how do we thank him? We deny his existence! Call him things like "a holiday folk myth" or "literally impossible given the laws of physics"!

Well, we at IFLScience have had enough of this disrespect. Santa is real, and we have the receipts. Here are five arguments, from all branches of science and philosophy, that prove that Santa Claus really is Coming To Town.

Logic

Not only is it easy to prove that Santa exists, it's also quick; in fact, we're going to do it in two sentences. Ready?

1. Everything in this list is false;

2. Santa exists.

From those two statements, it follows that Santa is real.

Let us explain: either statement 1 is true, or it's false. If it's true, then everything in the list is false, which means statement 1 is false. But this is a contradiction: we started by assuming that statement 1 is true. Clearly, this is nonsense: a statement can't be both true and false. The only option is that statement 1 isn't true at all.

But if statement 1 is false, that means that at least something in the list is true. We know that statement 1 is false, so the only remaining option is for statement 2 to be true: Santa exists, QED.
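
For the curious, the case analysis above can be spelled out mechanically. The sketch below simply enumerates the four possible truth-value assignments and keeps the ones where statement 1's truth value matches what it claims; it formalizes the article's argument rather than endorsing it.

```python
# Enumerate truth values for the two statements: statement 1 ("everything in
# this list is false") and statement 2 ("Santa exists").
from itertools import product

for s1, s2 in product([True, False], repeat=2):
    claim_of_s1 = (not s1) and (not s2)   # what statement 1 asserts about the list
    if s1 == claim_of_s1:                 # keep only self-consistent assignments
        print(f"statement 1 = {s1}, Santa exists = {s2}")

# Only one assignment survives: statement 1 = False, Santa exists = True.
```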

(While this proof is obviously water-tight, the more persnickety among you might want to look up the Liar Paradox to understand why other arguments similar to this one aren't as convincing as they initially seem; basically, the first statement is self-referential.

"Mathematically speaking, all this is a bit naughty," wrote mathematician Hannah Fry in her 2017 book The Indisputable Existence of Santa Claus. "Self-referential statements like these don't actually have to be true or false, which resolves the paradox."

The liar paradox isn't just a handy way to prove the haters wrong on Santa; it has some pretty astonishing philosophical repercussions too. Using some incredibly abstract mathematics, in the 1930s the logician Alfred Tarski tried to find a definition of truth that would be, as he put it, "adequate", typically understated mathematician-speak for "irrefutable". Instead, thanks to the liar paradox, he accidentally proved what is now called Tarski's theorem on the undefinability of truth, which is, you imagine, the one thing he didn't want to happen.)

Quantum Physics

Of course, despite all these cast-iron proofs, some scientifically uninformed individuals still find reasons to doubt the existence of Santa. One of the most common reasons given for this apostasy is, on the face of it, fairly convincing: how, people ask, could anybody deliver all those presents in one night without being seen or heard?

Little do these doubters realize, however, that science has long known the answer to Santa's apparently super-human courier skills: it's simple quantum physics.

Let us explain: we already know a lot about how Santa would need to travel on Christmas Eve to get the presents to every child who expects them.

Assuming he has the good sense to travel East to West, we know not just his direction but his speed: eight hours of nighttime spread over 24 time zones gives 31 hours to complete the job. From census data, we can estimate the number of children he has to get to: around 850 million.

That gives Santa a minimum speed of around 300,000 kilometers per second, according to science author Roger Highfield, which is pretty fast. Actually, it's incredibly fast: at more than 6,000 times the speed of sound, Santa would be slammed back in his sleigh by forces more than 17,500 times stronger than gravity, and as for Rudolph, well, according to one calculation, at those speeds the friction from the atmosphere would vaporize Santa's faithful friends in less than one two-hundredth of a second, taking out more than 214,000 reindeer before the night's work was complete.
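
For a sense of scale, the figures already quoted above (850 million children and a 31-hour night) imply the delivery rate computed below; Highfield's 300,000 km/s speed additionally depends on assumptions about the route length between homes, which we do not reproduce here.

```python
# Delivery rate implied by the figures quoted above (an illustration only).
children = 850_000_000          # children expecting presents
hours_available = 31            # nighttime hours available when travelling east to west
seconds_available = hours_available * 3600

print(f"deliveries per second: {children / seconds_available:,.0f}")  # roughly 7,600
```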

We know, that sounds pretty grim, and hardly festive. Luckily, though, those figures contradict the facts on record, so clearly classical mechanics doesn't hold the answer.

But with quantum mechanics, everything falls into place.

"In quantum mechanics, the Heisenberg uncertainty principle tells us that if we know one variable well, we cannot know the other one exactly," explained high-energy physicist Daniel Tapia Takaki to the BBC. "We know what speed Santa will be travelling, but not his position."

Thanks to quantum physics' inherent unavoidable weirdness, visiting every home in one night may be possible, Tapia Takaki explained; it just requires that Santa be a superposition of quantum states, in other words a collection of Santas diffused all across the planet.

Once we know Santa obeys quantum rules rather than classical ones, it makes a lot more sense why we've never seen him in person. If a child happened to spot him on his mission, Tapia Takaki said, the uncertainty principle would no longer apply.

"You would know his exact position," he explained, "which would cause the quantum state to collapse and no more presents could be distributed."

Cosmology

Fine, so you haven't been convinced so far; well, this proof is cast-iron. We know Santa exists for one simple reason: we can see him.

Granted, he's not exactly how the festive ads show him: for one thing, he's a couple hundred trillion square kilometers big. Also, he's something like two million degrees in temperature, which at least explains why he spends so much time hanging out at the North Pole: the man just needs to cool down.

On that note: those cottages at the North Pole and Lapland must be vacation homes, because it turns out Santa's natural habitat is actually in the southwest corner of the Orion Nebula.

See him there? With his little hat on?

What you're looking at is a massive cloud of incredibly hot gas that was formed after the wind from a star forty times the mass of our own sun smashed violently into the dense gas that surrounded it. It was discovered in 2007, less than a month before Christmas, an early present for astronomers, as the press release from the European Space Agency said at the time.

The Orion Nebula isn't the only festive part of the night sky. Santa's shadow can be seen in the Tarantula Nebula; creepy crawlies need presents too, we suppose:

This is clearly the great man's face here in the nebula IC 2118. Yes, we know it's technically known as the Witch Head Nebula, but look at that gigantic space face and tell us that's not a wispy beard on the chin.

Archaeology

You know, Santa wasn't always Santa. He used to be just a regular Joe Schmo from a town which is called Demre now but used to be called Myra, in what is now Turkey.

He may have been born around 1,700 years ago, but we have more than just centuries-old stories to support his existence. Thanks to the morbid traditions of the Orthodox and Catholic churches, there are quite a few bits of bodies around the world that people claim to have come from the real-life Saint Nick, but one of them, a piece of pelvis found in a Catholic church in Illinois, may be the real deal.

"Many relics that we study turn out to date to a period somewhat later than the historic attestation would suggest," said archaeological scientist Tom Higham back in 2017. "This bone fragment, in contrast, suggests that we could possibly be looking at remains from St Nicholas himself."

Now, you might point out that having his skeleton living in a church on the outskirts of Chicago is more an argument against Santa's existence, but consider this: nearly half a million hip replacements are performed every year in the United States alone, and most are on people aged 60 plus. Saint Nick, according to tradition, is over 1,750 years old, so it's not surprising in the least that the old fella might have had a bit of pelvis removed at some point.

Philosophy

Okay, so you still don't believe us. That's fine. Let's consider the alternative.

If Santa doesn't exist, that means there's a huge conspiracy that's being willfully upheld by billions of people across the globe. Parents lying to their children; hundreds of movies being made about the same imaginary man; heck, even NORAD is engaged in this gigantic lie surrounding a jolly fat man who gives presents on Christmas. Which, when you put it like that, isn't even that unbelievable a premise.

And to what end? All good conspiracies have an end goal; the CIA didn't pretend vampires were real just for fun, after all, they did it to stop the commies. What would be the point of postal workers across the world accepting mail to a mythical person (and sometimes even delivering replies)? What gain would researchers and news organizations get for contravening their scientific and journalistic ethics every year?

This is where the philosophical principle known as Ockham's Razor comes into play. In simple terms, this is the idea that we shouldn't make things more complicated than they need to be to explain something. For instance: you flip the light switch, and the light turns on. What's more likely to be true: that you flipping the switch turned it on, or that you flipping the switch set off a small alarm inside the wall, waking up a dormouse who runs up to the ceiling and opens a tiny chemistry lab, dons a tiny white coat, and starts mixing luminol with various substances which he then funnels down into the bulb in the light, thus illuminating the room?

So with that in mind, we ask you: what's more likely? That the whole world is engaged in a deception?

Or that Santa is, as we promised, real?

Thank You, Best Wishes and Happy Holidays | Office of the Chancellor – University of Nebraska-Lincoln

To our students, faculty and staff,

I want to thank you from the bottom of my heart for the incredible work you have done and the persistence you have shown to get to this point: the eve of finals week for fall of 2021. You continue to help us navigate this long-lasting pandemic and what have been especially difficult times, all the while keeping us focused on our mission.

As we approach the winter holidays, I am filled with gratitude for every Husker in our university community. Your work has led to big things in 2021.

To name just a few:

And we are poised to do even more in 2022. It is all because of our people and our commitment to each other, where every person and every interaction matters.

So, as we approach finals week, I want to wish you the very best! Enjoy a free cup of coffee in the union, study hard, work hard on those final projects, but get as much rest as you can and pace yourself. I know you will succeed.

That's what Huskers do.

And when this week is over and grades are completed, and graduation is ahead, I encourage you to find ways to wind down and take care of yourselves. I hope you enjoy a peaceful and joyful holiday season.

Happy Holidays, and Go Big Red!

See the original post:

Thank You, Best Wishes and Happy Holidays | Office of the Chancellor - University of Nebraska-Lincoln

Read More..

Data Mining or Bust: How New Data Notebooks Enhance Contact Center Intelligence Aggregation and Analysis to Elevate the Customer Experience -…

Managing the modern contact center means extracting the most up-to-date metrics and data so companies can ensure success and enhance efficiency. Many companies still use dashboards for visibility purposes, and they are valuable tools for presenting high-level data, but they don't offer the usability and detailed data needed to make strategic decisions for a business's future. The best way to enhance data transparency is by utilizing dashboards equipped with data notebook capabilities.

What is a data notebook? Simply put, it gives companies the capability to transform essential data into actionable insights. Notebook technology was created with non-technical employees in mind: users can ask questions about the data directly and receive the answer as a graphic visualization. Data scientists build notebooks, but contact center leaders can tailor them to their specific needs for interpreting critical data. Additionally, data scientists connect data notebooks directly to data subsets, so company leaders can treat them as trustworthy. Companies can rest assured knowing that data scientists have cleaned and validated the data, making it easily digestible for users.
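To make that concrete, here is a minimal, notebook-style sketch of the kind of cell a contact center analyst might run in Python. The file name and column names (channel, handle_time_sec) are hypothetical placeholders invented for illustration, not features of any particular product.

```python
# A notebook-style cell: load pre-validated call data and answer one question visually.
# The file name and column names below are hypothetical stand-ins.
import pandas as pd
import matplotlib.pyplot as plt

calls = pd.read_csv("validated_call_metrics.csv")  # data already cleaned by the data science team

# Question: which channel has the longest average handle time?
avg_handle = calls.groupby("channel")["handle_time_sec"].mean().sort_values()

avg_handle.plot(kind="barh", title="Average handle time by channel (seconds)")
plt.tight_layout()
plt.show()
```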

Now, let's look at why these capabilities are essential for contact centers. Let's be honest: a massive amount of valuable data is generated from the thousands of experiences clients have each day. A company can have anywhere from 25,000 to 40,000 agents working in multiple contact centers. While the global consensus is that companies are investing in automation and chatbots, conventional forms of communication between contact center agents and customers still reign supreme, with 81.5% of total call center inbound interactions coming through email and the phone. Contact centers need to incorporate new cloud-based solutions that analyze and aggregate customer data to gain insights into how agents can provide the best customer experience.

The reality is that having data is useless without the proper techniques and capabilities that provide contact centers with actionable insights. The success of contact centers relies on how they process and interpret data. Contact centers can reap the benefits of several capabilities when using the data presented by new data notebooks.

Simplified Usability and Dynamic Interactivity

Data notebooks are living files, allowing users to incorporate their graphic elements into the file to answer questions within the company. Notebooks also have collaboration tools so various people can share the data analysis. Users can incorporate text and multimedia files into the notebook and have discussions with colleagues about how they can turn the data into actionable insights.

Improved Agent-to-Customer Experience

Contact center supervisors can use the metrics provided by new data notebooks to monitor operations remotely and in person to assess what needs to be improved. This includes tracking agent behavior and interaction with customers to ensure the end user receives quality service. This capability paves the way for better visibility into opportunities for the company and agents to serve customers better, while reviewing areas that may need adjustment for more efficiency. As a result, companies can expect decreased call wait times, reduced call abandon rates, enhanced call quality, regained customer loyalty, and future-proofed functionality.

Enhanced Visibility

A central component of the modern contact center is having the digital tools and dashboard visualizations available to provide insight into a customer's journey with the contact center. Simply put, agents can see where the customer's question or concern originated (via email, social media message, or phone call), and they can view past communications with the customer so previous matters can be addressed if necessary without having to rediscuss an issue. Better visibility simplifies the interpretation of data and streamlines decision-making to strengthen day-to-day workflows.

As businesses across the nation continue to rely predominantly on technology, they must be prepared to respond to customer service queries in a streamlined fashion by utilizing the necessary tools to extract valuable data. The technological capabilities available to contact centers aid in shaping the department's financial success and reputation as the substantial need for a positive customer experience continues to grow.

About the Author

Tim Eyre is the Chief Marketing Officer at Aceyus. As CMO, his key focus is to represent the voice of the Aceyus customer, identify hidden opportunities for new messaging, and lead internal Product, Sales and Marketing teams in providing actionable solutions for Aceyus customers. A former VP of Red Ventures and Workfront, Tim has a proven track record in creating innovative marketing strategies that fuel new business growth for B2B SaaS companies, making him uniquely suited to drive customer acquisition and retention strategy for Aceyus. His success in the B2B marketing space earned him the CXO of the Year Award in 2016, highlighting his ability to propel organizations forward in engagement, loyalty, and new business growth.

Sign up for the free insideBIGDATA newsletter.

Join us on Twitter: @InsideBigData1 (https://twitter.com/InsideBigData1)

Read the original here:

Data Mining or Bust: How New Data Notebooks Enhance Contact Center Intelligence Aggregation and Analysis to Elevate the Customer Experience -...

Read More..

10 Data Science Terms Every Analyst Needs to Know – Built In

Data science is one of the fields that can be overwhelming for newcomers. The term data science itself can be confusing because it's an umbrella term that covers many subfields: machine learning, artificial intelligence, natural language processing, data mining, and the list goes on.

Within each of these subfields, we have a plethora of terminology and industry jargon that overwhelms newcomers and discourages them from pursuing a career in data science.

When I first joined the field, I had to juggle learning the techniques and getting up to date with the research and advancements in the field, all while trying to understand the lingo. Here are ten foundational terms every data scientist needs to know to build and develop any data science project.

More Data Science Career Development: 4 Types of Projects You Need in Your Data Science Portfolio

One of the most important terms in data science you'll hear quite often is model: model training, improving model efficiency, model behavior, etc. But what is a model?

Mathematically speaking, a model is a specification of some probabilistic relationship between different variables. In layperson's terms, a model is a way of describing how two variables behave together.

Since the term modeling can be vague, statistical modeling is often used to describe modeling done by data scientists specifically.
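To make the definition concrete, here is a minimal sketch of a statistical model: describing how two variables behave together with a straight line. The numbers are invented for illustration.

```python
# A minimal "model": describe how two variables behave together with a straight line.
import numpy as np

hours_studied = np.array([1, 2, 3, 4, 5, 6], dtype=float)
exam_score = np.array([52, 58, 61, 68, 72, 79], dtype=float)  # illustrative numbers

# Fit score = slope * hours + intercept (ordinary least squares via polyfit).
slope, intercept = np.polyfit(hours_studied, exam_score, deg=1)
print(f"score = {slope:.1f} * hours + {intercept:.1f}")
```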

Another way to describe models is how well they fit the data to which you apply them.

Overfitting happens when your model considers too much information about the data. So, you end up with an overly complex model that fits the training data too closely and is difficult to apply to data beyond the set it was trained on.

More on Modeling: A Primer on Model Fitting

Underfitting (the opposite of overfitting) happens when the model doesn't have enough information about the data. In either case, you end up with a poorly fitted model.

One of the skills you will need to learn as a data scientist is how to find the middle ground between overfitting and underfitting.
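One way to see that middle ground is to fit the same noisy data with models of increasing complexity. The sketch below uses polynomial degree as a stand-in for complexity, on made-up data.

```python
# Under-, well-, and overfitting on the same noisy data, varying model complexity.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, size=x.shape)  # noisy sine wave

for degree in (1, 4, 15):            # too simple, about right, too flexible
    coeffs = np.polyfit(x, y, degree)
    train_error = np.mean((np.polyval(coeffs, x) - y) ** 2)
    print(f"degree {degree:2d}: training MSE = {train_error:.3f}")

# The degree-15 fit has the lowest training error, but it tracks the noise rather
# than the underlying curve: the hallmark of overfitting.
```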

Cross-validation is a way to evaluate a model's behavior when you ask it to learn from a data set that's different from the training data you used to build the model. This is a big concern for data scientists because your model will often have good results on the training data but end up with too much noise when applied to real-life data.

There are different ways to apply cross-validation to a model; the three main strategies are listed below, followed by a short k-fold sketch in code:

The holdout method: training data is divided into two sections, one to build the model and one to test it.

The k-fold validation: an improvement on the holdout method. Instead of dividing the data into two sections, you'll divide it into k sections to reach higher accuracy.

The leave-one-out cross-validation: the extreme case of the k-fold validation. Here, k will be the same as the number of data points in the data set you're using.
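Here is a minimal k-fold sketch using scikit-learn. The diabetes dataset and the R-squared scoring metric are just convenient stand-ins, not choices from the original article.

```python
# 5-fold cross-validation: train on 4 folds, test on the held-out fold, repeat.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score

X, y = load_diabetes(return_X_y=True)
model = LinearRegression()

kfold = KFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=kfold, scoring="r2")
print("per-fold R^2:", np.round(scores, 3), "mean:", scores.mean().round(3))
```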

Want More? We Got You. Model Validation and Testing: A Step-by-Step Guide

Regression is a machine learning term for the simplest, most basic supervised machine learning approach. In regression problems, you often have two kinds of values: a target value (also called the criterion variable) and other values, known as the predictors.

For example, we can look at the job market. How easy or difficult it is to get a job (criterion variable) depends on the demand for the position and the supply for it (predictors).

There are different types of regression to match different applications; the easiest ones are linear and logistic regressions.
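To ground the job-market example, here is a small sketch with invented numbers: linear regression predicts a continuous value (months to get hired), while logistic regression predicts a class (offer or no offer). The predictors and targets below are made up for illustration.

```python
# Linear regression predicts a number; logistic regression predicts a class probability.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Toy predictors: [demand_for_role, supply_of_candidates] -- illustrative numbers only.
X = np.array([[80, 20], [60, 40], [50, 50], [30, 70], [20, 90]], dtype=float)

months_to_hire = np.array([1.0, 2.0, 3.5, 6.0, 9.0])       # continuous target
linreg = LinearRegression().fit(X, months_to_hire)
print("predicted months to hire:", linreg.predict([[70, 30]]).round(1))

got_offer = np.array([1, 1, 1, 0, 0])                       # binary target
logreg = LogisticRegression().fit(X, got_offer)
print("probability of an offer:", logreg.predict_proba([[70, 30]])[0, 1].round(2))
```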

Parameter can be confusing because it has slightly different meanings based on the scope in which you're using it. For example, in statistics, a parameter describes a probability distribution's different properties (e.g., its shape, scale). In data science or machine learning, we often use parameters to describe the precision of system components.

In machine learning, there are two types of models: parametric and nonparametric models.

Parametric models have a set number of parameters (features) unaffected by the number of training examples. Linear regression is considered a parametric model.

Nonparametric models don't have a set number of features, so the technique's complexity grows with the number of training examples. The most well-known example of a nonparametric model is the KNN algorithm.
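A quick sketch of the contrast, on invented data: the fitted line stores just a coefficient and an intercept, while KNN keeps the training examples around and consults its nearest neighbors at prediction time.

```python
# Parametric vs. nonparametric: a line keeps two numbers; KNN keeps the training set.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor

X = np.linspace(0, 10, 50).reshape(-1, 1)
y = 3.0 * X.ravel() + np.random.default_rng(1).normal(0, 1, 50)  # synthetic data

linear = LinearRegression().fit(X, y)
print("parametric model stores:", linear.coef_, linear.intercept_)  # fixed-size parameters

knn = KNeighborsRegressor(n_neighbors=5).fit(X, y)
# KNN has no fixed coefficients; predictions come from the stored training examples,
# so its effective complexity grows with the amount of data it has seen.
print("nonparametric prediction at x=4.2:", knn.predict([[4.2]]).round(2))
```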

In data science, we use bias to refer to an error in the data. Bias occurs in the data as a result of sampling and estimation. When we choose some data to analyze, we often sample from a large data pool. The sample you select could be biased; that is, it could be an inaccurate representation of the pool.

Since the model we're training only knows the data we give it, the model will learn only what it can see. That's why data scientists need to be careful to create unbiased models.
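Here is a toy illustration of sampling bias on simulated data: a random sample tracks the pool's true mean, while a deliberately skewed sample misses it, and any model trained on the skewed sample would inherit that error. All numbers are synthetic.

```python
# Sampling bias: a non-random sample can misrepresent the pool it came from.
import numpy as np

rng = np.random.default_rng(42)
population = rng.normal(loc=50, scale=15, size=100_000)    # the full "data pool"

random_sample = rng.choice(population, size=500)           # unbiased sampling
biased_sample = np.sort(population)[-500:]                 # only the largest values

print("population mean:   ", population.mean().round(1))
print("random-sample mean:", random_sample.mean().round(1))  # close to the truth
print("biased-sample mean:", biased_sample.mean().round(1))  # far off: the bias a model would learn
```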

Want More on Bias? There's an Article for That: An Introduction to Bias-Variance Tradeoff

In general, we use correlation to refer to the degree of occurrence between two or more events. For example, if depression cases increase in cold weather areas, there might be some correlation between cold weather and depression.

Often, events correlate to different degrees. For example, following a recipe and getting a delicious dish may be more strongly correlated than cold weather and depression. The number that measures this degree is the correlation coefficient.

When the correlation coefficient is one, the two events in question are strongly correlated, whereas if it is, let's say, 0.2, then the events are weakly correlated. The coefficient can also be negative. In that case, there is an inverse relationship between two events. For example, if you eat well, your chances of becoming obese will decrease. There's an inverse relationship between eating a well-balanced diet and obesity.

Finally, you must always remember the axiom of all data scientists: correlation doesn't equal causation.
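As a small, invented example of the coefficient itself, the snippet below measures the inverse relationship between outdoor temperature and a heating bill.

```python
# Pearson correlation coefficient between two variables, ranging from -1 to 1.
import numpy as np

temperature_c = np.array([-5, 0, 5, 10, 15, 20, 25], dtype=float)
heating_bill = np.array([220, 190, 160, 120, 90, 70, 50], dtype=float)  # illustrative numbers

r = np.corrcoef(temperature_c, heating_bill)[0, 1]
print(f"correlation coefficient: {r:.2f}")  # strongly negative: an inverse relationship
```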

You Get Some Data Science, and YOU Get Some Data Science! The Poisson Process and Poisson Distribution, Explained

A hypothesis, in general, is an explanation for some event. Often, hypotheses are made based on previous data and observations. A valid hypothesis is one you can test with results, either true or false.

In statistics, a hypothesis must be falsifiable. In other words, we should be able to test any hypothesis to determine whether it's valid or not. In machine learning, the term hypothesis refers to candidate models we can use to map the model's inputs to the correct and valid output.
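Here is a sketch of a falsifiable hypothesis in practice: testing "these two groups have the same mean" on simulated data with a two-sample t-test. The groups and effect size are invented.

```python
# Testing a falsifiable hypothesis: "group A and group B have the same mean."
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
group_a = rng.normal(loc=100, scale=10, size=200)   # simulated observations
group_b = rng.normal(loc=103, scale=10, size=200)

t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value is evidence against the hypothesis; a large one means we cannot reject it.
```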

Outlier is a term used in data science and statistics to refer to an observation that lies an unusual distance from other values in the data set. The first thing every data scientist should do when given a data set is decide what counts as a usual distance and what counts as unusual.

Dig in to Distributions: 4 Probability Distributions Every Data Scientist Needs

An outlier can represent different things in the data; it could be noise that occurred during the collection of the data or a way to spot rare events and unique patterns. That's why outliers shouldn't be deleted right away. Instead, make sure you always investigate your outliers like the good data scientist you are.
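One simple, common way to flag candidates for that investigation is the 1.5 * IQR rule, sketched below on invented numbers; flagged points are surfaced for review, not deleted.

```python
# Flag outliers with the common 1.5 * IQR rule, then inspect them rather than delete them.
import numpy as np

values = np.array([12, 13, 12, 14, 13, 15, 14, 13, 98, 12, 11, 14], dtype=float)

q1, q3 = np.percentile(values, [25, 75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = values[(values < lower) | (values > upper)]
print("flagged for investigation:", outliers)   # here: the lone 98
```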

This article was originally published on Towards Data Science.

Here is the original post:

10 Data Science Terms Every Analyst Needs to Know - Built In

Read More..