
The race toward a new computing technology is heating up and Asia is jumping on the trend – CNBC

A quantum computer in a vibration-free building. Quantum computing will ultimately speed up the computational power that drives many industries and could affect everything from drug discovery to how data is secured.

Oliver Berg | Picture Alliance | Getty Images

Quantum computing was already gathering pace in Japan and elsewhere in Asia when the University of Tokyo and IBM launched their new quantum computer last year.

The computer was the second such system built outside the United States by IBM, the latest in a string of key moves in quantum research.

The university and IBM have led the Quantum Innovation Initiative Consortium alongside heavyweights of Japanese industry like Toyota and Sony, all with a view to nailing the quantum question.

Quantum computing refers to the use of quantum mechanics to run calculations. Unlike the binary bits that power traditional computing, quantum bits allow a quantum computer to run multiple processes at once.

The new technology will ultimately speed up the computational power that drives many industries and could affect everything from drug discovery to how data is secured. Several countries are racing to get quantum computers fully operational.

Christopher Savoie, CEO of quantum computing firm Zapata, who spent much of his career in Japan, said technological development has been very U.S.-centric. But now, Asian nations don't want to be left behind on quantum computing, he added.

"Nation states like India, Japan and China are very much interested in not being the only folks without a capability there. They don't want to see the kind of hegemony that's arisen where the large cloud aggregators by and large are only US companies," Savoie said, referring to the likes of Amazon Web Services and Microsoft Azure.

China, for example, has committed a great deal of brainpower to the quantum race. Researchers have touted breakthroughs and debates are simmering over whether China has surpassed the U.S. on some fronts.

India, for its part, announced plans earlier this year to invest $1 billion in a five-year plan to develop a quantum computer in the country.

James Sanders, an analyst at S&P Global Market Intelligence, told CNBC that governments around the world have been taking more interest in quantum computing in recent years.

In March, Sanders published a report that found governments have pledged around $4.2 billion to support quantum research. Some notable examples include South Korea's $40 million investment in the field and Singapore's Ministry of Education's funding of a research center, The Center for Quantum Technologies.

All of these efforts have a long lens on the future. And for some, the benefits of quantum can seem nebulous.

According to Sanders, the benefits of quantum computing aren't going to be immediately evident for everyday consumers.

What is likely to happen is that quantum computers will wind up utilized in designing products that consumers eventually buy.

James Sanders

analyst, S&P Global Market Intelligence

"On a bad day, I'm talking people down from the idea of quantum cell phones. That's not realistic, that's not going to be a thing," he said.

"What is likely to happen is that quantum computers will wind up utilized in designing products that consumers eventually buy."

There are two major areas where quantum's breakthrough will be felt: industry and defense.

A staff member of tech company Q.ant puts a chip for quantum computing in a test station in Stuttgart, Germany, on Sept. 14, 2021. It's expected that the power of quantum computing will be able to decrypt RSA encryption, one of the most common encryption methods for securing data.

Thomas Kienzle | Afp | Getty Images

"Areas where you have HPC [high-performance computing] are areas where we will be seeing quantum computers having an impact. It's things like material simulation, aerodynamic simulation, these kinds of things, very high, difficult computational problems, and then machine learning artificial intelligence," Savoie said.

In pharmaceuticals, traditional systems for calculating the behavior of drug molecules can be time-consuming. The speed of quantum computing could rapidly increase these processes around drug discovery and, ultimately, the timeline for drugs coming to market.

On the flip side, quantum could present security challenges. As computing power advances, so too does the risk to existing security methods.

"The longer-term [motivation] but the one that that everyone recognizes as an existential threat, both offensively and defensively, is the cryptography area. RSA will be eventually compromised by this," Savoie added.

RSA refers to one of the most common encryption methods for securing data, developed in 1977, that could be upended by quantum's speed. It is named after its inventors Ron Rivest, Adi Shamir and Leonard Adleman.

You're seeing a lot of interest from governments and communities that don't want to be the last people on the block to have that technology because [other nations] will be able to decrypt our messages.

Christopher Savoie

CEO of Zapata

"You're seeing a lot of interest from governments and communities that don't want to be the last people on the block to have that technology because [other nations] will be able to decrypt our messages," Savoie said.

Magda Lilia Chelly, chief information security officer at Singaporean cybersecurity firm Responsible Cyber, told CNBC that there needs to be a twin track of encryption and quantum research and development so that security isn't outpaced.

"Some experts believe that quantum computers will eventually be able to break all forms of encryption, while others believe that new and more sophisticated forms of encryption will be developed that cannot be broken by quantum computers," Chelly said.

A quantum processor on a prototype of a quantum computer. There needs to be a twin track of encryption and quantum research and development so that security isn't outpaced, said Magda Lilia Chelly, chief information security officer at Singaporean cybersecurity firm Responsible Cyber.

Julian Stratenschulte/dpa | Picture Alliance | Getty Images

"In particular, [researchers] have been looking at ways to use quantum computers to factor large numbers quickly. This is important because many of the modern encryption schemes used today rely on the fact that it is very difficult to factor large numbers," she added.

If successful, this would break most current encryption schemes, making it possible to unlock messages that were thought to be secure.
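
To see why factoring matters, consider a toy Python sketch of RSA with deliberately tiny primes (real keys use primes hundreds of digits long; every number below is illustrative only). The private key is derived from the two secret prime factors of the public modulus, so anyone who can factor that modulus can rebuild the key:

```python
# Toy RSA: tiny primes chosen for readability, never for real security.
p, q = 61, 53
n = p * q                    # public modulus (3233); its factors stay secret
phi = (p - 1) * (q - 1)      # 3120, computable only if you know p and q
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent (2753), Python 3.8+

message = 42
cipher = pow(message, e, n)  # anyone can encrypt with the public key (e, n)

def factor(m):
    """Brute-force factoring: instant for 3233, infeasible classically for
    real key sizes -- which is exactly what a quantum factoring algorithm
    like Shor's would change."""
    f = 2
    while f * f <= m:
        if m % f == 0:
            return f, m // f
        f += 1
    return m, 1

p2, q2 = factor(n)                    # the attacker factors n...
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))  # ...rebuilds the private exponent...
print(pow(cipher, d2, n))             # ...and recovers the message: 42
```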

Sanders said the development and eventual commercialization of quantum computing will not be a straight line.

Issues like the threat to encryption can garner attention from governments, but research and breakthroughs, as well as mainstream interest, can be "stop-start," he said.

Progress can also be affected by the fluctuating interest of private investors, as quantum computing won't deliver a quick return on investment.

"There are a lot of situations in this industry where you might have a lead for a week and then another company will come out with another type of the advancement and then everything will go quiet for a little bit."

Another looming challenge for quantum research is finding the right talent with specific skills for this research.

"Quantum scientists that can do quantum computing don't grow on trees," Savoie said, adding that cross-border collaboration is necessary in the face of competing government interests.

"Talent is global. People don't get to choose what country they're born in or what nationality they have."

Read more:
The race toward a new computing technology is heating up and Asia is jumping on the trend - CNBC


Quantum Computing Inc. Unveils Software Breakthrough That Amplifies Quantum Computer Processing Power By Up to 20x – Yahoo Finance

Quantum Computing Inc.

QAmplify Maximizes End-User Investment in Quantum Computing

LEESBURG, Va., June 07, 2022 (GLOBE NEWSWIRE) -- Quantum Computing Inc. ("QCI" or the "Company") (NASDAQ: QUBT), a leader in accessible quantum computing, today unveiled QAmplify, a suite of quantum software technologies that expands the processing power of any current quantum computer by as much as 20x. QAmplify is capable of supercharging any quantum computer to solve realistic, real-world business problems today. The Company is actively working with customers and partners in scaling the amplification capabilities of its ready-to-run Qatalyst software, which is designed to eliminate the need for complex quantum programming and runs seamlessly across a variety of quantum computers. QCI has filed for patents on QAmplify technology.

Currently there are two primary technology approaches that deliver a wide range of capabilities spanning the current Quantum Processing Unit (QPU) hardware landscape: gate-model (e.g., IBM, IonQ, Rigetti, OQC) and annealing (e.g., D-Wave) quantum computers. Both are limited in the size of problems (i.e., number of variables and complexity of computations) they can process. For example, gate models can typically process from 10-120 data variables, and annealing machines can process approximately 400 variables in a simple problem set. These small problem sets restrict the size of the problems that can be solved by today's QPUs, limiting businesses' ability to explore the value of quantum computing.

QCI's patent-pending QAmplify suite of powerful QPU-expansion software technologies overcomes these challenges, dramatically increasing the problem set size that each can process. The QAmplify gate-model expansion's demonstrated capabilities have been benchmarked at a 500% (5x) increase, and the annealing expansion has been benchmarked at up to a 2,000% (20x) increase.

QAmplify maximizes end-user investment in current QPUs by allowing quantum users to move from science experiments to solving real-world problems without waiting for the quantum hardware industry to catch up. For example, in terms of real-world applications, this means that an IBM quantum computer with QAmplify could solve a problem with over 600 variables, versus the current limit of 127 variables. A D-Wave annealing computer with QAmplify could solve an optimization with over 4,000 variables, versus the current limit of 200 for a dense matrix problem set.
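
As a quick sanity check on those figures (taking the press release's stated limits and multipliers at face value; these are its claims, not independently verified numbers), the arithmetic lines up:

```python
# Press-release figures only: claimed base limits and amplification factors.
gate_limit, gate_boost = 127, 5        # IBM gate-model variables, 5x claim
anneal_limit, anneal_boost = 200, 20   # D-Wave dense-matrix variables, 20x claim

print(gate_limit * gate_boost)      # 635  -> consistent with "over 600 variables"
print(anneal_limit * anneal_boost)  # 4000 -> consistent with "over 4,000 variables"
```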


"It is central to QCI's mission to deliver practical and sustainable value to the quantum computing industry," said William McGann, Chief Operating and Technology Officer of QCI. "QCI's innovative software solutions deliver expansive compute capabilities for today's state-of-the-art QPU systems and offer great future scalability as those technologies continually advance. The use of our QAmplify algorithm in the 2021 BMW Group Quantum Computing Challenge for vehicle sensor optimization provided proof of performance by expanding the effective capability of the annealer by 20-fold, to 2,888 qubits."

To learn more about QCI and how Qatalyst can deliver results for your business today, visit http://www.quantumcomputinginc.com.

About Quantum Computing Inc.

Quantum Computing Inc. (QCI) (NASDAQ: QUBT) is a full-spectrum quantum software and hardware company on a mission to accelerate the value of quantum computing for real-world business solutions. The company recently announced its intent to acquire QPhoton, a quantum photonics innovation company that has developed a series of quantum photonic systems (QPS). The combination of QCI's flagship ready-to-run software product, Qatalyst, with QPhoton's QPS, sets QCI on a path to delivering a broadly accessible and affordable full-stack quantum solution that can be used by non-quantum experts, anywhere, for real-world industry applications. QCI's expert team in finance, computing, security, mathematics and physics has over a century of experience with complex technologies: from leading-edge supercomputing, to massively parallel programming, to the security that protects nations. Connect with QCI on LinkedIn and @QciQuantum on Twitter. For more information about QCI, visit http://www.quantumcomputinginc.com.

Important Cautions Regarding Forward-Looking Statements

This press release contains forward-looking statements as defined within Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. By their nature, forward-looking statements and forecasts involve risks and uncertainties because they relate to events and depend on circumstances that will occur in the near future. Those statements include statements regarding the intent, belief or current expectations of Quantum Computing Inc. (the "Company"), and members of its management as well as the assumptions on which such statements are based. Prospective investors are cautioned that any such forward-looking statements are not guarantees of future performance and involve risks and uncertainties, and that actual results may differ materially from those contemplated by such forward-looking statements.

Statements in this press release that are not descriptions of historical facts are forward-looking statements relating to future events, and as such all forward-looking statements are made pursuant to the Securities Litigation Reform Act of 1995. Statements may contain certain forward-looking statements pertaining to future anticipated or projected plans, performance and developments, as well as other statements relating to future operations and results. Any statements in this press release that are not statements of historical fact may be considered to be forward-looking statements. Words such as "may," "will," "expect," "believe," "anticipate," "estimate," "intends," "goal," "objective," "seek," "attempt," "aim to," or variations of these or similar words, identify forward-looking statements. Such statements include statements regarding the Company's ability to consummate its planned acquisition of QPhoton, the anticipated benefits of such acquisition, and the Company's ability to successfully develop, market and sell its products. Factors that could cause actual results to differ materially from those in the forward-looking statements contained in this press release include, but are not limited to, the parties' potential inability to consummate the proposed transaction, including as a result of a failure to satisfy closing conditions to the proposed transactions; risks that QPhoton will not be integrated successfully; failure to realize anticipated benefits of the combined operations; potential litigation relating to the proposed transaction and disruptions from the proposed transaction that could harm the Company's or QPhoton's business; ability to retain key personnel; the potential impact of announcement or consummation of the proposed transaction on relationships with third parties, including customers, employees and competitors; conditions in the capital markets; and those risks described in Item 1A in the Company's Annual Report on Form 10-K for the year ended December 31, 2021, which is expressly incorporated herein by reference, and other factors as may periodically be described in the Company's filings with the SEC. The Company undertakes no obligation to update or revise forward-looking statements to reflect changed conditions.

Qatalyst is the trademark of Quantum Computing Inc. All other trademarks are the property of their respective owners.

Company Contact:
Robert Liscouski, CEO
Quantum Computing, Inc.
+1 (703) 436-2161
Email Contact

Investor Relations Contact:
Ron Both or Grant Stude
CMA Investor Relations
+1 (949) 432-7566
Email Contact

Media Relations Contact:
Seth Menacker
Fusion Public Relations
+1 (201) 638-7561
qci@fusionpr.com

See the article here:
Quantum Computing Inc. Unveils Software Breakthrough That Amplifies Quantum Computer Processing Power By Up to 20x - Yahoo Finance


Microsoft aims to win the race to build a new kind of computer. So does Amazon – Finger Lakes Times

SEATTLE - The tech giants are locked in a race.

It might not end for another decade, and there might not be just one winner. But, at the finish line, the prize they promise is a speedy machine, a quantum computer, that will crack in minutes problems that can't be solved at all today. Builders describe revolutionary increases in computing power that will accelerate the development of artificial intelligence, help design new drugs and offer new solutions to help fight climate change.

Ready. Set. Quantum.

Relying on principles of physics and computer science, researchers are working to build a quantum computer, a machine that will go beyond the capabilities of the computers we use today by moving through information faster. Unlike the laptop screen we're used to, quantum computers display all their inner organs. Often cylindrical, the computers are an intimidating network of coils, plates, wires and bolts. And they're huge.

"We're talking about computing devices which are just unimaginable in terms of their power in what they can do," said Peter Chapman, president and CEO of IonQ, a startup in the race alongside tech giants Microsoft, Amazon, Google, IBM, Intel and Honeywell.

The companies are riding a swell of interest that could grow to $9.1 billion in revenue by 2030, according to Tractica, a market intelligence firm that studies new technologies and how humans interact with tech advancements.

Right now, each company is deciding how to structure the building blocks needed to create a quantum computer. Some rely on semiconductors, others on light. Still others, including Microsoft, have pinned their ambitions on previously unproven theories in physics.

"Bottom line, we are in very heavy experimentation mode in quantum computing, and it's fairly early days," said Chirag Dekate, who studies the industry for research firm Gartner. "We are in the 1950s state of classical computer hardware."

There's not likely to be a single moment when quantum computers start making the world-changing calculations technologists are looking forward to, said Peter McMahon, an engineering professor at Cornell University. Rather, "there's going to be a succession of milestones."

At each one, the company leading the race could change.

In October 2019, Google said it had reached "quantum supremacy," a milestone where one of its machines completed a calculation that would have taken today's most advanced computers 10,000 years. In October last year, startup IonQ went public with an initial public offering that valued the company at $2 billion. In November, IBM said it had also created a quantum processor big enough to bypass today's machines.

In March, it was Microsoft's turn.

After a false start that saw Microsoft retract some research, it said this spring it had proved the physics principles it needed to show that its theory for building a quantum computer was, in fact, possible.

"We expect to capitalize on this to do the almost unthinkable," Krysta Svore, an engineer who leads Microsoft's quantum program, said in a company post announcing the discovery. "It's never been done before. ... [Now] here's this ultimate validation that we're on the right path."

As envisioned by designers, a quantum computer uses subatomic particles like electrons instead of the streams of ones and zeros used by computers today. In doing so, a quantum computer can examine an unimaginable number of combinations of ones and zeros at once.

A quantum computer's big selling points are speed and multitasking, enabling it to solve complex problems that would trip up today's technology.

To understand the difference between classical computers (the computers we use today) and quantum computers (the computers researchers are working on), picture a maze.

Using a classical computer, you're inside the maze. You choose a path at random before realizing it's a dead end and circling back.

A quantum computer gives an aerial view of the maze, where the system can see several different paths at once and more quickly reach the exit.

"To solve the maze, maybe you have to go 1,000 times to find the right answer," said IonQ's Chapman. "In quantum computing, you get to test all these paths all at once."

Researchers imagine quantum computers being used by businesses, universities and other researchers, though some industry leaders also talk about quantum computing as a technology that will unlock new ideas our brains can't yet imagine. (It's not likely the average household will have a quantum computer room any time soon.)

Microsoft recently partnered with paints and coatings company AkzoNobel to create a "virtual laboratory" where it will test and develop sustainable products using quantum computing to overcome some of the constraints that jam up a traditional lab setting, like access to raw materials, lack of space and concerns about toxicity.

Goldman Sachs is working to use quantum computing to speed up risk evaluation done by Wall Street traders. Boeing wants to use the advanced tech to model how materials will react to different environments, while ExxonMobil has plans to use it to simulate the chemical properties of hydrogen, hoping to develop new materials that can be used to make renewable energy.

In the long run, companies are aiming for a "fault-tolerant" quantum computer that will keep operating correctly even if components go awry. To get there, researchers are focused on keeping one thing happy: the qubit.

The computers we use today to look up the best restaurants or check the weather rely on bits, a unit of information in the computing world that is usually a zero or a one. Quantum computers rely on qubits, short for quantum bits, a unit of quantum information that can be (confusingly) both zero and one at the same time.

In a classical computer, a bit flips between zero and one. In a quantum computer, a qubit can be in both states at once, allowing it to simultaneously evaluate different possibilities.

It helps to think about qubits like a spinning coin, said Jim Clarke, director of quantum hardware for Intel. (Clarke himself is so devoted to qubits he named his German shepherd after them.)

While a coin is spinning, it is briefly both heads and tails, before it lands on one side or the other. The electrons used to make quantum calculations in Intel's machines are mid-spin.
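
The spinning coin translates naturally into code: a simulated qubit is a two-component vector of amplitudes, and measurement is a random draw weighted by the squared amplitudes. This is a toy model for intuition, not a depiction of any vendor's hardware:

```python
import numpy as np

# An equal superposition: amplitudes for |0> and |1>, the coin mid-spin.
qubit = np.array([1.0, 1.0]) / np.sqrt(2)

probs = np.abs(qubit) ** 2             # Born rule: [0.5, 0.5]
rng = np.random.default_rng()
outcome = rng.choice([0, 1], p=probs)  # measurement: the coin lands
print(f"P(0)={probs[0]:.2f}  P(1)={probs[1]:.2f}  measured: {outcome}")
```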

But qubits are easily disturbed by pretty much anything, including light, noise and temperature changes. "Qubits are notoriously fickle," said Chapman from IonQ. "They are the introverts of the world."

If a qubit gets too bothered, it will lose the information it is carrying, making the computer's calculations less reliable.

When computer scientists, physicists and engineers think about their quantum strategy, a lot of the discussion revolves around the best way to keep those qubits comfortable. That discussion then sparks another: What is the best way to build a qubit?

Intel is using semiconductors. Google, IBM and Amazon Web Services are using superconductors. IonQ is taking an approach that puts atoms in a vacuum-sealed chamber to create something called "trapped-ion" qubits. Other companies are using light.

Microsoft is aiming to create something new. It's taking a physics-based approach to create what it calls "topological qubits." In March, it said it got one step closer by successfully demonstrating the physics behind its qubit philosophy.

But it has said that before. In 2018, a team of Microsoft-led researchers published a paper that said it had found evidence of the type of physics it was looking to prove. Last year, the group retracted the paper, writing it could "no longer claim the observation."

Since then, the Microsoft team developed a new protocol meant to "screen out false positives," said Svore, who is working on the quantum project at Microsoft's Redmond headquarters. "We are more confident than ever in our approach."

"Just like I can't prove the sun comes up tomorrow," Microsoft can't prove it can create the qubits it is hoping for, she said. But, "We've now demonstrated on multiple devices that the physics is here."

Though a competitive race, there may be more than one prize.

"All the technologies have advantages and disadvantages," said Fred Chong, a computer science professor at the University of Chicago. "A lot of these things are still evolving. Some of the technologies are good for the near-to-medium term, some of them are a little bit more in the future, some of them are very far in the future."

Determining the shortest route to get from Seattle to Portland might best be solved by one approach, while speeding up a chemical reaction might call for something different.

Most of the companies in the race today will develop "fairly credible quantum machines," Chong said, and customers will look for ways to "take advantage of their strengths and mitigate their weaknesses."

In the meantime, Amazon, Google and Microsoft are hosting quantum technology from their competitors, alongside their own, hoping to let customers play around with the tech and come up with uses that haven't yet been imagined. In the same way companies can buy cloud space and digital infrastructure technology from Amazon Web Services or Google Cloud, the tech companies now offer customers pay-as-you-go quantum computing.

"At this stage of the tech, it is important to explore different types of quantum computers," said Nadia Carlsten, former head of product at the AWS Center for Quantum Computing. "It's not clear which computer will be the best of all applicants. It's actually very likely there won't be one that's best."

Dekate, who analyzes the quantum industry for research and consulting firm Gartner, says quantum may have reached the peak of its "hype cycle."

Excitement and funding for the quantum industry have been building, he said, pointing to a rising slope on a line graph. Now, it could be at a turning point, he continued, pointing to the spot right before the line graph takes a nosedive.

The hype cycle is a five-phase model Gartner uses to analyze new technologies, as a way to help companies and investors decide when to get on board and when to cash out. It takes three to five years to complete the cycle, if a new tech makes it through.

Predictive analytics made it to phase five, where users see real-world benefits. Autonomous vehicles are in phase three, where the original excitement wears off and early adopters are running into problems. Quantum computing is in phase two, the peak of expectations, Dekate said.

"For every industry to advance, there needs to be hype. That inspires investment," he said. "What happens in these ecosystems is end-users [like businesses and other enterprises] get carried away by extreme hype."

Some quantum companies are nearing the deadlines they originally set for themselves, while others have already passed theirs. The technology is still at least 10 years away from producing the results businesses are looking for, Dekate estimates. And investors are realizing they won't see profits anytime soon.

In the next phase of the hype cycle, Dekate predicts private investment in quantum computing will go down, public investment will go up in an attempt to make up the difference, and companies that have made promises they can no longer keep will be caught flat-footed. Mergers, consolidation and bankruptcy are likely, he said.

"The kind of macroeconomic dynamics that we're about to enter into, I think means some of these companies might not be able to survive," Dekate said. "The ecosystem is ripe for disruption: way too much fragmentation and companies overpromising and not delivering."

In other words, we could be headed toward a "quantum winter."

But, even during the funding freeze, businesses are increasingly looking for ways to use quantum computing, preparing for when the technology is ready, Dekate said. While Amazon, Microsoft, Google and others are developing their quantum computers, companies like BMW, JPMorgan Chase, Goldman Sachs and Boeing are writing their list of problems for the computer to one day solve.

The real changes will come when that loop closes, Dekate said, when the tech is ready and the questions are laid out.

"At some point down the line, the classical [computing] approaches are going to stall, and are going to run into natural limitations," he said. Until then, "quantum computing will elicit excitement and, at the same time, disappointment."

See original here:
Microsoft aims to win the race to build a new kind of computer. So does Amazon - Finger Lakes Times


Quantum information was teleported over a network for the first time – Syfy

When Heroes (now streaming on Peacock!) hit the airwaves in September of 2006, few characters were as immediately beloved as the appropriately named Hiro Nakamura. Granted the ability to manipulate space-time, Hiro could not only slow down, speed up, and stop time, he could also teleport from one place to another. That's a useful skill if you need to get to a specific point in time and space to fight an evil brain surgeon or prevent the end of the world. It's also useful if you want to build the quantum internet.

Researchers at QuTech, a collaboration between Delft University of Technology and the Netherlands Organization for Applied Scientific Research, recently took a big step toward making that a reality. For the first time, they succeeded in sending quantum information between non-adjacent qubits on a rudimentary network. Their findings were published in the journal Nature.

While modern computers use bits, zeroes and ones, to encode information, quantum computers use quantum bits, or qubits. A qubit works in much the same way as a bit, except it's able to hold both a 0 and a 1 at the same time, allowing for faster and more powerful computation. The trouble begins when you want to transmit that information to another location. Quantum computing has a communications problem.

Today, if you want to send information to another computer on a network, that's largely accomplished using light through fiber optic cables. The information from qubits can be transmitted the same way but only reliably over short distances. Fiber optic networks have a relatively high rate of loss and rely on cloning bits and boosting their signal in order to transmit over significant distances. Qubits, however, can't be copied or boosted. That means that when and if information is lost, it's lost for good, and the longer the journey the more likely that is to happen.

That's where Hiro Nakamura comes in, or at least his quantum counterpart. In order to reliably transmit quantum data, scientists use quantum teleportation, a phenomenon that relies on entanglement, or what Einstein called "spooky action at a distance."

As with all things quantum, understanding entanglement isn't the easiest endeavor but, for our purposes, we'll simplify. When two particles are entangled, they share a connection, regardless of the physical distance between them. By knowing the state of one entangled particle, you can instantly know the state of the other, even if it's out of view. It's sort of like making two people share a single pair of shoes. If you know the first person is in possession of the right shoe, then you know the second person has the left.
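
The shared pair of shoes can even be sketched numerically. In this toy Python simulation (ordinary classical sampling standing in for real entangled hardware), measuring a Bell state always yields matching outcomes for the two particles:

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2): only outcomes 00 and 11 are possible.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(bell) ** 2              # [0.5, 0, 0, 0.5]

rng = np.random.default_rng(0)
for s in rng.choice(4, size=5, p=probs):
    a, b = (s >> 1) & 1, s & 1         # each particle's measurement
    print(f"particle A: {a}   particle B: {b}")  # always equal, like the shoes
```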

Using that spooky connection, scientists can transmit information between the two particles, and that information appears at one particle and vanishes at the other instantly. That's where the analogy to teleportation comes in. First, it's here, then it's there, without the need for a journey along cables. Importantly, only information is transferred, not any physical matter. Our teleportation technologies aren't at BrundleFly levels just yet.

Quantum teleportation isn't exactly new. It's been done before, but always between two directly connected entangled particles. In communications parlance, it's the quantum equivalent of talking to your friend in the next room using two cans connected by a string. In order to create a true quantum network, we need to be able to transmit data between non-adjacent nodes using intermediaries.

In this case, researchers wanted to transfer information between nodes named Alice and Charlie, using Bob as a go-between. To make that happen, Bob created an entangled state with Alice and stored his portion of the entanglement in a bit of quantum memory. Next, Bob repeats that process with Charlie. Then, using what researchers at QuTech describe as "quantum mechanical sleight of hand," Bob completes a measurement and passes on the entanglement between Alice and Charlie.

Once that's done, Charlie prepares the information he wants to send and completes a complicated measurement between his message and his half of the entanglement with Alice. Quantum mechanics goes to work, and the information vanishes on Charlie's end and appears on Alice's.
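
For readers who want the mechanics made concrete, here is a minimal numpy simulation of basic two-party teleportation, the building block that Bob chains together to link Alice and Charlie. The state being sent and the qubit layout are illustrative assumptions; this simulates the math, not real hardware:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
X = np.array([[0, 1], [1, 0]])                # bit flip
Z = np.array([[1, 0], [0, -1]])               # phase flip
I = np.eye(2)

# Qubit order: 0 = message, 1 = sender's half of pair, 2 = receiver's half.
psi = np.array([0.6, 0.8])                    # arbitrary state to teleport
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)    # shared entangled pair
state = np.kron(psi, bell)                    # 8-amplitude joint state

# CNOT with control qubit 0 and target qubit 1, built entry by entry.
CNOT01 = np.zeros((8, 8))
for i in range(8):
    b = [(i >> 2) & 1, (i >> 1) & 1, i & 1]
    if b[0]:
        b[1] ^= 1
    CNOT01[(b[0] << 2) | (b[1] << 1) | b[2], i] = 1

state = CNOT01 @ state
state = np.kron(np.kron(H, I), I) @ state     # Hadamard on the message qubit

# Sender measures qubits 0 and 1: sample one outcome, then collapse.
probs = np.abs(state) ** 2
probs /= probs.sum()
outcome = np.random.choice(8, p=probs)
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1

keep = np.array([((i >> 2) & 1) == m0 and ((i >> 1) & 1) == m1 for i in range(8)])
state = np.where(keep, state, 0)
state /= np.linalg.norm(state)

# Receiver's qubit, read off the two surviving amplitudes.
recv = np.array([state[(m0 << 2) | (m1 << 1)], state[(m0 << 2) | (m1 << 1) | 1]])

# The two measured bits travel classically (at light speed, as the article
# notes); the receiver applies the matching corrections to finish.
if m1:
    recv = X @ recv
if m0:
    recv = Z @ recv
print("teleported:", recv, " original:", psi)  # identical up to global phase
```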

This has some important implications for the future of communication. First, using quantum teleportation networks avoids the threat of packet loss over fiber optic cables. Second, it effectively encrypts the information at Alice's end. In order to decode the information, you need to know the result of the calculation Charlie performed. The third thing builds upon the first; despite the immediate transfer of quantum information, we are still bound by the speed of light. As you know, the cosmic speed limit isn't just a suggestion, it's the law. Sending the calculation information to Alice in order to decode the information relies on more traditional communications bound by light speed. No getting around it.

While this is an important step toward a quantum internet, in order to build the sorts of networks we'll need for everyday use, we're going to need a lot more nodes. But, hey, even today's global communications network started with a single telephone.

More:
Quantum information was teleported over a network for the first time - Syfy


Data Science Masters

Join one of the leading data science programs in the nation and accelerate your high-tech career in data science.

The MSDS degree is a professional master's program designed for students who want to begin or advance their careers in data science. The program is available full-time or part-time. Classes begin every fall quarter and meet in the evenings on the University of Washington campus.

The industry-relevant curriculum gives you the skills to extract valuable insights from big data. In this program, you will develop expertise in statistical modeling, data management, machine learning, data visualization, software engineering, research design, data ethics, and user experience to meet the growing needs of industry, not-for-profits, government agencies, and other organizations.

The curriculum consists of eight core courses and a two-quarter capstone project. The capstone project gives students the opportunity to work on a data science challenge facing an external organization.

The MSDS program can be completed full-time or part-time. Full-time students take two courses per quarter and attend classes two evenings per week. The full-time program is 1.5 years in length. Part-time students take one course per quarter and attend class one evening per week. The typical part-time student completes the program in 2.5 years. Approximately 80 percent of MSDS students are full-time and 20 percent are enrolled part-time.

Discover if the full-time program or the part-time program is the best fit for you here.

MSDS alumni work at top companies, including Amazon, Boeing, Facebook, Google, Microsoft, T-Mobile, and Zillow. Our graduates also pursue careers at leading not-for-profit organizations, such as Seattle Children's Hospital and the Institute for Health Metrics and Evaluation.

Source: 2019 MSDS Alumni Survey

The MSDS program offers dedicated career services to students, including an annual Data Science Career Fair held every October. Learn more about our career outcomes and services on our Careers page.

In the MSDS program, we have a student body made up of more than numbers. Our students have strong undergraduate grades and technical skills, but we also look for more than that when making a cohort. We admit students who have a diverse set of backgrounds and perspectives. Because of this, our program is able to offer a unique, vibrant experience.

The incoming cohort reflects this diversity. There are over 20 majors represented. Our incoming students have professional experience in a wide range of industries, including aerospace, energy, finance, healthcare, technology, telecommunications, and more. Of the students who will begin our program this fall, women make up more than half the total. They also come from around the world, with 59% coming from eight different countries. The countries represented include Argentina, Chile, China, Ethiopia, India, Pakistan, Taiwan, and South Korea. Read more about our incoming cohort on our Class Profile page.

The University of Washington is one of the world's preeminent universities. The UW is ranked No. 7 in the world in U.S. News & World Report's Best Global Universities rankings.

The UW also has deep ties to the tech industry in the Seattle area and beyond. A UW degree provides alumni with a competitive edge on the job market.

Beyond its excellent academics, the University of Washington features one of the most beautiful campuses in the nation. Located just four miles from downtown Seattle, the campus offers stunning views of Mount Rainier and Lake Washington.

"The M.S. in Data Science program gave me the knowledge and confidence to take the leap and change jobs."

Charles Duze, '19, Data Science Manager, Shopify

Read his story.

View original post here:

Data Science Masters


Introducing Advata, a Software Company Improving Patient Outcomes Through Advanced Analytics – PR Newswire

"At our core, Advata is aware that every data point has a human life behind it," says Julie Rezek, Advata CEO. "Our solutions deliver insights for health providers and payers to help make informed care decisions to improve patient outcomes. We are setting a new standard for advanced data analytics utilizing AI insights to enable smarter healthcare operations, reduce costs, and recover revenue."

Advata has a comprehensive product offering to provide clinical decision support and back-office management to optimize care and operations. Advata's insights create intelligent workflows in the most expensive and complex areas of health care, like the emergency department and operating room. By leveraging data to produce more intelligent healthcare delivery, Advata improves revenue cycle logic and optimizes cash flow.

Continuing the foundation established by KenSci, named "U.S. healthcare partner of the year" by Microsoft in 2020, Advata's software solutions include pre-built cloud-native products on Microsoft Azure. Advata's analytics platform derives insights from data to improve population health, patient experiences, workflows, and diagnosis accuracy. Built to empower customizations, the platform allows users to develop their own applications.

Additionally, Colburn Hill Group, Alphalytics, and Lumedic's products have all combined to offer a broad portfolio of revenue cycle management (RCM) solutions for healthcare providers, contributing to increased revenue, payments acceleration, and collections automation. Advata's RCM solutions enable better stakeholder cooperation across the revenue cycle and leverage patient-driven interoperability to promote greater transparency, access, and affordability for patients and communities. The proprietary Ops Center RCM platform introduced by Colburn Hill Group, for example, has earned praise from customers and industry analysts for its superior ease of use, reliability, and cost-effectiveness.

"Providence has been on a journey to transform health care through innovation, and Advata is a culmination of this important work. It represents our belief that when you pair data science with responsible artificial intelligence, machine learning, automation, and other technological advancements, you can better support clinicians at the bedside and in clinics, improve patient outcomes, and decrease overall healthcare costs. We look forward to sharing these solutions with clinics and health systems across the country to create real-world impact," says Rod Hochman, M.D., president and CEO of Providence.

Advata will be exhibiting at AHIP 2022, June 21-23 in Las Vegas, NV (Kiosk #: 1105-B), and at the HFMA Annual Conference, June 26-29 in Denver, CO (Booth #: 702).

About Providence

Providence is a national, not-for-profit Catholic health system comprising a diverse family of organizations and driven by a belief that health is a human right. With 52 hospitals, over 900 physician clinics, senior services, supportive housing, and many other health and educational services, the health system and its partners employ nearly 120,000 caregivers serving communities across seven states: Alaska, California, Montana, New Mexico, Oregon, Texas, and Washington, with system offices in Renton, Washington, and Irvine, California. Learn about our vision of health for a better world at Providence.org.

About Advata Inc.

Advata is on a mission to provide advanced analytics that transform healthcare management and operations. With a bedrock of data science research as its foundation, the company develops solutions rooted in a unifying platform driven by responsible artificial intelligence (AI) to improve clinical care, hospital operations, and population health. With a strong healthcare heritage, Advata leverages the collective institutional intelligence and technological contributions from its six legacy companies: KenSci, Colburn Hill Group, Alphalytics, Lumedic, Quiviq, and MultiScale. To learn more, visit advata.com.

Contact: Heather Fretz, [emailprotected]

SOURCE Advata

View post:

Introducing Advata, a Software Company Improving Patient Outcomes Through Advanced Analytics - PR Newswire


Elon Analytics Day explores the role of analytics in society – Today at Elon

The one-day event featured speakers from SAS Institute, Duke University, N.C. A&T State University, BNH.AI and The Redwoods Group.

What impact does analytics have in our society? How can traditional analytics and AI approaches be redesigned to enable responsible and transparent application to societal issues? How can analytics support social good?

These are just a few of the questions discussed during the 2022 Elon Analytics Day, hosted by the Center for Organizational Analytics.

Cynthia Rudin, director of the Interpretable Machine Learning Lab at Duke University and recipient of the 2022 Squirrel AI Award for Artificial Intelligence for the Benefit of Humanity and 2022 Guggenheim Foundation Fellowship, kicked off the event by sharing how the lack of transparency in machine learning models used in decision-making processes can have serious societal consequences.

Providing specific real examples, Rudin showed how using black box models, which are very complex and opaque, in combination with typographical errors in human-entered input, can result in flawed parole decisions and show bias. Black box models are also proprietary, which raises the issue of trust, particularly when accountability and transparency are needed in decision-making processes impacting human lives.

When it comes to high-stakes decisions in domains such as criminal justice and healthcare, Rudin recommends using models that allow humans to interpret the results and immediately detect obvious errors. Her research has established that in most instances of these societal contexts, transparent models are not less accurate than traditional black box models.

As examples of effective and transparent scoring models, Rudin shared optimized scoring system models that her lab has designed for medical applications, enabling doctors to better interpret patient data to diagnose certain conditions. These scoring systems are simple and accessible, and easily interpreted and applied by medical experts. Their optimal accuracy is ensured by a complex machine learning algorithm that does not need to be understood by medical experts. Rudin's pioneering work has shown that it is indeed possible to combine transparency and interpretability, which are important for societal applications, with high predictive accuracy and technical fidelity.

Patrick Hall, principal scientist at BNH.AI, echoed the need for transparency and accountability in his session on fairness, governance and the future of AI. Hall, who is a co-author of a National Institute of Standards and Technology paper on this topic, noted there are serious risks with AI, such as algorithm discrimination and data bias, lack of transparency and accountability, and data privacy problems. These risks can lead to serious harm, which is often concentrated on marginal groups and people who lack fundamental access to the internet and therefore do not even appear in data.

Hall's advice on how to address these problems lies with the need to govern the people who create AI and computer software. "You cannot govern the software because it is an inanimate object," he said. "You need to govern the creators of the software." He suggests there should be incentives for AI fairness and serious legal consequences when there is harm done due to misuse and avoidable biases in AI models.

Presenters also shared examples of how analytics can help nonprofit organizations optimize their limited resources while achieving their goals. Natalia Summerville, who leads a team of data science practitioners within the Advanced Analytics Center of Excellence at SAS Institute, shared four prescriptive analytics applications that have helped such organizations.

"Optimization and prescriptive analytics provide a smart way of coming up with the best solution without necessarily having to enumerate all possible solutions," Summerville said.

In looking at how analytics can be applied in hunger relief supply chains, Lauren Davis, professor in the Department of Industrial & Systems Engineering at North Carolina Agricultural and Technical State University, presented predictive and descriptive models to quantify the availability of supply over time, characterize demand, and optimize the distribution of uncertain supply to ensure equity and improve food access. She used data from a local nonprofit hunger relief organization and took into consideration the distinguishing features of hunger relief supply chains: uncertain supply, varying shelf life, volunteer resources, storage capacity, the need to balance equity, efficiency and effectiveness, and uncertain demand.

Davis noted the management of donated food supply in non-profit food distribution is challenging, and the pandemic added more challenges, such as increased demand and fewer donations due to food supply chain shortages.

In the executive wrap-up, Kevin Trapani, founder and CEO of The Redwoods Group and Love School of Business executive-in-residence, offered the audience closing advice.

"Elon Analytics Day provided a wonderful opportunity for students, faculty and the larger community to learn from prominent leaders in the field of analytics about what is involved in applying analytics or AI to decisions that have a human impact," said Manoj Chari, director of the Center for Organizational Analytics and assistant professor of management information systems. "Like any other technological or scientific innovation, analytics and AI can make great contributions to the solution of societal problems, but its ill-considered and careless use, even with the best of intentions, can cause harm to individuals' lives and can perpetuate historical biases and inequities. The presentations during the Analytics Day conference shed light on both of these important aspects of analytics applications to society, while highlighting related analytical, legal and ethical issues that are subjects of current research and debate."

Continue reading here:

Elon Analytics Day explores the role of analytics in society - Today at Elon


ARM plans upgrades as it marks 30 years of collecting atmospheric data – EurekAlert

As the Department of Energy's Atmospheric Radiation Measurement user facility marks 30 years of collecting continuous measurements of the Earth's atmosphere this year, the ARM Data Center at Oak Ridge National Laboratory is shepherding changes to its operations to make the treasure trove of data more easily accessible and useful to scientists studying Earth's climate around the world.

The observations, comprising more than 3.3 petabytes of data thus far, start as raw data from more than 460 instruments worldwide. Observational measurements include daily records of temperature, wind speed, humidity, cloud cover, atmospheric particles called aerosols and dozens of other atmospheric processes that are critically important to weather and climate.

The team at the ARM Data Center refines the data so they are more useful to researchers and ensures their quality. In some cases, experts use these processed data to create higher-end data products that sharpen high-resolution models.

In the past 30 years, the multi-laboratory ARM facility has amassed more than 11,000 data products. That's the capacity of about 50,000 smartphones, at 64 gigabytes per phone. With that much data on hand, ARM is taking steps over the next decade to upgrade its field measurements, data analytics, data-model interoperability and data services. Upgrades and aspirations are outlined in a 31-page Decadal Vision document, released last year.

ARM Data Services Manager Giri Prakash said that when he started at ORNL in 2002, ARM had about 16 terabytes of stored observational data.

"I looked at that as big data," he said.

By 2010, the total was 200 terabytes. In 2016, ARM reached one petabyte of data.

Collecting those first 16 terabytes took nearly 10 years. Today, ARM, a DOE Office of Science user facility supported by nine national laboratories, collects that much data about every six days. Its data trove is growing at a rate of one petabyte a year.
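
A back-of-the-envelope check shows the two quoted rates agree (whether ARM counts a petabyte as 1,000 or 1,024 terabytes is an assumption here; the conclusion holds either way):

```python
# Figures quoted in the article: roughly 16 TB collected every six days.
chunk_tb, chunk_days = 16, 6
tb_per_year = chunk_tb * 365 / chunk_days
print(f"{tb_per_year:.0f} TB/year")          # ~973 TB/year
print(f"~{tb_per_year / 1000:.2f} PB/year")  # ~0.97, about a petabyte a year
```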

Prakash credits this meteoric rise to more complex data, more sophisticated instruments, more high-resolution measurements (mostly from radars), more field campaigns and more high-resolution models.

Rethinking data management

How should all these data be handled?

"We had to completely rethink our approach to data management and re-design much of it from the ground up," said Prakash. "We need end-to-end data services competence to streamline and automate more of the data process. We refreshed almost 70 data-processing tools and workflows in the last four years."

That effort has brought recognition. Since 2020, the ARM Data Center has been recognized as a CoreTrustSeal repository, was named a DOE Office of Science PuRe (Public Reusable Research) Data Resource and earned membership in the World Data System.

All these important professional recognitions require a rigorous review process.

"ARM is special," said Prakash, who represents the United States on the International Science Council's Committee on Data. "We have an operationally robust and mature data service, which allows us to process quality data and distribute them to users."

ARM measurements, free to researchers worldwide, flow continuously from field instruments at six fixed and mobile observatories. The instruments operate in climate-critical regions across the world.

Jim Mather, ARM technical director at Pacific Northwest National Laboratory, said that as part of the Decadal Vision, increasingly complex ARM data will get a boost from emerging data management practices, hardware and software, which are increasingly sophisticated.

"Data services, as the name suggests," said Mather, "is in direct service to enable data analysis."

That service includes different kinds of ARM assets, he said, including physical infrastructure, software tools, and new policies and frameworks for software development.

Meanwhile, adds Prakash, ARM employs FAIR guidelines for its data management and stewardship. FAIR stands for Findability, Accessibility, Interoperability and Reuse. Following FAIR principles helps ensure that data are findable and useful for repeatable research as scientists increasingly rely on data digitization and artificial intelligence.

One step in ARM's decadal makeover will be to improve its operational and research computing infrastructure. Greater computing, memory and storage assets will make it easier to couple high-volume data sets, from scanning radars, for instance, with high-resolution models. More computing power and new software tools will also support machine learning and other techniques required by big-data science.

The ARM Data Center already supports the user facilitys computational and data-access needs. But the data center is being expanded to strengthen its present mix of high-performance and cloud computing resources by providing seamless access to data and computing.

Mather laid out the challenge: ARM has more than 2,500 active datastreams rolling in from its hundreds of instruments. Processing bottlenecks are possible when you add the pressure of those datastreams to the challenge of managing petabytes of information. In all, volumes like that could make it harder to achieve science advances with ARM data.

To get around that, in the realm of computing hardware, said Mather, ARM "will provide more powerful computation services for data processed and stored at the ARM Data Center."

The need continues to grow

Some of that ramped-up computing power came online in the last few years to support a new ARM modeling framework, where large-eddy simulations, or LES, require a lot of computational horsepower.

So far, the LES ARM Symbiotic Simulation and Observation, or LASSO, activity has created a large library of simulations informed by ARM data. To atmospheric researchers, these exhaustively screened and streamlined data bundles are proxies of the atmosphere: they make it easier, for example, to test the accuracy of climate models.

Conceived in 2015, LASSO first focused on shallow cumulus clouds. Now, data bundles are being developed for a deep-convection scenario. Some of those data were made available through a beta release in May 2022.

Still, "the need continues to grow for more computing power," said Mather. "Looking ahead, we need to continually assess the magnitude and nature of the computing need."

ARM has a new Cumulus high-performance computing cluster at the Oak Ridge Leadership Computing Facility, which provides more than 16,000 processing cores to ARM users. The average laptop has four to six cores.

As needed, ARM users can apply for more computing power at other DOE facilities, such as the National Energy Research Scientific Computing Center. Access to external cloud computing resources is also available through DOE.

Prakash envisions a menu of user-friendly tools, including Jupyter Notebook, available to ARM users to work with ARM data. The tools are designed for users to transition from a laptop or workstation while they access petabytes of ARM data at a time.
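
ARM observations are distributed as NetCDF files, so a Jupyter session over ARM data tends to look something like the sketch below. The file and variable names follow ARM's datastream naming style but are assumptions for illustration, not pointers to a specific real dataset:

```python
import xarray as xr

# Hypothetical ARM-style NetCDF file: site 'sgp', instrument 'met',
# facility 'E13', data level 'b1', plus a date stamp.
ds = xr.open_dataset("sgpmetE13.b1.20220601.000000.nc")

print(ds.data_vars)        # browse the measurements in the file
temp = ds["temp_mean"]     # assumed variable name for mean temperature
print(float(temp.mean()))  # a quick aggregate over the day
```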

Prakash said, "Our aim is to provide ARM data wherever the computer power is available."

Developing a data workbench

"Software tools are also critical," says Mather. "We expect single cases of upcoming (LASSO) simulations of deep convection to be on the order of 100 terabytes each. Mining those data will require sophisticated tools to visualize, filter and manipulate data."

Imagine, for instance, he said, LASSO trying to visualize convective cloud fields in three dimensions. It's a daunting software challenge.

Challenges like that require more engagement than ever with the atmospheric research community to identify the right software tools.

More engagement helped shape the Decadal Vision document. To gather information for it, Mather drew from workshops and direct contact with users and staff to cull ideas on increasing ARMs science impact.

Given the growth in data volume, there was a clear need to give a broader audience of data users even more seamless access to the ARM Data Centers resources. They already have access to ARM data, analytics, computing resources and databases. ARM data users can also select data by date range or conditional statements.

For deeper access, ARM is developing an ARM Data Workbench.

Prakash envisions the workbench as an extension of the current Data Discovery interface, one that will provide transformative knowledge discovery by offering an integrated data-computing ecosystem. It would allow users to discover data of interest using advanced data queries. Users could perform advanced data analytics by using ARM's vast trove of data as well as software tools and computing resources.

The workbench will allow users to tap into open-source visualization and analytic tools; open-source code, free to anyone, can also be redistributed or modified. Users could also turn to technologies such as Apache Cassandra or Apache Spark for large-scale data analytics.
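
To give a flavor of what Spark-based analytics on such data could look like, here is a minimal PySpark sketch. The input path and column names are assumptions made for illustration, not a real ARM dataset.

    # A PySpark sketch of large-scale aggregation; the input path and
    # column names are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("arm-analytics-sketch").getOrCreate()

    # Read a large, partitioned dataset (e.g., Parquet) from shared storage.
    df = spark.read.parquet("/data/arm/observations.parquet")

    # Compute a per-site, per-day mean of one measurement in parallel.
    daily = (
        df.groupBy("site", F.to_date("time").alias("day"))
          .agg(F.avg("temperature").alias("mean_temperature"))
    )

    daily.orderBy("site", "day").show(10)
    spark.stop()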

By early 2023, said Prakash, a preliminary version of the workbench will be online. Getting there will require more hours of consultations with ARM data users to nail down their workbench needs.

From that point on, he added, the workbench will be continuously developed through the end of fiscal year 2023.

Prakash calls the workbench, with its enhanced access and open-source tools, a revolutionary way to interact with ARM data.

ARM recently restructured its open-source code capabilities and has added data service organizations on the software-sharing site GitHub.

"Within ARM, we have a limited capacity to develop the processing and analysis codes that are needed," said Mather. "But these open-source software practices offer a way for us to pool our development resources to implement the best ideas and minimize any duplication of effort."

In the end, he added, "this is all about enhancing the impact of ARM data."

UT-Battelle manages ORNL for the Department of Energy's Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

Editor's note: Adapted from an article by Corydon Ireland of Pacific Northwest National Laboratory, where ARM is headquartered.

View original post here:

ARM plans upgrades as it marks 30 years of collecting atmospheric data - EurekAlert


Aunalytics to Showcase its Daybreak for Financial Services Solution at Illinois, Southeast, Ohio, and Michigan Banking Events in June – GlobeNewswire

SOUTH BEND, Ind., June 08, 2022 (GLOBE NEWSWIRE) -- Aunalytics, a leading data platform company delivering Insights-as-a-Service for mid-market businesses, today announced its participation at the Illinois Bankers Association Annual Conference, June 8-9, Southeast Credit Union Conference & Expo, June 15-17, Ohio Bankers League 2022 Convention, June 16-19, and the Michigan Bankers Association Annual Convention, June 22-24. The company will showcase its Daybreak™ for Financial Services solution, designed to help mid-market banks leverage artificial intelligence (AI)-powered data analytics to compete more effectively against their large, national counterparts. Aunalytics is also sponsoring the New York Credit Union Association's EXCEL22 Annual Meeting & Convention, June 16-19.

"Personalized marketing in a digital world matters more than ever before, especially for mid-market banks that have traditionally relied on hometown, white-glove service to win customers," said Katie Horvath, Chief Marketing Officer of Aunalytics. "With Aunalytics Daybreak for Financial Services, midsize financial institutions can target-market more efficiently, reach high-value customers with the right product offering, and win business away from competitors to expand value. We look forward to meeting with bankers and credit unions from Ohio, Michigan, and the southeast, and to demonstrating how Daybreak for Financial Services can help them strengthen their position in regional markets and compete more effectively."

Daybreak for Financial Services enables midsize financial institutions to gain customer intelligence, grow customer lifetime value, predict churn, and determine which products to introduce to customers and when, based on deep learning models informed by data. Built from the ground up, Daybreak for Financial Services is a cloud-native data platform that lets users focus on critical business outcomes. The solution seamlessly integrates and cleanses data for accuracy, ensures data governance, and employs artificial intelligence (AI)- and machine learning (ML)-driven analytics to glean customer intelligence and timely, actionable insights that drive strategic value.
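
Churn prediction of this kind is conventionally framed as supervised learning over customer features. The scikit-learn sketch below is a generic illustration of that framing on synthetic data; it is not Aunalytics' Daybreak model, and the features are invented.

    # A generic churn-prediction sketch on synthetic data;
    # not Aunalytics' actual models.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Stand-ins for per-customer features such as tenure, products
    # held, balances, and recent activity.
    X = rng.normal(size=(1000, 4))
    # Stand-in labels: 1 = churned, 0 = retained.
    y = (rng.random(1000) < 0.2).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0
    )

    model = LogisticRegression().fit(X_train, y_train)

    # Rank customers by predicted churn risk for targeted outreach.
    risk = model.predict_proba(X_test)[:, 1]
    print("Highest-risk customers:", np.argsort(risk)[::-1][:5])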


About Aunalytics

Aunalytics is a data platform company delivering answers for your business. Named a Digital Innovator by analyst firm Intellyx, and selected for the prestigious Inc. 5000 list, Aunalytics provides Insights-as-a-Service to answer enterprise and mid-sized companies' most important IT and business questions. The Aunalytics cloud-native data platform is built for universal data access, advanced analytics and AI while unifying disparate data silos into a single golden record of accurate, actionable business information. Its Daybreak™ industry intelligent data mart combined with the power of the Aunalytics data platform provides industry-specific data models with built-in queries and AI to ensure access to timely, accurate data and answers to critical business and IT questions. Through its side-by-side digital transformation model, Aunalytics provides on-demand scalable access to technology, data science, and AI experts to seamlessly transform customers' businesses. To learn more contact us at +1 855-799-DATA or visit Aunalytics at http://www.aunalytics.com or on Twitter and LinkedIn.

PR Contact: Denise Nelson, The Ventana Group for Aunalytics, (925) 858-5198, dnelson@theventanagroup.com

View original post here:

Aunalytics to Showcase its Daybreak for Financial Services Solution at Illinois, Southeast, Ohio, and Michigan Banking Events in June - GlobeNewswire


NetWitness Selected by Ubiquo as Exclusive XDR Partner to Provide Integrated and Rapid Threat Detection and Response Against Advanced Attacks – Yahoo…

NetWitness XDR Technology Uses a Combination of Network and Endpoint Detection, Behavioral Analysis, Data Science, and Threat Intelligence to Detect and Resolve Known and Unknown Attacks

BEDFORD, Mass., June 08, 2022--(BUSINESS WIRE)--NetWitness, an RSA business and globally trusted provider of cybersecurity technologies and incident response services, today announced a new partnership for Extended Detection and Response (XDR) with Ubiquo, a newly launched company in Chile, owned by Telecom Argentina, that is a specialized cybersecurity provider for Latin America. Ubiquo provides enterprises with a full suite of cybersecurity services and works with businesses to help protect and monitor their critical systems and respond to emerging cybersecurity threats.

"The threat of cyber criminals is ever present, and businesses are constantly struggling against a never-ending wave of attacks that can severely disrupt key business operations. Organizations throughout the region require a combination of best-of-breed technologies and industry expertise to keep these threats at bay," said Mauricio Chiabrando, Cybersecurity Solutions Director at Ubiquo. "Were confident that the strength of a global cybersecurity leader like NetWitness and our expert team will give our customers an advantage in the battle against cyberattacks, ensuring unsurpassed visibility, smarter threat detection, and faster analytics."

In addition to XDR solutions, NetWitness is the foundation of Ubiquo's new state-of-the-art Managed Detection and Response (MDR) center, which will provide outsourced threat detection services designed to deliver visibility into critical systems, advanced insights into attacks, and the ability to take action to mitigate the impact and disruption of threat actors. The MDR center is staffed by analysts fully trained on NetWitness technologies.

"Keeping enterprises safe from cyberattacks requires innovative and forward-thinking approaches that enable those businesses to stay on the cutting-edge of technology, as well as ahead of a rapidly transforming threat landscape, where new methods of attack are emerging daily," said Marcos Nehme, Vice President of Latin America and the Caribbean at NetWitness. "Ubiquo in Chile is taking just that approach, and were proud to work with their team on XDR and MDR offerings that will significantly strengthen the security capabilities of their customers; this includes Incident Response services powered by NetWitness experienced threat hunters for rapid discovery and response. We look forward to continuing our work with Ubiquo to help keep Chile and all Latin America-based organizations protected from cyber threats."


The NetWitness Platform is an evolved SIEM and open XDR platform that enables security teams to detect threats, understand the full scope of a compromise, and automatically respond to security incidents across modern IT infrastructures. The NetWitness Platform collects and analyzes data across all capture points, including logs, network packets, NetFlow, endpoint, and IoT, on physical, virtual, and cloud computing platforms. It applies threat intelligence and user behavior analytics to detect, prioritize, investigate threats, and automate response, improving the effectiveness and efficiency of security operations.
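
User behavior analytics of this general kind baselines each user's normal activity and flags deviations from it. The short Python sketch below illustrates the idea with a simple z-score test on invented login counts; it is a conceptual illustration, not NetWitness code.

    # A conceptual user-behavior-analytics sketch: baseline each
    # user's activity, then flag statistical outliers. Not NetWitness code.
    from statistics import mean, stdev

    # Invented daily login counts per user over two weeks.
    history = {
        "alice": [4, 5, 3, 4, 6, 5, 4, 5, 4, 6, 5, 4, 5, 4],
        "bob":   [2, 1, 2, 3, 2, 2, 1, 2, 2, 3, 2, 2, 40, 2],
    }

    def flag_anomalies(counts, threshold=3.0):
        """Return indices of days that deviate sharply from the baseline."""
        mu, sigma = mean(counts), stdev(counts)
        return [i for i, c in enumerate(counts)
                if sigma > 0 and abs(c - mu) / sigma > threshold]

    for user, counts in history.items():
        days = flag_anomalies(counts)
        if days:
            print(f"{user}: anomalous activity on day(s) {days}")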

Using a centralized combination of network and endpoint analysis, behavioral analysis, data science techniques, and threat intelligence, NetWitness Platform for XDR helps analysts detect and resolve known and unknown attacks while automating and orchestrating the incident response lifecycle. With these capabilities on one platform, security teams can integrate disparate tools and data into a powerful and intuitive user interface for rapid and effective response.

To learn more, visit http://www.netwitness.com.

ABOUT NetWitness

NetWitness, an RSA Business, provides comprehensive and highly scalable threat detection and response capabilities for organizations around the world. The NetWitness Platform delivers complete visibility combined with applied threat intelligence and user behavior analytics to detect, prioritize, investigate threats, and automate response. This empowers security analysts to be more efficient and stay ahead of business-impacting threats. For more information, visit netwitness.com.

© 2022 RSA Security LLC or its affiliates. All rights reserved. RSA and the RSA logo are trademarks of RSA Security LLC or its affiliates. For a list of RSA trademarks visit https://www.rsa.com/en-us/company/rsa-trademarks. Other trademarks are trademarks of their respective owners. RSA believes the information in this document is accurate. The information is subject to change without notice.

View source version on businesswire.com: https://www.businesswire.com/news/home/20220608005327/en/

Contacts

SHIFT Communications
netwitness@shiftcomm.com

Follow this link:

NetWitness Selected by Ubiquo as Exclusive XDR Partner to Provide Integrated and Rapid Threat Detection and Response Against Advanced Attacks - Yahoo...
