
Special Report: Structural engineers' warnings over city's mandatory retrofits have gone unheeded for years – Mission Local

Yesterday, Mission Local published a special report on a pervasive problem among the city's thousands of mandatory soft-story seismic retrofits.

Building department officials have admitted that the encasing of aging gas lines in new, mandated concrete foundations could lead to catastrophic failures and fires in the event of a major earthquake.

This matter was discussed in a public meeting in 2017, during which Department of Building Inspection brass stated that this situation could become "a San Bruno," referencing the lethal 2010 PG&E gas explosion in that neighboring city.

Seismic safety and fire safety are not an either-or choice. But sources both within the Department of Building Inspection and outside of it worry that the steps San Francisco took in its mandatory soft-story retrofit program have traded the latter for the former.

Separate and apart from the gas line issue, the region's structural engineers have long held misgivings about the integrity of the retrofitting jobs thousands of San Francisco property owners have been required to undertake since the passage of the 2013 soft-story ordinance.

Members of the region's structural engineers association said they spent years attempting to land a meeting with Department of Building Inspection officials regarding concerns about poor engineering practices and sloppy construction on mandatory retrofits built on relatively tight deadlines.

These entreaties were, for years, ignored. And when the two sides did finally gather in 2018 and 2019, the engineers say their concerns were by and large brushed off.

All the while, thousands of mandatory soft-story retrofits were being undertaken and completed. As of April 2021, nearly 4,000 of the 4,934 mandatory construction projects had been finished.

The city's Department of Building Inspection, says veteran structural engineer David Bonowitz, "does not seem to have any interest in revealing how things went for this program."

Bonowitz, along with fellow members of the Structural Engineers Association of Northern California (SEAONC), began attending building department subcommittee meetings in 2016 and voicing concerns.

"I was at those meetings back in the day, and the answer was always, 'we got it. It's good,'" recalled structural engineer and SEAONC member Randy Collins.

"We had examples of pretty bad projects, pretty bad engineering and retrofit designs, and these got permitted, got built and got signed off on. We were merely trying to press the building department: what were their quality assurance/quality control procedures? We kept asking, specifically, 'what's the plan?' They never really provided one."

Every engineer seemed to have his or her own disturbing experiences with the general quality of the thousands of mandatory retrofits.

Thor Matteson's suspicions about the program were always there, in the abstract. But then the structural engineer's own involvement on a city retrofit pushed his doubts into the concrete.

Yes, literally.

On a project that Matteson had designed, a city building inspector departed shortly before Matteson's assistant engineer arrived. The inspector, Matteson says, had signed off on the reinforcing of the project's concrete foundation. But here's the rub: The rebar that was supposed to be reinforcing that foundation was still piled in the driveway where it had been unloaded off the truck.

If it's not 100 percent clear: You're supposed to sign off on a rebar inspection after the rebar has been installed. You're also supposed to write a movie review after you see a movie, or grade a test after a student fills it out.

Inspecting rebar, it turns out, may not have been the Department of Building Inspection's No. 1 priority. One building inspector recalls a supervisor walking up to his desk and asking him to attend a holiday work gathering to mingle with the builders and developers whose projects he'd be inspecting. When the inspector demurred, saying he had soft-story rebar inspections to do, the supervisor leaned into his cubicle and said, in a stage whisper: "rebar, shmeebar."

After the first wave of mandatory retrofits was done, the engineers began trading war stories. Things we saw. "And, usually in situations like that, people don't tell stories about how great everything is," said Bonowitz. "You tell about all the crap you saw. And as these stories keep coming in, you wonder: is the Department of Building Inspection doing anything about it?"

Typical war stories, beyond the rebar-shmeebar variety, involved on-the-cheap fellow engineers making plans for buildings by looking at them on Google Street View instead of visiting them. Or submitting plans one-third to one-quarter as complete as normal, forcing contractors, who may or may not be qualified even with the best of plans, into improvisational construction.

Disturbingly, in the engineers' documented examples, these projects passed DBI inspections and received Certificates of Final Completion.

Some 2.5 years of engineers' wheedling for a meeting finally came to fruition in December 2018 and April 2019. "By the time we even got the meeting," says Matteson of the 2019 gathering, "it was almost too late."

And, based upon several strongly worded formal letters and a flurry of follow-up emails, these do not appear to have been fruitful sessions.

"SEAONC members continue to report instances of faulty design and construction on soft story projects in San Francisco," wrote then-president Tim Hart in a letter to building department leaders following the April 2019 meeting. "For reasons that we have discussed previously, we still feel strongly that DBI can and must take action to improve quality assurance for the benefit of building owners, DBI's own reputation, and the City's retrofit programs generally."

Inspection data provided by the building department was vague and incomplete, the engineers complained.

"It does not address the specific construction issues that our members have identified as having quality control problems," reads Hart's 2019 letter. "The data also does not identify the issues that DBI inspectors found during their spot check process or how those issues were resolved. Finally, the data does not include any projects that were inspected prior to 2018."

Then-chief building inspector Patrick O'Riordan's request that the engineers provide specific addresses clearly didn't encompass the engineers' graver concerns about systemic problems.

Engineers' offers to ride along with inspectors, or perhaps review the work on their own, were received coldly. "Nowhere in the code is there an allowance for inspecting the same work again, when it's been approved once already," O'Riordan stated in a May 2019 email. "Please keep in mind building inspections are scheduled by the stakeholder, and are not generally set up by the Senior Building Inspector."

Writing via a spokesman, O'Riordan this week characterized the meetings to Mission Local as productive, citing an expanded commitment for senior inspectors to spot-check soft-story projects.

SEAONC members, however, countered that spot-checking was not something they had asked for, nor something they felt was adequate. Claims of increased inspections did not assuage the engineers, as inconsistent or faulty inspections were one of their primary concerns.

"I also want to note that DBI's first priority is public safety," read the April 2021 statement from O'Riordan. "In that meeting, SEAONC representatives said they knew of properties that were not in compliance with soft story requirements. At that time and in subsequent conversations, we requested those property addresses so we can investigate."

Engineers countered that it's not their job to report cases to the building department or, if it comes to that, to police the building department.

"We gave information as examples to show the need for a more systematic review," explains Bonowitz. What's more, confidentiality agreements with clients generally prevent engineers from informing on them.

Regarding O'Riordan's response, Bonowitz writes: "With respect to the issue at hand, how to quantify the actual quality program-wide, this is a bullshit answer."

"I regret we didn't anticipate a program like this would draw more bad actors."

The structural engineers association helped develop the mandatory soft-story program. And, in retrospect, Bonowitz says not enough thought was put into ensuring corner-cutting and dishonesty didn't taint the undertaking.

The foisting of costly, mandatory construction work onto thousands of unenthusiastic property owners "led to conditions that are perfect for owners to hire the cheapest bid they can get, and contractors and even engineers to say 'I can do this,'" he says.

"As developers of the program, I don't think we thought through enough at the beginning the nature of quality control. I regret we didn't anticipate a program like this would draw more bad actors."

As it is, engineers are left to wonder just what would be uncovered if 10, 20, 50, or however many projects were randomly subjected to post-facto quality control. How closely does the actual situation on the ground resemble the building department's documented inspections? How safe are these buildings?

That remains unknown and, for now, unknowable.


Hennessey’s Chief Engineer Updates Us on the Venom F5 – Car and Driver


We've previously told you about Hennessey's ambition to prove that the forthcoming Venom F5 is the fastest production car in the world. That milestone is still a way off, presuming the Hennessey manages to beat the Bugatti Chiron's 304-mph record. But the Texas company has announced that the F5 has already passed the 200-mph mark during testing, and with its 6.6-liter twin-turbocharged V-8 making just half of what will be its eventual 1817-hp output.

"We have been working up to the full output," John Heinricy, Hennessey Performance's chief engineer, explained. "It's much easier to do the basics at a lower output to get the learnings in, and we can then build on that . . . But if you look at 900 horsepower and the [3053-pound] weight of the car, there's no lack of performance."

Development work has been done both at Hennessey's HQ in Texas and on a 2.2-mile runway of a one-time U.S. Air Force base in Arkansas. Last year John Hennessey told us the F5 will make top-speed runs on the 3.2-mile runway of NASA's Kennedy Space Center, or on public roads if that proves insufficiently long to validate the car's aimed-for 311-mph top speed.

Although working toward that ultimate performance goal, Heinricy is equally determined that the finished F5 will perform at lower speeds, with development work set to include both roads and technical race courses, including Circuit of the Americas near Austin.

"Sure, we're focusing on top speed and being the fastest on the racetrack, but we're also looking at the whole balance," he said. "If you just look at top speed, you're not going to make a car that is capable of everyday driving, we're not going after just a single area . . . and if you get the basics together its going to be a lot easier to get the ultimate performance."

When asked to nominate a rival he particularly wants to beat, Heinricy said, "All of them." He elaborated: "Sure, we want to make a car that can be compared to a Bugatti or the Koenigsegg, but not just on top speed. We want to create a decathlete."

Heinricy was previously development chief for the Chevrolet Corvette, and he told us the F5 will offer similar usability. "I've spent my career trying to make cars accessible for regular customers," he said. "It's very much my focus here to make sure the car has that capability."

We look forward to testing both sides of the Venom F5's character.



Cambridge Quantum pushes into NLP and quantum computing with new head of AI – VentureBeat


Cambridge Quantum Computing (CQC) hiring Stephen Clark as head of AI last week could be a sign the company is boosting research into ways quantum computing could be used for natural language processing.

Quantum computing is still in its infancy but promises such significant results that dozens of companies are pursuing new quantum architectures. Researchers at technology giants such as IBM, Google, and Honeywell are making measured progress on demonstrating quantum supremacy for narrowly defined problems. "Quantum computers with 50-100 qubits may be able to perform tasks that surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably," California Institute of Technology theoretical physics professor John Preskill wrote in a recent paper. "We may feel confident that quantum technology will have a substantial impact on society in the decades ahead, but we cannot be nearly so confident about the commercial potential of quantum technology in the near term, say the next 5 to 10 years."

CQC has been selling software focused on specific use cases, such as cybersecurity and pharmaceutical and drug delivery, as the hardware becomes available. "We are very different from the other quantum software companies that we are aware of, which are primarily focused on consulting-based revenues," CQC CEO Ilyas Khan told VentureBeat.

For example, amid concerns that improvements in quantum hardware will make it easier to break existing algorithms used in modern cryptography, CQC devised a method to generate quantum-resistant cryptographic keys that cannot be cracked by today's methods. CQC partners with pharmaceutical and drug discovery companies to develop quantum algorithms for improving material discovery, such as working with Roche on drug development, Total on new materials for carbon capture and storage solutions, and CrownBio for novel cancer treatment biomarker discovery.

The addition of Clark to CQC's team signals the company will be shifting some of its research and development efforts toward quantum natural language processing (QNLP). Humans are good at composing meanings, but this process is not well understood. Recent research established that quantum computers, even with their current limitations, could learn to reason with the uncertainty that is part of real-world scenarios.

"We do not know how we compose meaning, and therefore we have not been sure how this process can be carried over to machines/computers," Khan said.

QNLP could enable grammar-aware representation of language that makes sense of text at a deeper level than is currently available with state-of-the-art NLP algorithms like BERT and GPT-3. The company has already demonstrated some early success in representing and processing text using quantum computers, suggesting that QNLP is within reach.

Clark was previously senior staff research scientist at DeepMind and led a team working on grounded language learning in virtual environments. He has a long history with CQC chief scientist Bob Coecke, with whom he collaborated 15 years ago to devise a novel approach for processing language. That research stalled due to the limitations of classical computers. Quantum computing could help address these bottlenecks, and there are plans to continue that research program, Clark said in a statement.
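
The Clark-Coecke approach referenced here, known in the literature as categorical compositional distributional semantics, represents nouns as vectors and relational words such as transitive verbs as higher-order tensors, with grammar dictating how they combine. As a rough numpy sketch of that compositional idea only, using toy two-dimensional meaning vectors and a random verb tensor invented for this illustration:

```python
import numpy as np

# Toy two-dimensional "meaning space"; these vectors and the verb tensor
# are invented purely for illustration, not taken from any real model.
dog = np.array([0.2, 0.9])
bone = np.array([0.9, 0.1])

# A transitive verb is modeled as a rank-3 tensor: it consumes a subject
# vector and an object vector and yields a sentence vector.
chases = np.random.default_rng(0).random((2, 2, 2))

# Sentence meaning = the contraction dictated by the grammar of "dog chases bone".
sentence = np.einsum('i,ijk,k->j', dog, chases, bone)
print(sentence)
```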

"The methods we developed to demonstrate this could improve a broad range of applications where reasoning in complex systems and quantifying uncertainty are crucial, including medical diagnoses, fault-detection in mission-critical machines, and financial forecasting for investment management," Khan said.


Quantum: It’s still not clear what its good for, but Amazon and QCI will help developers find out – ZDNet

When it comes to practical problems, including things such as the traveling salesman problem, a classic in optimization, the value of quantum is still to be decided, say Richard Moulds, left, head of Amazon's Braket quantum computing service, and Robert Liscouski, head of Quantum Computing Inc., which makes Qatalyst software to do optimization on both classical and quantum machines.

It's easy to imagine a problem for which, if one had a computer that magically leapt across steps of the computation, your life would be much better.

Say, for example, a computer that auto-magically searches through a vast space of possible solutions much faster than you can with a CPU or GPU.

That's the premise of quantum computing, and surprisingly, for all the hype, it's not clear if that premise is true.

"I don't think we've seen any evidence yet that a quantum machine can do anything that's commercially interesting faster or cheaper than a classical machine," Richard Moulds, head of Amazon Braket, the cloud giant's quantum computing service, said in an interview with ZDNet. "The industry is waiting for that to arrive."

It is the question of the "quantum advantage," the notion that the entangled quantum states in a quantum computer will perform better on a given workload than an electronic system.

"We haven't seen it yet," Robert Liscouski, CEO of Quantum Computing Inc, said of the quantum advantage, in the same Zoom interview with Moulds.

That aporia, the as-yet-unproven quantum advantage, is in fact the premise for a partnership announced this month, whereby QCI's Qatalyst software program will run as a cloud service on top of Braket.

QCI's corporate tag line is "ready-to-run quantum software," and the Qatalyst program is meant to dramatically simplify sending a computing task to the qubits of a quantum hardware machine, the quantum processing units, or QPUs, multiple instances of which are offered through Braket, including D-Wave, IonQ, and Rigetti.
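
For a sense of what sending a task through Braket looks like, here is a minimal sketch using the Amazon Braket Python SDK. The Bell-state circuit, the shot count, and the use of the SV1 managed simulator are illustrative; targeting a real QPU means swapping in that device's ARN, and an AWS account with Braket access is required.

```python
from braket.circuits import Circuit
from braket.aws import AwsDevice

# A two-qubit Bell-state circuit: Hadamard on qubit 0, then CNOT.
bell = Circuit().h(0).cnot(0, 1)

# SV1 is Braket's managed state-vector simulator; an IonQ, Rigetti, or
# D-Wave device ARN would go here instead to target a QPU. Note that some
# SDK versions also require an S3 location for results.
device = AwsDevice("arn:aws:braket:::device/quantum-simulator/amazon/sv1")

task = device.run(bell, shots=1000)
print(task.result().measurement_counts)  # roughly {'00': ~500, '11': ~500}
```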

The idea is to get more people working with quantum machines precisely to find out what they might be good for.

"Our platform basically allows the democratization of quantum computing to extend to the user community," said Liscouski.

"If you look back on the quantum industry since it started, it's traditionally been very difficult to get access to quantum hardware," said Moulds, including some machines that are "totally unavailable unless you have a personal relationship with the the physicist that built it."

"We're trying to make it easy for everyone to have access to the same machinery; it shouldn't be those that have and those that have not, it should be everyone on the same flywheel," he said.

The spectrum of users who will be working with quantum comprise "two important communities" today, said Moulds, those that want to twiddle qubits at the hardware level, and those that want to spend time on particular problems in order to see if they actually gain any benefit when exposed to the quantum hardware.

"There's a lot of researchers focused on building better hardware, that is the defining force in this industry," said Moulds. "Those types of researchers need to be in the weeds, playing at the qubit level, tweaking the frequencies of the pulses sent to the chip inside the fridge."

On the other hand, "the other class of users is much more geared to Robert's view of the world: they don't really care how it gets done, they just want to understand how to program their problem so that it can be most easily solved."

That second class of users are "all about abstraction, all about getting away from the technology." As quantum evolves, "maybe it slides under so that customers don't even know it's there," mused Moulds.

When it comes to those practical problems, the value of quantum is still to be decided.

There has been academic work showing quantum can speed up tasks, but "that's not been applied to a problem that anybody cares about," said Moulds.

The entire quantum industry is "still finding its way to what applications are really useful," he said. "You tend to see this list of potential applications, a heralded era of quantum computing, but I don't think we really know."

The Qatalyst software from QCI focuses on the kinds of problems that are of perennial interest, generally in the category of optimization, particularly constrained optimization, where a solution to a given loss function or objective function is made more complicated by having to narrow the solution to a bunch of variables that have a constraint of some sort enforced, such as bounded values.

"They are described at a high level as the traveling salesman problem, where you have multi-variate sort of outcomes," said Liscouski. "But it's supply-chain logistics, it's inventory management, it's scheduling, it's things that businesses do today that quantum can really accelerate the outcomes in the very near future."

Such problems are "a very important use case," said Moulds. Quantum computers are "potentially good at narrowing the field in problem spaces, searching through large potential combinations in a wide variety of optimization problems," he said.

However, "classical will probably give you the better result" at this time, said Liscouski.

One of the reasons quantum advantage is not yet certain is because the deep phenomena at the heart of the discipline, things such as entanglement, make the field much more complex than early digital computing.

"A lot of people draw the analogy between where we are and the emergence of the transistor," said Moulds.

"I think that's not true: this is not just a case of making the computers we have today smaller and faster and cheaper, we're not anywhere near that regime, that Moore's Law notion of just scaling these things up."

"There's fundamental scientific discoveries that have to be made to build machines that can tackle these sorts of problems on the grand scale that we've been talking about."

Beyond the machines' evolution, there is an evolution implicit for programmers. Quantum brings a fundamentally different approach to programming. "These are physics-based machines, they're not just computational engines that add ones and zeros together, it's not just a faster slide rule," said Moulds.

That different way of programming may, in fact, point the way to some near-term payoff for the Qatalyst software, and Braket. Both Liscouski and Moulds expressed enthusiasm for taking lessons learned from quantum and back-loading them into classical computers.

"Typically, access to quantum computing is through toolkits and resources that require some pretty sophisticated capabilities to program to ultimately get to some result that involves a quantum computer," observed Liscouski.

"With Braket, the platform provides both access to QPUs and classical computing at the same time, and the quantum techniques that we use in the platform will get results for both," said Liscouski.

"It isn't necessarily a black and white decision between quantum and classical," said Moulds. "There's an emerging area, particularly in the area of optimization, people use the term quantum-inspired approaches are used."

"What that means is, looking at the ways that quantum computers actually work and applying that as a new class of algorithms that run on classical machines," he said.

"So, there's a sort of a morphing going on," he said.

An advantage to working with QCI, said Moulds, is that "they bring domain expertise that we don't have," things such as the optimization expertise.

"We've coined the phrase, 'Build on Braket'," said Moulds. "We're trying to build a quantum platform, and we look to companies like QCI to bring domain expertise to use that platform and apply it to problems that customers have really got."

Also important is operational stability and reliability, said Moulds. For a first-tier Web service with tons of users, the priority for Amazon is "running a professional service, a platform that is reliable and secure and durable" on which companies can "build businesses and solve problems."

Although there are "experimental" aspects, he said, "this is not intended to be a best-effort showcase."

Although the quantum advantage is not certain, Moulds holds out the possibility someone working with the technology will find it, perhaps even someone working on Braket.

"The only way we can move this industry forward is by pulling the curtains apart and giving folks the chance to actually see what's real," he said.

"And, boy, the day we see a quantum computer doing something that is materially advantageous from a commercial point of view, you will not miss that moment, I guarantee."


Are We Doomed to Repeat History? The Looming Quantum Computer Event Horizon – Electronic Design

What you'll learn:

A couple examples from history highlight our failure to secure the technology that's playing an increasingly larger role in both our personal lives and business. When computers were first connected to the internet, we had no idea of the Pandora's Box that was being opened, and cybersecurity wasn't even considered a thing. We failed to learn our lesson when mobile phones exploded onto the world, and again with IoT, still making fast-to-market more important than security. This has constantly left cybersecurity behind the 8 ball in the ongoing effort to secure data.

As we race to quantum computing, we'll see another, and perhaps the greatest, fundamental shift in the way computing is done. Quantum computers promise to deliver an increase in computing power that could spur enormous breakthroughs in disease research, understanding global climate, and delving into the origins of the universe.

As a result, the goal to further advance quantum-computing research has rightfully attracted a lot of attention and funding, including $625 million from the U.S. government [1]. However, it also will make many of our trusted security techniques inadequate, enabling encryption to be broken in minutes or hours instead of the thousands of years it currently takes.

Two important algorithms that serve as the basis for the security of today's most commonly utilized public-key cryptography will be broken by quantum computers: RSA, whose security rests on the difficulty of integer factorization, and elliptic-curve cryptography, which rests on the discrete logarithm problem; both are vulnerable to Shor's algorithm on a sufficiently large quantum computer.

As we prepare for a post-quantum world, we have another opportunity to get security right. The challenge of replacing the existing public-key cryptography in these applications with quantum-computer-resistant cryptography is going to be formidable.

Today's state-of-the-art quantum computers are so limited that while they can break toy examples, they don't endanger commercially used key sizes (such as specified in NIST SP800-57). However, most experts agree it's only a matter of time until quantum computers evolve to the point of being able to break today's cryptography.

Cryptographers around the world have been studying the issue of post-quantum cryptography (PQC), and NIST has started a standardization process. However, even though we're likely five to 10 years away from quantum computers becoming widely available, we're approaching what can be described as the event horizon.

Data that has been cryptographically protected by quantum-broken algorithms up to Day 0 of the PQC deployment will likely need to remain secure for years, decades in some cases, after quantum computers are in use. This is known as Mosca's Theorem (see figure).

%{[ data-embed-type="image" data-embed-id="6081ce0f2f5c1329008b4613" data-embed-element="span" data-embed-size="640w" data-embed-alt="Illustration of a bad outcome under Mosca’s Theorem, where a quantum adversary can break the security requirements for recorded messages. The adversary could, for example, break the encryption on a recorded message or alter a legal document and generate a fake signature indistinguishable from a valid signature." data-embed-src="https://img.electronicdesign.com/files/base/ebm/electronicdesign/image/2021/04/PQC_Event_Horizon_Figure_1.6081ce0f24f07.png?auto=format&fit=max&w=1440" data-embed-caption="Illustration of a bad outcome under Moscas Theorem, where a quantum adversary can break the security requirements for recorded messages. The adversary could, for example, break the encryption on a recorded message or alter a legal document and generate a fake signature indistinguishable from a valid signature." ]}%

Deploying any secure solution takes time. Given the inherently longer development time of chips compared to software, chip-based security becomes even more pressing. Throw in the added challenge that PQC depends on entirely new algorithms, and our ability to protect against quantum computers will take many years to deploy. All this adds up to make PQC a moving target.

The good news is that, and I take heart in this, we seem to have learned from previous mistakes, and NIST's PQC standardization process is working. The effort has been underway for more than four years and has narrowed entrants from 69 to seven (four in the category of public-key encryption and three in the category of digital signatures) over three rounds.

However, in late January 2021, NIST started reevaluating a couple of the current finalists and is considering adding new entries as well as some of the candidates from the stand-by list. As mentioned previously, addressing PQC isn't an incremental step. We're learning as we go, which makes it difficult to know what you don't know.

The current finalists were heavily skewed toward lattice-based schemes. What the potential new direction by NIST indicates is that, as the community has continued studying the algorithms, lattice-based schemes may not be the holy grail we had first hoped.

Someone outside the industry may look at that as a failure, but I would argue that's an incorrect conclusion. Only by trial and error, facing failure and course-correcting along the way, can we hope to develop effective PQC algorithms before quantum computers open another, potentially worse cybersecurity Pandora's box. If we fail to secure it, we risk more catastrophic security vulnerabilities than we've ever seen: Aggressors could cripple governments, economies, hospitals, and other critical infrastructure in a matter of hours.

While it's old hat to say, "It's time the world took notice of security and gave it a seat at the table," the time to deliver on that sentiment is now.

Reference

1. Reuters, "U.S. to spend $625 million in five quantum information research hubs"


Cleveland Clinic and IBM hope their tech partnership could help prevent the next pandemic – WTHITV.com

After a year in which scientists raced to understand Covid-19 and to develop treatments and vaccines to stop its spread, Cleveland Clinic is partnering with IBM to use next-generation technologies to advance healthcare research and potentially prevent the next public health crisis.

The two organizations on Tuesday announced the creation of the "Discovery Accelerator," which will apply technologies such as quantum computing and artificial intelligence to pressing life sciences research questions. As part of the partnership, Cleveland Clinic will become the first private-sector institution to buy and operate an on-site IBM quantum computer, called the Q System One. Currently, such machines only exist in IBM labs and data centers.

Quantum computing is expected to expedite the rate of discovery and help tackle problems with which existing computers struggle.

The accelerator is part of Cleveland Clinic's new Global Center for Pathogen Research & Human Health, a facility introduced in January on the heels of a $500 million investment by the clinic, the state of Ohio and economic development nonprofit JobsOhio to spur innovation in the Cleveland area.

The new center is dedicated to researching and developing treatments for viruses and other disease-causing organisms. That will include some research on Covid-19, including why it causes ongoing symptoms (also called "long Covid") for some who have been infected.

"Covid-19 is an example" of how the center and its new technologies will be used, said Dr. Lara Jehi, chief research information officer at the Cleveland Clinic.

"But ... what we want is to prevent the next Covid-19," Jehi told CNN Business. "Or if it happens, to be ready for it so that we don't have to, as a country, put everything on hold and put all of our resources into just treating this emergency. We want to be proactive and not reactive."

Quantum computers process information in a fundamentally different way from regular computers, so they will be able to solve problems that today's computers can't. They can, for example, test multiple solutions to a problem at once, making it possible to come up with an answer in a fraction of the time it would take a different machine.

Applied to healthcare research, that capability is expected to be useful for modeling molecules and how they interact, which could accelerate the development of new pharmaceuticals. Quantum computers could also improve genetic sequencing to help with cancer research, and design more efficient, effective clinical trials for new drugs, Jehi said.

Ultimately, Cleveland Clinic and IBM expect that applying quantum and other advanced technologies to healthcare research will speed up the rate of discovery and product development. Currently, the average time from scientific discovery in a lab to getting a drug to a patient is around 17 years, according to the National Institutes of Health.

"We really need to accelerate," Jehi said. "What we learned with the Covid-19 pandemic is that we cannot afford, as a human race, to just drop everything and focus on one emergency at a time."

Part of the problem: It takes a long time to process and analyze the massive amount of data generated by healthcare, research, and trials, something that AI, quantum computing, and high-performance computing (a more powerful version of traditional computing) can help with. Quantum computers do that by "simulating the world," said Dario Gil, director of IBM Research.

"Instead of conducting physical experiments, you're conducting them virtually, and because you're doing them virtually through computers, it's much faster," Gil said.

For IBM, the partnership represents an important proof point for commercial applications of quantum computing. IBM currently offers access to quantum computers via the cloud to 134 institutions, including Goldman Sachs and Daimler, but building a dedicated machine on-site for one organization is a big step forward.

"What we're seeing is the emergency of quantum as a new industry within the world of information technology and computing," Gil said. "What we're seeing here in the context of Cleveland Clinic is ... a partner that says, 'I want the entire capacity of a full quantum computer to be [dedicated] to my research mission."

The partnership also includes a training element that will help educate people on how to use quantum computing for research, which is likely to further grow the ecosystem around the new technology.

Cleveland Clinic and IBM declined to detail the cost of the quantum system being installed on the clinic's campus, but representatives from both organizations called it a "significant investment." Quantum computers are complex machines to build and maintain because they must be stored at extremely cold temperatures (think: 200 times colder than outer space).

The Cleveland Clinic will start by using IBM's quantum computing cloud offering while waiting for its on-premises machine to be built, which is expected to take about a year. IBM plans to later install at the clinic a more advanced version of its quantum computer once it is developed in the coming years.

Jehi, the Cleveland Clinic research lead, acknowledged that quantum computing technology is still nascent, but said the organization wanted to get in on the ground floor.

"It naturally needs nurturing and growing so that we can figure out what are its applications in healthcare," Jehi said. "It was important to us that we design those applications and we learn them ourselves, rather than waiting for others to develop them."


Fine-tuning the color of light | Stanford News – Stanford University News

Among the first lessons any grade school science student learns is that white light is not white at all, but rather a composite of many photons, those little droplets of energy that make up light, from every color of the rainbow: red, orange, yellow, green, blue, indigo, violet.

Shanhui Fan (Image credit: Rod Searcey)

Now, researchers at Stanford University have developed an optical device that allows engineers to change and fine-tune the frequencies of each individual photon in a stream of light to virtually any mixture of colors they want. The result, published April 23 in Nature Communications, is a new photonic architecture that could transform fields ranging from digital communications and artificial intelligence to cutting-edge quantum computing.

"This powerful new tool puts a degree of control in the engineer's hands not previously possible," said Shanhui Fan, a professor of electrical engineering at Stanford and senior author of the paper.

The structure consists of a low-loss wire for light (the black line below) carrying a stream of photons that pass by like so many cars on a busy throughway. The photons then enter a series of rings (orange), like the off-ramps in a highway cloverleaf. Each ring has a modulator (EOM in green) that transforms the frequency of the passing photons, frequencies which our eyes see as color. There can be as many rings as necessary, and engineers can finely control the modulators to dial in the desired frequency transformation.

Schematic of the new photonic device showing the external waveguide (black), the light-altering rings (orange) and frequency modulators (green). The spectrum on the top left represents the ratio of frequencies of incoming light. The spectrum at the right is the result of the frequency transformation implemented by the system. (Image credit: Courtesy of the Fan Lab)

Among the applications the researchers envision are optical neural networks for artificial intelligence that perform neural computations using light instead of electrons. Existing methods that accomplish optical neural networks do not actually change the frequencies of the photons, but simply reroute photons of a single frequency. Performing such neural computations through frequency manipulation could lead to much more compact devices, say the researchers.

"Our device is a significant departure from existing methods, with a small footprint and yet offering tremendous new engineering flexibility," said Avik Dutt, a post-doctoral scholar in Fan's lab and second author of the paper.

The color of a photon is determined by the frequency at which the photon resonates, which, in turn, is set by its wavelength. A red photon has a relatively low frequency and a wavelength of about 650 nanometers. At the other end of the spectrum, blue light has a much higher frequency, with a wavelength of about 450 nanometers.
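
The relationship being described is simply f = c / λ, so the quoted wavelengths fix the frequencies. A quick check of the two endpoints:

```python
C = 299_792_458  # speed of light in m/s

for color, wavelength_nm in [("red", 650), ("blue", 450)]:
    freq_thz = C / (wavelength_nm * 1e-9) / 1e12
    print(f"{color}: {freq_thz:.0f} THz")
# red: 461 THz, blue: 666 THz -- shorter wavelength, higher frequency
```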

A simple transformation might involve shifting a photon from a frequency of 500 nanometers to, say, 510 nanometers or, as the human eye would register it, a change from cyan to green. The power of the Stanford teams architecture is that it can perform these simple transformations, but also much more sophisticated ones with fine control.

To further explain, Fan offers an example of an incoming light stream comprised of 20 percent photons in the 500-nanometer range and 80 percent at 510 nanometers. Using this new device, an engineer could fine-tune that ratio to 73 percent at 500 nanometers and 27 percent at 510 nanometers, if so desired, all while preserving the total number of photons. Or the ratio could be 37 and 63 percent, for that matter. This ability to set the ratio is what makes this device new and promising. Moreover, in the quantum world, a single photon can have multiple colors. In that circumstance, the new device actually allows changing of the ratio of different colors for a single photon.
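
Since a lossless version of such a device implements a unitary transformation that preserves total photon flux, the 20/80-to-73/27 example corresponds to rotating a two-mode amplitude vector. A sketch of that arithmetic with numpy, treating the amplitudes as real for simplicity:

```python
import numpy as np

# Amplitudes for the 500 nm and 510 nm modes; probabilities are their squares.
v_in = np.sqrt([0.20, 0.80])
v_target = np.sqrt([0.73, 0.27])

# For real unit vectors, the unitary mapping one to the other is a rotation
# by the difference of their angles.
theta = np.arctan2(v_target[1], v_target[0]) - np.arctan2(v_in[1], v_in[0])
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v_out = U @ v_in
print(np.round(v_out**2, 2))              # [0.73 0.27] -- the ratio has been retuned
print(round(float(np.sum(v_out**2)), 6))  # 1.0 -- total photon flux preserved
```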

"We say this device allows for arbitrary transformation, but that does not mean random," said Siddharth Buddhiraju, who was a graduate student in Fan's lab during the research, is first author of the paper, and now works at Facebook Reality Labs. "Instead, we mean that we can achieve any linear transformation that the engineer requires. There is a great amount of engineering control here."

"It's very versatile. The engineer can control the frequencies and proportions very accurately, and a wide variety of transformations are possible," Fan added. "It puts new power in the engineer's hands. How they will use it is up to them."

Additional authors include postdoctoral scholars Momchil Minkov, now at Flexcompute, and Ian A. D. Williamson, now at Google X.

This research was supported by the U.S. Air Force Office of Scientific Research.


Quantum Computing Market Share Current and Future Industry Trends, 2020 to 2027 – The Courier

Quantum Computing Market is a professional and detailed report focusing on primary and secondary drivers, market share, leading segments and geographical analysis. This analysis provides an examination of the market segments that are expected to observe the fastest development during the estimated forecast frame. The report encompasses market definition, currency and pricing, market segmentation, market overview, premium insights, key insights and company profiles of the key market players. The Quantum Computing market report also helps readers understand the types of consumers, their responses and views about particular products, and their expectations for product improvements.

Quantum computing is an emerging computer technology based on quantum mechanics and quantum theory. Quantum computing is performed on a quantum computer, which follows the concepts of quantum physics. Quantum computing differs from classical computing in terms of speed, bits and data: classical computing uses bits that take only the two values 0 and 1, whereas quantum computing uses qubits that can occupy superpositions of 0 and 1, which enables better results and higher speed for certain problems. Quantum computing has been used mostly in research, for comparing numerous solutions and finding an optimum solution to a complex problem, in sectors such as chemicals, utilities, defense, healthcare & pharmaceuticals and various others. It is used for applications like cryptography, machine learning, algorithms, quantum simulation and quantum parallelism, on the basis of qubit technologies like superconducting qubits, trapped ion qubits and semiconductor qubits. Since the technology is still in its growing phase, many research efforts are underway at organizations and universities, including studies on quantum computing for providing advanced and modified solutions for different applications. For instance, Mercedes-Benz has been conducting research into how quantum computing can be used to discover new battery materials for the advanced batteries used in electric cars. Mercedes-Benz has been working in collaboration with IBM on the IBM Q Network program, which allows companies to access IBM's Q Network and early-stage quantum computing systems over the cloud. The global quantum computing market is projected to register a healthy CAGR of 29.5% in the forecast period of 2019 to 2026.
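
For the distinction the paragraph is reaching for: a classical bit is either 0 or 1, while a qubit's state is a superposition α|0⟩ + β|1⟩ with |α|² + |β|² = 1, and measuring it yields 0 or 1 with those probabilities. A minimal numpy illustration, with arbitrarily chosen amplitudes:

```python
import numpy as np

# A qubit state: amplitudes for |0> and |1>, normalized so probabilities sum to 1.
alpha, beta = np.sqrt(0.3), np.sqrt(0.7)
state = np.array([alpha, beta])
probs = np.abs(state) ** 2
print(probs)  # [0.3 0.7]

# Simulated measurements collapse the state to 0 or 1 with those probabilities.
rng = np.random.default_rng(42)
samples = rng.choice([0, 1], size=10_000, p=probs)
print(samples.mean())  # ~0.7, the fraction of 1 outcomes
```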

Download Sample Copy of the Report to understand the structure of the complete report (Including Full TOC, Table & Figures) @https://www.databridgemarketresearch.com/request-a-sample/?dbmr=global-quantum-computing-market&Somesh

Quantum Computing Market Scope and Segmentation:

Global quantum computing market is segmented into seven notable segments which are system, qubits, deployment model, component, application, logic gates and vertical.

Quantum Computing Market Country Level Analysis

For detailed insights on Global Quantum Computing Market Size, competitive landscape is provided i.e. Revenue Share Analysis (Million USD) by Players, Revenue Market Share (%) by Players and further a qualitative analysis is made towards market concentration rate, product differentiation, new entrants are also considered in heat map concentration.

New Business Strategies, Challenges & Policies are mentioned in Table of Content, Request TOC at @https://www.databridgemarketresearch.com/toc/?dbmr=global-quantum-computing-market&Somesh

Leading Key Players Operating in the Quantum Computing Market Includes:

Some of the major players operating in this market are Honeywell International, Inc., Accenture, Fujitsu, Rigetti & Co, Inc., 1QB Information Technologies, Inc., IonQ, Atom Computing, ID Quantique, QuintessenceLabs, Toshiba Research Europe Ltd, Google, Inc., Microsoft Corporation, Xanadu, Magiq Technologies, Inc., QX branch, NEC Corporation, Anyon Systems, Inc., Cambridge Quantum Computing Limited, QC Ware Corp, Intel Corporation and others.

Product Launch

The Quantum Computing Market research covers a comprehensive analysis of the following facts:

Table of Content:

PART 01: EXECUTIVE SUMMARY

PART 02: SCOPE OF THE REPORT

PART 03: RESEARCH METHODOLOGY

PART 04: INTRODUCTION

PART 05: MARKET LANDSCAPE

PART 06: MARKET SIZING

PART 07: FIVE FORCES ANALYSIS

PART 08: MARKET SEGMENTATION BY PRODUCT

PART 09: MARKET SEGMENTATION BY DISTRIBUTION CHANNEL

PART 10: CUSTOMER LANDSCAPE

PART 11: MARKET SEGMENTATION BY END-USER

PART 12: REGIONAL LANDSCAPE

PART 13: DECISION FRAMEWORK

PART 14: DRIVERS AND CHALLENGES

PART 15: MARKET TRENDS

PART 16: COMPETITIVE LANDSCAPE

PART 17: COMPANY PROFILES

PART 18: APPENDIX

Inquire Before Buying This Research Report:https://www.databridgemarketresearch.com/inquire-before-buying/?dbmr=global-quantum-computing-market&Somesh

About Us:

An absolute way to forecast what the future holds is to comprehend the trend today!

Data Bridge Market Research set forth itself as an unconventional and neoteric Market research and consulting firm with an unparalleled level of resilience and integrated approaches. We are determined to unearth the best market opportunities and foster efficient information for your business to thrive in the market. Data Bridge Market Research provides appropriate solutions to complex business challenges and initiates an effortless decision-making process.

Contact:

US: +1 888 387 2818

UK: +44 208 089 1725

Hong Kong: +852 8192 7475

corporatesales@databridgemarketresearch.com


Atos unveils global R&D Lab to drive innovation in Cybersecurity, High Performance Computing and Quantum – Yahoo Finance UK

Les Clayes-sous-Bois (Yvelines), France - April 22, 2021 - Atos today officially inaugurates its new global Research & Development Lab in Les Clayes-sous-Bois, in the greater Paris metropolitan area (Yvelines), France. The new 8,000 m2 lab, which hosts around 350 of Atos' highly qualified engineers, provides a modern space dedicated to research in quantum computing, high-performance computing, edge, artificial intelligence and cybersecurity.

Supported by the Ile-de-France Region and built on Atos' existing site at Les Clayes-sous-Bois, which employs almost 1,000 people, this lab is another milestone in Atos' strategy to develop and globally position the historical site of Les Clayes-sous-Bois and the Ile-de-France Region as a strong center of technical expertise. Atos Quantum, Atos' quantum computing research program and the first major quantum industry program in Europe, benefits from an investment of €5 million from the Ile-de-France Region as part of its Smart Industry strategy, adopted in July 2017.

Innovation to support the fight against global warming

Decarbonization is a key priority for Atos. The company is committed to reducing the global carbon emissions under its control and influence by 50% by 2025 and to achieving "zero net emissions" by 2028. The research developed in this new laboratory, meeting the highest environmental standards, will focus on innovation to support the fight against global warming, such as using quantum calculation or the energy efficiency of supercomputers to accelerate society's journey to carbon neutrality. Another example is the development of a supercomputer brain that will be able to predict and optimize energy consumption based on the workload and the energy available in the electricity providers' grids.

Inauguration Ceremony

At the inauguration ceremony, Valérie Pécresse, President of the Ile-de-France Regional Council, said: "I am proud to be part of this development of the industry of the future in the Ile-de-France Region. This new building and investment show that we are preparing the future right here, right now. We are committed to making the Ile-de-France Region a territory of innovation, a digital leader at the heart of the economic fabric. This new R&D lab is in line with our plans to promote the implementation and development of strategic technologies, in particular quantum computing, in the Ile-de-France Region."


"In partnership with the Ile-de-France Region, I am thrilled to officially open our new R&D Lab today, which illustrates more than 50 years of research work carried out at our historical site of Les Clayes-sous-Bois. From this symbolic site we will drive forward our ambitious quantum computing program and develop strategic technologies, products and solutions that will be sold worldwide, and that will help shape a safe, decarbonized future," said Elie Girard, CEO, Atos.

Atos Quantum: a global program

The R&D lab will accommodate the research work conducted as part of the Atos Quantum program, launched in 2016, which aims to accelerate the development of scientific and industry-relevant quantum computing use cases. Atos researchers developed the Atos Quantum Learning Machine (Atos QLM), the world's highest-performing commercially available quantum simulator, which is already being used in numerous countries worldwide, including Finland, France, Germany, India, Japan, the UK and the United States, empowering major research programs in sectors like industry and energy. Atos also recently launched Q-score, the first universal quantum metric; applicable to all programmable quantum processors, it measures a quantum system's effectiveness at handling real-life problems, rather than simply measuring theoretical performance.

Watch the video presentation of the new Atos R&D laboratory at the following link: https://youtu.be/-TOyFZuf-LQ (in French). Elie Girard and Valérie Pécresse, President of the Île-de-France Regional Council, discuss the new lab, followed by a virtual visit of the new site with Philippe Guiguen, Mayor of Les Clayes-sous-Bois, and the entire Atos team: Sophie Proust, CTO; Pierre Barnabé, Head of Big Data and Cybersecurity; Arnaud Bertrand, Director of Strategy and Innovation, Big Data and Cybersecurity; Agnès Boudot, Director of HPC, AI & Quantum activities; and Cyril Allouche, R&D Director, Quantum Computing.

###

About Atos

Atos is a global leader in digital transformation with 105,000 employees and annual revenue of over €11 billion. European number one in cybersecurity, cloud and high performance computing, the Group provides tailored end-to-end solutions for all industries in 71 countries. A pioneer in decarbonization services and products, Atos is committed to a secure and decarbonized digital for its clients. Atos operates under the brands Atos and Atos|Syntel. Atos is a SE (Societas Europaea), listed on the CAC40 Paris stock index.

The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers and employees, and members of societies at large to live, work and develop sustainably, in a safe and secure information space. http://www.atos.net

About the Île-de-France Region

The Île-de-France region plays a driving role for employment and French growth, both in terms of its economic weight and its influence. The leading economic region in Europe and third in the world, behind Tokyo and New York, the Île-de-France is a territory of innovation, which concentrates 40% of France's R&D activities, and which benefits from international attractiveness. The Île-de-France region acts in most of the areas that concern the daily life of the 12 million Franciliens: transport, but also high schools, economic development, the environment, etc. In a space that covers 2% of the French territory but brings together 18% of its population and nearly 30% of the national GDP, the Region leads a development policy that places innovation and the environment at its heart.

Press contacts:

Atos: Lucie Duchateau lucie.duchateau@atos.net - +33(0) 7 62 85 35 10

Île-de-France Région: Éléonore Flacelière - eleonore.flaceliere@iledefrance.fr



The first 100 days: What does President Biden's approach to the world look like so far? – Brookings Institution

As we reach the end of this early chapter, these first 100 days, of the new Biden administration, much can be said about the dramatically different world view of this president when compared to President Trump and his America First mantra. In many ways, the early actions of the Biden-Harris White House resemble the forward-leaning, solution-focused optimism of the early Obama administration, but with a far greater emphasis on the connectivity between U.S. foreign affairs and American domestic politics. Indeed, President Biden's central Foreign Policy for the Middle Class argument, which places domestic economic renewal as the top priority of America's actions abroad, speaks to this viewpoint directly. It also highlights the manner in which domestic political considerations and opinions drive policy objectives for President Biden, another key theme thus far.

A Biden Doctrine?

An emerging Biden doctrine can be found in the release of its Interim National Security Strategic Guidance (iNSS), a precursor document to the official National Security Strategy (NSS), only two months after taking office. No document to date better encapsulates President Biden's world view and the likely trajectory of U.S. foreign policy. Indeed, the document seeks to foretell the administration's strategic vision, but two quotes stand out: 1) "In advancing America's interests globally, we will make smart and disciplined choices regarding our national defense and the responsible use of our military, while elevating diplomacy as our tool of first resort," and 2) "At a time of multiple, intersecting crises, we must recognize that our strength abroad requires the United States to build back better at home."

In so many ways, these comments define the still-nascent Biden doctrine. The document also speaks at length about the Biden administration's desire to reinvigorate American leadership within international institutions, to join with fellow allies and partners in strengthening our shared values all around the world, and to confront the revolution in technology that poses both peril and promise.

On this latter point about emerging technology, which includes artificial intelligence, quantum computing, biotechnology, advanced robotics, and more, the strategic need for American leadership is truly pressing. New technologies are in many ways both the key to and the vital intersection of healing our society and building back better, ensuring America's role as a geopolitical leader, and the coalescing of democratic nations around a common purpose. Especially given China's surging technology sector, the long-term strategic viability of the United States and our allies may lie in our leadership around the creation and deployment, but also the character, of these technologies. New and emerging technologies are likely to be the driving force of the 21st century at multiple levels, and the inclusion of this issue in the iNSS is essential.

Though still early, the Biden administration has moved decisively on several fronts promised during the campaign and spelled out in the interim guidance: resuming negotiations with Iran, rejoining the Paris Climate Agreement, and expressing the President's outspoken support of NATO. Associated with these moves, the Biden administration has rightly said that it desires a review of many existing U.S. policies at home and abroad. Friend and foe alike are eagerly awaiting the outcomes of these reviews, which for some seem to be taking an eternity.

Implementing policy

In his first 100 days Biden has not only begun to sketch out a coherent doctrine but he has appointed leaders who are aligned with his perspectives and have significant foreign policy experience both in Washington and abroad, and across multiple prior administrations. From Secretary of State Tony Blinken and Secretary of Defense Lloyd Austin to Director of National Intelligence Avril Haines, Central Intelligence Agency Director Bill Burns, United Nations Ambassador Linda Thomas-Greenfield, and likely USAID Administrator Samantha Power, theirs is a perspective tempered by decades of lived experience, public service in support of the American people, and exposure to prior presidential leadership and policy perspectives. And importantly, the department and agency teams these individuals have begun to bring on board have thus far showcased a similar level of expertise. This latter point is essential, as these are the individuals placed in charge of actually implementing major U.S. policy decisions.

The character of American leadership

In addition to these important early observations, one additional question is central to the Biden administration's world view, which has been voiced by a number of foreign officials in recent months: Is President Biden a return to the traditional rule for American leadership, or are he and his administration now the exception to the new rule following four years of America First foreign policy, ultra-nationalism, and an overall doctrine of transactional U.S. engagement? The definitive answer to that question will of course be found in the 2024 U.S. presidential election. Until that time, however, the Biden administration will almost certainly do everything in its power to strengthen multilateral relationships and international institutions, and to, at minimum, reassure our allies that American leadership is back on the world stage. This same logic applies to their urgency to prove President Biden's legitimacy at home, too, via a foreign policy focused on domestic American prosperity. This White House has a truly pressing need to prove wrong the naysayers at home and abroad; America's ability to operate as a credible global actor depends on it.

Only time will tell if the palpable uncertainty, trending to genuine concern, amongst U.S. allies is well-placed. This dynamic is a pall hanging over the Biden administration. Such efforts will also be central to President Biden's overall approach to the world as well as his policy calculations when engaging abroad. Here, winning hearts and minds is as important internationally as it is at home (i.e., the previously mentioned Foreign Policy for the Middle Class concept). The desire to bring together the world's democracies for a Summit for Democracy is grounded in this priority, too. Only together can democratic states strengthen their shared commitments and values and push back against the global rise of authoritarianism and radicalism.

The global race to develop new and emerging technologies is also crucial to this point. Over the mid- to long-term, no one nation will be able to compete against China alone in the creation and deployment of technology. Beyond the literal strategic implications of such a reality, it also risks the slow evolution of a world bifurcated not only by the technologies we use, but also by the values that informed their creation and ultimately their employment. From advanced surveillance networks to autonomous weapons systems, there are lines that global democracies will likely never cross, but which our adversaries will assuredly ignore.

So, what then of the days ahead?

From the continuing, though lessening, devastation of the COVID-19 pandemic; a teetering economy characterized by staggering income inequality; the looming threat of climate change; the pervasive scourge of systemic racism; and the steady rise of authoritarian leaders inimical to our values, few if any other administrations have inherited more challenges on their first day in office. President Biden has shown a desire to frame U.S. foreign policy objectives around fair competition and collaboration, especially around issues like climate change, COVID-19, or even the U.S.-China relationship. The aforementioned democracy summit, which is aligned with this latter approach, would bring together a global, values-based community of democracies united by their systems of government, the interrelationships of their economies, their commitment to free and fair trade, the aggregation of their technologies dedicated to the public good, and more.

Though none can truly know the full measure of tomorrow's challenges, we can safely assume this solution-focused, multilateral world view will persist across the trying months and years ahead.
