
Science: Misinformation and the role of social media – Digital Journal

Medical Laboratory Scientist at bench with micropipettes. Courtesy U.S. National Institutes of Health (Public Domain)

Scientific papers are subject to rigorous peer review. However, it sometimes later comes to light that the research was flawed, and the normal course of action is for the paper to be retracted. This removes the research from circulation, steering the general public, professionals, and researchers alike away from erroneous conclusions.

In practice, the process for removing flawed research may not be sufficiently quick, according to a new study from Northwestern University and the University of Michigan. The researchers argue that delays in retraction create a risk of disinformation spreading. The issue is compounded by the number of research papers published each year, which continues to grow significantly.

One reason is that papers that are later retracted are often widely circulated online, driven both by traditional media and by social media. The sharing frequently happens before the paper is retracted, by which time removal comes too late: the findings have already filtered into the public consciousness. Many people, moreover, never hear about the retraction.

At times, the papers that receive the most attention are the very ones that are later retracted. A case in point is health and diet stories, which are sometimes based on flawed findings.

This is picked up by Ágnes Horvát, an assistant professor of communication and computer science at Northwestern, who says: "Social media and even top news outlets, the most prestigious venues that cover science, are more prone to talk about papers that end up being retracted."

Such episodes, especially those involving wilder research claims, tend to have wide-ranging and long-lasting impacts. Another area where this has occurred is disinformation around vaccines, which can have real-world consequences in the form of vaccine hesitancy.

While social media is one vector for the spread of incorrect information, it may also provide a means to slow down, or at least flag, dubious claims. Here the researchers looked at Twitter and the role the microblogging site could potentially play in providing early signals of questionable research.

The researchers drew on data compiled by the websites Retraction Watch and Altmetric, using the databases to compare the online footprints of 2,830 retracted papers with those of 13,599 unretracted papers. The papers were matched on publication venue, date, number of authors, and author citation counts, with a tracking period extending at least six months both post-publication and post-retraction.

This revealed that papers later retracted tended to attract significantly more initial mentions on major social media platforms, online news sites, blogs, and knowledge repositories such as Wikipedia than papers that were never retracted. This is likely a function of the novelty of many of the findings.

The research appears in the journal Proceedings of the National Academy of Sciences, in a paper titled "Dynamics of cross-platform attention to retracted papers."


Undergrads begin summer quantum research with support from Moore Foundation, Chicago region universities, national labs – EurekAlert

Image: Open Quantum Initiative Fellowship students receive a behind-the-scenes look at IBM's quantum research lab at the Thomas J. Watson Research Center in Yorktown Heights, New York.

Credit: Kate Timmerman

More than a dozen college students from underrepresented backgrounds will be spending the summer conducting quantum information science and engineering research in labs across the Midwest thanks to the Open Quantum Initiative Undergraduate Fellowship, a new program that seeks to make the burgeoning quantum workforce a more diverse and inclusive community from the start.

The Open Quantum Initiative is a group of researchers, educators, and leaders among the Chicago Quantum Exchange that champions the values of diversity, equity, and inclusion in quantum science. Their new fellowship recently garnered almost half a million dollars of support from the Gordon and Betty Moore Foundation, affirming the importance of increasing the diversity of scientists and engineers in quantum information science and engineering.

The new fellowship program was founded in large part by graduate students and early-career researchers and seeks to make the expanding quantum workforce a more diverse and inclusive community by helping undergraduate students from a broad variety of backgrounds gain hands-on experience. Almost 70% of this year's fellowship students are Hispanic, Latino, or Black, and half are the first in their family to go to college. In addition, while the field of quantum science and engineering is generally majority-male, the 2022 cohort is half female.

"The unique thing about quantum information science is that the field is just starting to take off," said Katherine Harmon, a Maria Goeppert Mayer Fellow at the U.S. Department of Energy's Argonne National Laboratory and one of the early-career researchers who helped conceptualize and launch the initiative. "We have an opportunity, and indeed an obligation, to ensure that the field is open to everyone from the start."

The inaugural cohort of Open Quantum Initiative Fellows includes undergraduates from across the country, from as close as Chicago State University to as far as the University of Texas Rio Grande Valley. This week, the students received a behind-the-scenes look at IBM's quantum research lab at the Thomas J. Watson Research Center in Yorktown Heights, New York.

As part of the fellowship, students will spend the next 10 weeks at partnering institutions, including the University of Chicago, the U.S. Department of Energy's Argonne National Laboratory, Fermi National Accelerator Laboratory, the University of Illinois Urbana-Champaign, the University of Wisconsin-Madison, and The Ohio State University, receiving one-on-one mentorship as they participate in quantum research.

Student research projects will span quantum networking, software and computing, and quantum sensing, which could lead to new kinds of unbreakable encryption and secure communication over vast distances, computers that can solve previously unsolvable problems, and sensors that can detect the tiniest change in the environment.

"I have been interested in quantum computing development for some time now, but it's difficult to find formal opportunities for learning and exploring quantum. When I learned about OQI, it seemed like the perfect opportunity for such exploration," said Ariadna Fernandez, a computer science major at the University of Illinois at Chicago and a member of the inaugural fellowship cohort. "It's important that everyone has access to the field to ensure that leadership includes the voices of communities that are often left behind and significantly impacted by new technology. I think the mission of OQI is an important way to make this happen in quantum."

The fellowship program was launched with support from the Chicago Quantum Exchange member and partner institutions where the students will work, along with Q-NEXT, a Department of Energy National Quantum Information Science Research Center and the National Science Foundation Quantum Leap Challenge Institute for Hybrid Quantum Architectures and Networks.

The recent Moore Foundation award will enable more than 30 additional fellows to join the program over the next four years. "We are looking forward to seeing the program grow and provide more opportunities for students such as Ariadna to explore the potential of quantum science and to push the frontiers of this technology," said Gary Greenburg, program officer in the science program of the Gordon and Betty Moore Foundation.

"We have an extraordinary opportunity to develop a diverse and inclusive workforce for this emerging discipline in science and engineering, and this fellowship is a large step towards that goal," said David Awschalom, director of the Chicago Quantum Exchange and the Liew Family Professor in Molecular Engineering and Physics at the University of Chicago. "Our partnership with the Moore Foundation will help us create a program that can be a model for similar efforts across the country. We look forward to nurturing this next generation of quantum talent."

The fellowship also includes networking activities to help the fellows establish connections with mentors and peers in academia and industry. Fellowship cohorts will stay connected even after their summer is over: they will be able to attend online seminars designed to expand their professional network, teach science communication skills, and provide career preparation strategies, and past fellows will have opportunities to mentor future fellows. The Open Quantum Initiative also aims to provide future research experiences in subsequent summers.

About the Chicago Quantum Exchange:

The Chicago Quantum Exchange (CQE) is an intellectual hub for advancing the science and engineering of quantum information between the CQE community, across the Midwest, and around the globe. A catalyst for research activity across its member and partner institutions, the CQE is based at the University of Chicago's Pritzker School of Molecular Engineering and is anchored by the U.S. Department of Energy's Argonne National Laboratory and Fermi National Accelerator Laboratory, the University of Illinois Urbana-Champaign, the University of Wisconsin-Madison, and Northwestern University. The CQE includes more than 35 corporate partners and is a member of the IBM Quantum Network.

About the Gordon and Betty Moore Foundation:

The Gordon and Betty Moore Foundation fosters path-breaking scientific discovery, environmental conservation, patient care improvements and preservation of the special character of the Bay Area. Visit Moore.org and follow @MooreFound.



Quantum computing: D-Wave shows off prototype of its next quantum annealing computer – ZDNet

Image: Wacomka/Shutterstock

Quantum-computing outfit D-Wave has announced commercial access to an "experimental prototype" of its Advantage2 quantum annealing computer.

D-Wave is beating its own path to qubit processors with its quantum annealing approach. According to D-Wave, the Advantage2 prototype available today features over 500 qubits. It's a preview of a much larger Advantage2 system, with 7,000 qubits, that the company hopes to make available by 2024.

Access to the Advantage2 prototype is restricted to customers with a subscription to D-Wave's Leap cloud service, but developers interested in trying D-Wave's quantum cloud can sign up for "one minute of free use of the actual quantum processing units (QPUs) and quantum hybrid solvers" that run on its earlier Advantage QPU.

The Advantage2 prototype is built with D-Wave's Zephyr connection topology, which the company claims offers higher connectivity between qubits than its predecessor, Pegasus, used in the Advantage QPU.

D-Wave says the Zephyr design enables shorter chains in its Advantage2 quantum chips, which can make them friendlier for calculations that require extra precision.


"The Advantage2 prototype is designed to share what we're learning and gain feedback from the community as we continue to build towards the full Advantage2 system," says Emile Hoskinson, director of quantum annealing products at D-Wave.

"With Advantage2, we're pushing that envelope again, demonstrating that connectivity and reduction in noise can be a delivery vehicle for even greater performance once the full system is available. The Advantage2 prototype is an opportunity for us to share our excitement and give a sneak peek into the future for customers bringing quantum into their applications."

While quantum computing is still experimental, senior executives are preparing for it to become a business disruptor by 2030, according to a survey by consultancy EY. The firm found that 81% of senior UK executives expect quantum computing to play a significant role in their industry by 2030.

Fellow consultancy McKinsey this month noted that funding for quantum technology startups doubled in the past two years, from $700 million in 2020 to $1.4 billion in 2021. McKinsey sees quantum computing shaking up the pharmaceuticals, chemicals, automotive, and finance industries, enabling players to "capture nearly $700 billion in value as early as 2035" through improved simulation and better machine learning. It expects revenues from quantum computing to exceed $90 billion by 2040.

D-Wave's investors include PSP Investments, Goldman Sachs, BDC Capital, NEC Corp, Aegis Group Partners, and the CIA's VC firm, In-Q-Tel.


McKinsey thinks quantum computing could create $80b in revenue … eventually – The Register

In the hype-tastic world of quantum computing, consulting giant McKinsey & Company claims that the still-nascent field has the potential to create $80 billion in new revenue for businesses across industries.

It's a claim McKinsey has repeated nearly two dozen times on Twitter since March to promote its growing collection of research diving into various aspects of quantum computing, from startup and government funding to use cases and its potential impact on a range of industries.

The consulting giant believes this $80 billion figure represents the "value at stake" for quantum computing players but not the actual value that use cases could create [PDF]. This includes companies working in all aspects of quantum computing, from component makers to service providers.

Despite wildly optimistic numbers, McKinsey does ground the report in a few practical realities. For instance, in a Wednesday report, the firm says the hardware for quantum systems "remains too immature to enable a significant number of use cases," which, in turn, limits the "opportunities for fledgling software players." The authors add that this is likely one of the reasons why the rate of new quantum startups entering the market has begun to slow.

Even the top of McKinsey's page for quantum computing admits that capable systems won't be ready until 2030, which is in line with what various industry players, including Intel, are expecting. Like fusion, it's always a decade or so away.

McKinsey, like every company trying to work out whether quantum computing has any real-world value, is walking a fine line: exploring the possibilities of quantum computing while showing the ways the tech is still disconnected from ordinary enterprise reality.

"While quantum computing promises to help businesses solve problems that are beyond the reach and speed of conventional high-performance computers, use cases are largely experimental and hypothetical at this early stage. Indeed, experts are still debating the most foundational topics for the field," McKinsey wrote in a December 2021 article about how use cases "are getting real."

One could argue the report is something of a metaphor for the quantum industry in 2022: wild optimism about future ecosystem profitability without really understanding what the tech will mean, to whom, and at what scale.


Chicago Quantum Exchange takes first steps toward a future that could revolutionize computing, medicine and cybersecurity – Chicago Tribune

Flashes of what may become a transformative new technology are coursing through a network of optic fibers under Chicago.

Researchers have created one of the world's largest networks for sharing quantum information, a field of science that depends on paradoxes so strange that Albert Einstein didn't believe them.

The network, which connects the University of Chicago with Argonne National Laboratory in Lemont, is a rudimentary version of what scientists hope will someday become the internet of the future. For now, it's been opened up to businesses and researchers to test the fundamentals of quantum information sharing.

The network was announced this week by the Chicago Quantum Exchange, which also involves Fermi National Accelerator Laboratory, Northwestern University, the University of Illinois and the University of Wisconsin.

People work in the Pritzker Nanofabrication Facility, June 15, 2022, inside the William Eckhardt Research Center at the University of Chicago. The Chicago Quantum Exchange is expanding its quantum network to make it available to more researchers and companies. Quantum computing is a pioneering, secure format said to be hacker-proof and of possible use by banks, the health care industry, and others for secure communications. (Erin Hooley / Chicago Tribune)

With a $500 million federal investment in recent years and $200 million from the state, Chicago, Urbana-Champaign, and Madison form a leading region for quantum information research.

Why does this matter to the average person? Because quantum information has the potential to help crack currently unsolvable problems, both threaten and protect private information, and lead to breakthroughs in agriculture, medicine and climate change.

While classical computing uses bits of information containing either a 1 or a 0, quantum bits, or qubits, are like a coin flipped in the air: they contain both a 1 and a 0, with the value determined only once it is observed.

That quality of being in two or more states at once, called superposition, is one of the many paradoxes of quantum mechanics, the rules for how particles behave at the atomic and subatomic level. It's also a potentially crucial advantage, because it can handle exponentially more complex problems.
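The coin-flip analogy can be made concrete with a toy simulation. This is an illustrative sketch, not real quantum hardware: it simply samples measurement outcomes of a single qubit according to the Born rule, with amplitudes chosen for an equal superposition.

```python
import random

# A qubit state |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# Measurement yields 0 with probability |a|^2 and 1 with probability |b|^2,
# collapsing the superposition to a definite value.
def measure(a: complex, b: complex) -> int:
    return 0 if random.random() < abs(a) ** 2 else 1

# Equal superposition: like the coin in the air, both values at once
# until observed.
a = b = 2 ** -0.5

random.seed(7)
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(a, b)] += 1

print(counts)  # roughly 5,000 of each outcome
```

Repeated measurement recovers the probabilities, but any single observation gives only a definite 0 or 1, which is exactly the coin landing.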

Another key aspect is the property of entanglement, in which qubits separated by great distances remain correlated, so that a measurement in one place reveals the result of a measurement far away.
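That correlation can be sketched with classical code that mimics the measurement statistics of a maximally entangled Bell pair (a toy model of the statistics only, not a simulation of the underlying quantum mechanics; the function name is ours):

```python
import random

# For the Bell state (|00> + |11>)/sqrt(2), the two qubits always agree:
# measuring either one instantly tells you the other's value.
def measure_bell_pair() -> tuple:
    outcome = random.randint(0, 1)  # each value occurs with probability 1/2
    return outcome, outcome         # the pair collapses to the same value

random.seed(0)
pairs = [measure_bell_pair() for _ in range(1_000)]
print(all(a == b for a, b in pairs))  # True: perfectly correlated
```

Each individual outcome is random, yet the two halves of the pair never disagree, which is the property the Tribune piece is describing.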

The newly expanded Chicago network, created in collaboration with Toshiba, distributes particles of light, called photons. Trying to intercept the photons destroys them and the information they contain, making the network far more difficult to hack.
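Why interception is detectable can be illustrated with a simplified intercept-and-resend model in the style of the BB84 protocol (a sketch with made-up parameters, not the Toshiba network's actual protocol): an eavesdropper who guesses the wrong measurement basis destroys the encoded bit, leaving a measurable error rate on the channel.

```python
import random

random.seed(42)
n = 2000

# Alice encodes random bits on photons using random bases
# (0 = rectilinear, 1 = diagonal).
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]

def transmit(bit, basis, eavesdrop):
    """Send one photon; optionally an eavesdropper intercepts and resends it."""
    if eavesdrop:
        eve_basis = random.randint(0, 1)
        if eve_basis != basis:
            # Measuring in the wrong basis destroys the encoded bit:
            # the resent photon carries a random value instead.
            bit = random.randint(0, 1)
    return bit

def error_rate(eavesdrop):
    """Error rate on the bits where Bob happened to pick Alice's basis."""
    errors = matches = 0
    for bit, basis in zip(alice_bits, alice_bases):
        received = transmit(bit, basis, eavesdrop)
        if random.randint(0, 1) != basis:
            continue  # Bob guessed the wrong basis; this bit is discarded
        matches += 1
        errors += received != bit
    return errors / matches

print(error_rate(eavesdrop=False))  # 0.0: an undisturbed channel
print(error_rate(eavesdrop=True))   # ~0.25: interception leaves a trace
```

By publicly comparing a sample of their bits, the two ends can spot that elevated error rate and know someone was listening.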

The new network "allows researchers to push the boundaries of what is currently possible," said University of Chicago professor David Awschalom, director of the Chicago Quantum Exchange.

Fourth-year graduate student Cyrus Zeledon, left, and postdoctoral student Leah Weiss, right, show senior undergraduate Tiarna Wise around one of the quantum science laboratories, June 15, 2022, inside the William Eckhardt Research Center at the University of Chicago. (Erin Hooley / Chicago Tribune)

However, researchers must solve many practical problems before large-scale quantum computing and networking are possible.

For instance, researchers at Argonne are working on creating a foundry where dependable qubits could be forged. One example is a diamond membrane with tiny pockets to hold and process qubits of information. Researchers at Argonne also have created a qubit by freezing neon to hold a single electron.

Because quantum phenomena are extremely sensitive to any disturbance, they might also be used as tiny sensors for medical or other applications, but they'd also have to be made more durable.

The quantum network was launched at Argonne in 2020, but has now expanded to Hyde Park and opened for use by businesses and researchers to test new communication devices, security protocols and algorithms. Any venture that depends on secure information, such as a bank's financial records or a hospital's medical records, would potentially use such a system.

Quantum computers, while in development now, may someday be able to perform far more complex calculations than current computers, such as folding proteins, which could be useful in developing drugs to treat diseases such as Alzheimers.

In addition to driving research, the quantum field is stimulating economic development in the region. A hardware company, EeroQ, announced in January that it's moving its headquarters to Chicago. Another local software company, Super.tech, was recently acquired, and several others are starting up in the region.

Because quantum computing could be used to hack into traditional encryption, it has also attracted the bipartisan attention of federal lawmakers. The National Quantum Initiative Act was signed into law by President Donald Trump in 2018 to accelerate quantum development for national security purposes.

In May, President Joe Biden directed federal agencies to migrate to quantum-resistant cryptography on their most critical defense and intelligence systems.

Ironically, basic mathematical problems, such as 5+5=10, are somewhat difficult for quantum computing. Quantum information is likely to be used for high-end applications, while classical computing will likely remain practical for many daily uses.

Renowned physicist Einstein famously scoffed at the paradoxes and uncertainties of quantum mechanics, saying that "God does not play dice with the universe." But quantum theories have been proven correct in applications from nuclear energy to MRIs.

Stephen Gray, senior scientist at Argonne, who works on algorithms to run on quantum computers, said quantum work is very difficult, and that no one understands it fully.

But there have been significant developments in the field over the past 30 years, leading to what some scientists jokingly call "Quantum 2.0," with practical advances expected over the next decade.

"We're betting in the next five to 10 years there'll be a true quantum advantage (over classical computing)," Gray said. "We're not there yet. Some naysayers shake their canes and say it's never going to happen. But we're positive."

Just as early work on conventional computers eventually led to cellphones, it's hard to predict where quantum research will lead, said Brian DeMarco, professor of physics at the University of Illinois at Urbana-Champaign, who works with the Chicago Quantum Exchange.

"That's why it's an exciting time," he said. "The most important applications are yet to be discovered."

rmccoppin@chicagotribune.com


Businesses brace for quantum computing disruption by end of decade – The Register

While business leaders expect quantum computing to play a significant role in industry by 2030, some experts don't believe the tech is going to be ready for production deployment in the near future.

The findings, from a survey titled "2022 Quantum Readiness" commissioned by consultancy EY, refer to UK businesses, although it is likely that the conclusions are equally applicable to global organizations.

According to EY, 81 percent of senior UK executives expect quantum computing to have a significant impact in their industry within seven and a half years, with almost half (48 percent) believing that quantum technology will begin to transform industries as soon as 2025.

As for the naysayers who say quantum tech won't be ready for live deployment any time soon, they have a point: the industry suffers from a hype problem, with capabilities being exaggerated and even accusations of falsification flying around, as in the case of quantum startup IonQ, which was recently accused by Scorpion Capital of misleading investors about the effectiveness of its quantum hardware.

Joseph Reger, Fujitsu Fellow, CTO of Central and Eastern Europe and Member of Quantum Computing Council of World Economic Forum, told The Register he is getting some "heat" for saying quantum is not nearly a thing yet.

"There are impressive advantages that pre-quantum or quantum-inspired technologies provide. They are less sexy, but very powerful."

He added: "Some companies are exaggerating the time scales. If quantum computing gets overhyped, we are likely to face the first quantum winter."

Fujitsu is itself developing quantum systems, and announced earlier this year that it was working to integrate quantum computing with traditional HPC technology. The company also unveiled a high performance quantum simulator based on its PRIMEHPC FX 700 systems that it said will serve as an important bridge towards the development of quantum computing applications in future.

Meanwhile, EY claims that respondents were "almost unanimous" in their belief that quantum computing will create a moderate or high level of disruption for their own organization, industry sector, and the broader economy in the next five years.

Despite this, the survey finds that strategic planning for quantum computing is still at an embryonic stage for most organizations: only 33 percent are involved in strategic planning for how quantum will affect them, and only a quarter have appointed specialist leaders or set up pilot teams.

The survey conducted in February-March 2022 covered 501 UK-based executives, all with senior roles in their organisations, who had to demonstrate at least a moderate (but preferably a high) level of understanding of quantum computing. EY said they originally approached 1,516 executives, but only 501 met this requirement, which in and of itself tells a tale.

EY's Quantum Computing Leader, Piers Clinton-Tarestad, said the survey reveals a disconnect between the pace at which some industry leaders expect quantum to start affecting business and their preparedness for those impacts.

"Maximizing the potential of quantum technologies will require early planning to build responsive and adaptable organisational capabilities," he said, adding that this is a challenge because the progress of quantum has accelerated, but it is "not following a steady trajectory."

For example, companies with quantum processors have increased the power of their hardware dramatically over the past several years, from just a handful of qubits to over a hundred in the case of IBM, which expects to deliver a 4,158-qubit system by 2025. Yet despite these advances, quantum computers remain a curiosity, with most operational systems deployed in research laboratories or made available via a cloud service for developers to experiment with.

Clinton-Tarestad said "quantum readiness" is "not so much a gap to be assessed as a road to be walked," with the next steps in the process being regularly revisited as the landscape evolves. He warned that businesses expecting disruption in their industry within the next three to five years need to act now.

According to EY's report, executives in consumer and retail markets are those most likely to believe that quantum will play a significant role by 2025, with just over half of technology, media and telecommunications (TMT) executives expecting an impact within the same time frame. Most respondents among health and life sciences companies think this is more likely to happen later, between 2026 and 2035.

Most organizations surveyed expect to start their quantum preparations within the next two years, with 72 percent aiming to start by 2024.

However, only a quarter of organizations have got as far as recruiting people with the necessary skills to lead quantum computing efforts, although 68 percent said they are aiming to set up pilot teams to explore the potential of quantum for their business by 2024.

Fear of falling behind because rival companies are working to develop their own quantum capabilities is driving some respondents to start quantum projects, while the applications of quantum computing anticipated by industry leaders would advance operations involving AI and machine learning, especially among financial services, automotive and manufacturing companies. TMT respondents cited potential applications in cryptography and encryption as being the most likely use of quantum computing.

While the EY report warns about companies potentially losing out to rivals on the benefits of quantum computing, there are also dangers that organizations should be preparing for now, as Intel warned about during its Intel Vision conference last month.

One of these is that quantum computers could be used to break current cryptographic algorithms, meaning that the confidentiality of both personal and enterprise data could be at risk. This is not a far-off threat, but something that organizations need to consider right now, according to Sridhar Iyengar, VP of Intel Labs and Director of Security and Privacy Research.

"Adversaries could be harvesting encrypted data right now, so that they can decrypt it later when quantum computers are available. This could be sensitive data, such as your social security number or health records, which are required to be protected for a long period of time," Iyengar told us.

Organizations may want to address threats like this by taking steps such as evaluating post-quantum cryptography algorithms and increasing the key sizes for current crypto algorithms like AES.
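The reasoning behind the key-size advice is simple arithmetic: Grover's algorithm gives a quantum attacker a quadratic speed-up on brute-force key search, which effectively halves a symmetric key's security level. A quick sketch (the function name is ours, for illustration):

```python
# Grover's quadratic speed-up means searching a 2^k keyspace takes on the
# order of 2^(k/2) quantum operations, so the effective security level of
# a symmetric key against such an attacker is roughly halved.
def post_quantum_security_bits(key_bits: int) -> int:
    return key_bits // 2

print(post_quantum_security_bits(128))  # 64: generally considered too weak
print(post_quantum_security_bits(256))  # 128: comparable to AES-128 today
```

This is why moving from AES-128 to AES-256 is the commonly cited mitigation, while public-key algorithms such as RSA, which Shor's algorithm breaks outright rather than merely weakening, need to be replaced with post-quantum schemes.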

Or they may simply decide to adopt a wait-and-see attitude. EY will no doubt be on hand to sell consultancy services to help clarify their thinking.


Quantum computing can solve EVs' safety woes – Times of India

Recent incidents of electric vehicles (EVs) catching fire have shocked the Indian ecosystem and hindered the broad adoption of these vehicles. Before March of this year, there had been a substantial rise in demand for electric vehicles alongside rapid advances in innovation and technology. Improvements in battery technology, through increased efficiency and range, have made EVs more accessible to the mass public, with the sector currently dominated by two-wheelers and three-wheelers in India. According to Mordor Intelligence, India's electric vehicle market was valued at $1.4 trillion in 2021, and it is expected to reach $15.4 trillion by 2027, recording a CAGR of 47.09% over the forecast period (2022-2027). Since March, the challenge for EVs has shifted from affordability, charging, and range anxiety to safety. Safety is of prime importance, and an EV catching fire has dire, even fatal, consequences.

The question is, why is this happening?

A report by the Defence Research and Development Organisation's (DRDO) Centre for Fire Explosive and Environment Safety points to the EV batteries. The issues highlighted include poor-quality cells, lack of fuses, issues with thermal management, and problems with the battery management system (BMS).

These issues can cause the batteries to experience thermal runaway, leading to the fires. The phenomenon occurs when an increase in temperature changes conditions in a way that causes a further increase in temperature, often with destructive results. The issues highlighted in the DRDO report are all potential causes of thermal runaway. Let's explain why.
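That feedback loop can be sketched numerically. The toy model below (our own illustrative constants, not a validated battery model) pits Arrhenius-style self-heating against proportional cooling: below a critical temperature the cooling wins and the cell settles near ambient; above it, heat generation outruns cooling and the temperature diverges.

```python
import math

def simulate(t_start, t_ambient=30.0, cooling=0.5, steps=500, dt=0.1):
    """Euler integration of a toy cell: self-heating vs. BTMS cooling."""
    t = t_start
    for _ in range(steps):
        heat_gen = 0.05 * math.exp(0.05 * t)   # Arrhenius-like self-heating
        heat_loss = cooling * (t - t_ambient)  # cooling toward ambient
        t += (heat_gen - heat_loss) * dt
        if t > 200:
            return "runaway"
    return f"stable near {t:.0f} C"

print(simulate(t_start=35))   # cooling wins: settles near ambient
print(simulate(t_start=160))  # self-heating wins: runaway
```

The tipping point depends on both the cooling capacity and the self-heating rate, which is why a weak BTMS, hot ambient conditions, and poor-quality cells all push the system toward the runaway side of the curve.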

Local atmospheric temperature directly affects the operating temperature of the battery. For efficient performance, the battery's operating temperature should be around 20-35°C. To keep the battery in this range, EVs need a battery thermal management system (BTMS). With rising temperatures in our cities, BTMS designs are being challenged, and poor thermal management of EV batteries may be triggering thermal runaway.

Another possible cause of thermal runaway is rapid battery charging. As battery technology evolves, charging technology is advancing with it. While fast charging can greatly improve the convenience of EVs, it increases battery-related risks. Fast charging an EV can overheat the battery system, enough to melt the electrical wires and cause short circuits, leading to explosive consequences, as seen in several charging-related incidents.

While hot weather and an inadequate battery thermal management system can negatively impact performance and shorten battery life, they alone cannot cause thermal runaway. As the DRDO report notes, an inefficient fuse, or the absence of one altogether, removes the fail-safe mechanism that would otherwise interrupt thermal runaway.

The causes of thermal runaway highlighted above come down to either inefficient design or insufficient testing by EV manufacturers. Time-to-market constraints, however, leave manufacturers little room for extended testing.

What's the solution?

As stated, design and testing are critical phases of any product's manufacturing. Since the era of Industry 4.0, design and testing have moved to digital workflows carried out on large-scale, powerful computers through what are called engineering simulations (referred to as simulations hereafter). Simulations come in various types, including thermal (studying the effect of heat and temperature on an object), structural (studying an object's strength, stress, and failure), fluid (studying flow in and around an object), and electrochemical (studying the effect of chemistry on electricity). Thermal runaway is a complex engineering problem entailing all of the types mentioned above. With the right tools, simulations make it possible to mimic every relevant physical condition, whether rising temperature, fast charging, or fuse placement, and to find problem areas. Once a problem is identified, simulation can also be used to test candidate fixes and so avoid thermal runaway altogether.
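As a taste of what the thermal variety involves, here is a minimal explicit finite-difference sketch of 1D heat diffusion, the simplest building block of a thermal simulation. Real battery-pack simulations are 3D, multiphysics, and vastly larger; this is purely an illustration of the numerical idea:

```python
def heat_1d(n=51, steps=500, alpha=0.2):
    """Explicit finite-difference solution of 1D heat diffusion:
    each interior cell relaxes toward the average of its neighbours.
    Stable only for alpha <= 0.5 (the classic CFL-style condition)."""
    u = [0.0] * n
    u[n // 2] = 100.0                       # a hot spot mid-pack
    for _ in range(steps):
        new = u[:]
        for i in range(1, n - 1):           # boundaries held at ambient (0)
            new[i] = u[i] + alpha * (u[i - 1] - 2.0 * u[i] + u[i + 1])
        u = new
    return u

profile = heat_1d()                         # peak stays centred but flattens out
```

Scale the same idea up to millions of cells, couple it with electrochemistry and structural models, and you have the workload that drives the compute demands discussed next.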

The question then becomes: why are we seeing this news at all?

The biggest issue EV manufacturers have with performing numerous simulations is the time they take. Running a series of high-accuracy simulations, with minimal flaws and defects in the results, can take months. Manufacturers cannot afford this, as it greatly hampers time to market. Thus, companies opt for low-accuracy simulations that provide solutions but carry many minor flaws and defects, leading to large mishaps such as EV explosions and system failures, and costing human lives. And even when companies do find the time to perform high-accuracy simulations, the cost they incur is very high, due to the need for supercomputers, whether on-premises (setup and maintenance costs) or on the cloud (long compute durations).

So the real issue is a computing-technology bottleneck. This is where the next generation of computing technology, quantum computers, can step in and revolutionize industries like EV and battery design, promising a dramatic leap in capability.

Prospect of Quantum-powered simulations

The power of quantum computers is showcased by their ability to perform the same simulations in much less time than classical supercomputers. This technology can therefore significantly shorten EV manufacturers' time to market.

Moreover, obtaining high accuracy from simulations is vital if they are to be used in the product development process. Where high-accuracy simulations previously took so long as to be prohibitive, quantum-powered simulations could enable manufacturers to perform accurate simulations in reasonable time, hours instead of months. The added accuracy will not only help companies create more efficient designs and improve the reliability of their vehicles, but also help save something invaluable: lives. In addition, the speedup from quantum computation reduces computing usage, decreasing overall cost and making the approach affordable for EV manufacturers.

What's next?

In the computing sphere, quantum computing is a revolutionary development, changing our understanding of computation and showing tremendous potential across various use cases. While the prospect of quantum-powered simulations offers the promise of better, faster, and cheaper, development is very challenging because quantum computers work in entirely different ways.

The good news is that companies are already developing and building quantum-powered simulation software that can address thermal runaway and the optimization of the BTMS. Quantum computing is here and now!

Views expressed above are the author's own.

END OF ARTICLE

Go here to read the rest:
Quantum computing can solve EVs safety woes - Times of India

Read More..

JUPITER, the European Union's All-AMD post-exascale supercomputer, promises to access around a quintillion operations per second – Wccftech

The EuroHPC Joint Undertaking, a European Union initiative, recently enabled the deployment of LUMI, the continent's first pre-exascale system to integrate next-gen technology from AMD. This initiative and the involvement with AMD will allow a quantum-ready system to be designed with carbon-negative manufacturing, moving the European Union closer to post-exascale computing.

Jülich's Supercomputing Centre in Germany will be the installation home for JUPITER, with over $522 million to be spent on installation, hardware, and infrastructure, funded by the European Union. The system is expected to be complete and fully operational in 2024, when it will begin to process around a quintillion operations per second.

JUPITER will allow researchers to study climate modeling, materials engineering, sustainable energy production, and biological simulations using the latest in accelerated artificial intelligence. These incredibly taxing workloads, in computation and memory alike, are the reason the EU is investing such a large sum in the supercomputer.

The organizations involved have not officially stated which hardware will form the backbone of the JUPITER supercomputer. The press release for JUPITER indicates that GPU-based accelerators will be central to the system's processing power. Like the LUMI system, JUPITER will house several supercomputing modules, combining a universal CPU-based cluster with high-powered GPU clusters for visualization and acceleration. The system will also include a quantum computing node and storage clusters with thermal management to keep heat levels in check. JUPITER is also reported to feature rarer computational models, including a fully dedicated neuromorphic computing node.

JUPITER will have a power consumption of 15 MW, about 21% less than the current top global supercomputer, Frontier, which averages 19 MW. Its consumption is also roughly half that of Japan's Arm-based Fugaku, the previous top supercomputer from 2020.

AMD will have the upper hand in rivaling Intel with this new endeavor: Intel's hardware is used in only five of ten upcoming supercomputing installations, while AMD's is used in ten of twenty. However, Intel recently announced the Silicon Junction initiative, which will see $80 billion invested in the European Union's research, development, and manufacturing of next-gen semiconductors.

See original here:
JUPITER, the European Unions All-AMD post-exascale supercomputer, promises to access around a quintillion operations per second - Wccftech

Read More..

CSIRO’s offer to SMEs working in cyber security – CSIRO

14 June 2022 News Release

Small and medium sized enterprises (SMEs) working on new cyber security solutions can join the free, 10-week online Innovate to Grow program, offered by CSIRO, to support their commercial idea with research and development expertise.

Upon completion of the program, participants will be able to access facilitation support through CSIRO to connect with research expertise nationally, along with dollar-matched R&D funding.

CSIRO's SME Collaboration Lead Dr George Feast said the COVID-19 pandemic had led to an increased risk of cyber security attacks.

"Just like many other parts of the world, Australia's dependence on the internet saw a big increase during the pandemic, with many services moving online and more people working from home than ever before," Dr Feast said.

According to the Australian Cyber Security Centre, cybercrime reports rose 13 per cent in the 2020-21 financial year.

"To stay ahead of these cyber attacks, new solutions are required, and much of this is driven by SMEs developing new products and services through R&D," Dr Feast said.

"SMEs make up 99.8 per cent of all businesses in Australia. However, R&D can be an expensive undertaking for businesses and risky for those without the right guidance and support."

The program extends beyond cyber security companies into a range of other industries that offer online solutions to their customers - such as agriculture and health - and want to improve the cyber security aspect of their offering.

"Participants will be given help to refine a new idea they want to explore and to better understand their idea's business and scientific viability. They will also be exposed to industry knowledge, hear from innovation and industry experts, and work with an R&D mentor. Companies will also tap into CSIRO's own cyber security expertise through Data61, CSIRO's data and digital specialist arm," Dr Feast said.

"Even though collaboration is key to driving good R&D outcomes, research we released last year found that less than 15 per cent of Australian businesses engage universities or research institutions for their innovation activities. Our goal through this program is to lift that percentage."

CSIRO's Innovate to Grow: Cyber Security program commences 26 July and is open to 20-25 small and medium enterprises (SMEs). Applications close 11 July: Innovate to Grow: Cyber Security.

Read the original:
CSIRO's offer to SMEs working in cyber security - CSIRO

Read More..

This Week In Security: Pacman, Hertzbleed, And The Death Of Internet Explorer – Hackaday

There's not one, but two side-channel attacks to talk about this week. Up first is Pacman, a bypass for ARM's Pointer Authentication Code (PAC). PAC is a protection built into certain ARM processors, where a cryptographic hash value must be set correctly when pointers are updated. If the hash is not set correctly, the program simply crashes. The idea is that most exploits use pointer manipulation to achieve code execution, and correctly setting the PAC requires an explicit instruction call. The PAC is actually stored in the unused bits of the pointer itself. The AArch64 architecture uses 64-bit values for addressing, but the address space is much smaller than 64 bits, usually 53 bits or less. This leaves 11 bits for the PAC value. Keep in mind that the application doesn't hold the keys and doesn't calculate this value. 11 bits may not seem like enough to make this secure, but remember that every failed attempt crashes the program, and every application restart regenerates the keys.
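Those two properties, crash on failure and fresh keys on restart, are what make brute force painful. A hypothetical back-of-the-envelope model (not real PAC mechanics) shows why: with no information carrying over between restarts, an attacker needs on average about 2^11 = 2048 attempts, every failed one a very visible crash.

```python
import random

def brute_force_pac(pac_bits=11, max_attempts=100_000, seed=0):
    """Toy model of PAC brute forcing: every wrong guess crashes the
    process, and the key is regenerated on restart, so nothing carries
    over. Each attempt succeeds independently with probability
    1 / 2**pac_bits. Returns the number of attempts needed."""
    rng = random.Random(seed)
    space = 2 ** pac_bits
    for attempt in range(1, max_attempts + 1):
        correct = rng.randrange(space)   # fresh key after every crash/restart
        guess = rng.randrange(space)     # attacker's guess for this run
        if guess == correct:
            return attempt
    return None
```

Averaged over many simulated targets, the attempt count hovers around 2048; the point is less the number than that each attempt is a crash a defender can notice.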

What Pacman introduces is an oracle, a method to gain insight into data the attacker shouldn't be able to see. In this case, the oracle works via speculation attacks, very similar to Meltdown and Spectre. The key is to attempt a protected pointer dereference speculatively, and then observe the change in system state as a result. What you may notice is that this requires an attacker to already be running code on the target system in order to use the PAC oracle technique. Pacman is not a Remote Code Execution flaw, nor is it useful in gaining RCE.

One more important note is that an application has to have PAC support compiled in, in order to benefit from this protection. The platform that has made wide use of PAC is macOS, as it's a feature baked into the M1 processor. The attack chain would likely start with a remote execution bug in an application missing PAC support. Once a foothold is established in unprivileged userspace, Pacman would be used as part of an exploit against the kernel. See the PDF paper for all the details.

The other side-channel technique is a new take on an old idea. Hertzbleed is based on the observation that it's possible to detect the difference between a CPU running at base frequency and that CPU running at a boost frequency. The difference between those two states can actually leak information about what the CPU is doing. There's a pre-release PDF of the paper to check out for the details. The biggest result is that the standard safeguard against timing attacks, constant-time programming, is not always a reliable security measure.

It works because maximum frequency depends on the processor's Thermal Design Power (TDP), the maximum amount of power a CPU is designed to draw and the heat it is designed to dissipate. Different instructions use different amounts of power and so generate more or less heat. More heat means earlier throttling, and throttling can be detected in response times. The details are quite fascinating. Did you know that even running the same instructions with different register values results in slightly different power draw? The researchers picked a single cryptographic algorithm, SIKE, a quantum-safe key exchange technique, and attempted to extract a server's secret key through timing attacks.

There is a quirk in SIKE, also discovered and disclosed in this research, whereby it's possible to short-circuit part of the algorithm such that a series of internal, intermediary steps result in a value of zero. If you know multiple consecutive bits of the static key, it's possible to construct a challenge that hits this quirk. By extension, you can take a guess at the next unknown bit, and it will only fall into the quirk if you guessed correctly. SIKE uses constant-time programming, so this odd behavior shouldn't matter. And here the Hertzbleed observation factors in. The SIKE algorithm consumes less power when doing a run containing this cascading-zero behavior. Consuming less power means that the processor can stay at full boost clocks for longer, which means that the key exchange completes slightly more quickly. Enough so that it can be detected even over a network connection. They tested against Cloudflare's CIRCL library and Microsoft's PQCrypto-SIDH, and were able to recover secret keys from both implementations, in 36 and 89 hours respectively.
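The bit-by-bit recovery loop is easy to sketch. Everything below is a hypothetical mock: `timing_oracle` stands in for the remote measurement, and the real statistics are far noisier and require many repeated samples, but the control flow mirrors the attack: guess the next key bit both ways, time the handshake, and keep the guess that runs faster.

```python
def recover_key_bits(timing_oracle, total_bits):
    """Bit-by-bit key recovery against a timing oracle, in the spirit of
    Hertzbleed's SIKE attack: a challenge built from a correct guess of
    the next secret bit triggers the cascading-zero quirk, the run draws
    less power, the CPU boosts longer, and the response returns faster."""
    bits = []
    while len(bits) < total_bits:
        t0 = timing_oracle(bits + [0])    # time the handshake for guess "0"
        t1 = timing_oracle(bits + [1])    # ... and for guess "1"
        bits.append(0 if t0 < t1 else 1)  # faster response => correct guess
    return bits

# Mock oracle: responses are faster when the guessed prefix matches the key.
SECRET = [1, 0, 1, 1, 0, 0, 1, 0]
def mock_oracle(guess):
    return 1.0 if guess == SECRET[:len(guess)] else 1.2

recovered = recover_key_bits(mock_oracle, total_bits=len(SECRET))
```

With a clean oracle the loop walks straight down the key; the 36-to-89-hour figures above reflect how much sampling it takes to make a real, noisy network measurement behave like this mock.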

There is a mitigation against this particular flaw: it's possible to detect a challenge value that could trigger the cascading zeros and block that value before any processing happens. It will be interesting to see if quirks in other algorithms can be discovered and weaponized using this same technique. Unfortunately, on the processor side, the only real mitigation is to disable boost clocks altogether, which has a significant negative effect on processor performance.

[Frédéric Basse] has a Google Nest Hub, and he really wanted to run his own Linux distro on it. There's a problem, though. The Nest uses secure boot, and there's no official way to unlock the bootloader. Since when would a dedicated hacker let that stop him? The first step was finding a UART interface, hidden away on some unterminated channels of a ribbon cable. A custom breakout board later, and he had a U-Boot log. Next was to run through the bootup button combinations and see what U-Boot tried to do with each. One of those combinations allows booting from a recovery.img, which would be ideal, if not for secure boot.

The great thing about U-Boot is that it's open source under the GPL, which means the source code is available for perusal. Find a bug in that source, and you have your secure boot bypass. Open source also allows some fun approaches, like running portions of the U-Boot code in userspace and exercising it with a fuzzer. That's the approach that found a bug, where a block size greater than 512 bytes triggers a buffer overflow. It's a generally safe assumption, as there aren't really any USB storage devices with a block size greater than 512.
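The bug class here is the classic one of trusting a device-reported size against a fixed buffer. Python cannot corrupt memory, so the hypothetical sketch below (not U-Boot's actual code) models the C overflow as an explicit check; the vulnerable C code simply copied past the end of its 512-byte buffer.

```python
BLOCK_BUFFER_SIZE = 512   # what the C code statically allocates

def read_device_block(device_block: bytes) -> bytearray:
    """Toy model of the overflow: the device (i.e. the attacker) chooses
    how many bytes arrive, while the reader assumes at most 512. In C the
    oversized copy would silently smash adjacent memory; here we flag it."""
    buf = bytearray(BLOCK_BUFFER_SIZE)
    if len(device_block) > BLOCK_BUFFER_SIZE:
        raise OverflowError("block exceeds buffer: memory corruption in C")
    buf[:len(device_block)] = device_block
    return buf
```

A device advertising, say, 4096-byte blocks trips exactly this assumption, which is where the emulated USB gadget in the next paragraph comes in.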

Never fear, a device like the Raspberry Pi Pico can run TinyUSB, which allows emulating a USB device with whatever block size you specify. A test determined that this approach did result in a repeatable crash on the real device. The code execution is fairly straightforward: write a run of instructions that are essentially no-ops leading to a payload, then overwrite the return pointer. Code execution in the can, all that remained was to overwrite the command list and execute a custom U-Boot script. A thing of beauty.

The lowly ping command. How much can a single pair of packets tell us about a network and a remote host? According to [HD Moore], quite a bit. For example, take the time given for a ping response and calculate a distance based on 186 miles per millisecond. That's the absolute maximum distance to that host, though a quarter to a half of that amount makes a reasonable range for a distance estimate. The TTL very likely started at 64, 128, or 255, so you can take a good guess at the hops encountered along the way. Oh, and if that TTL started at 64, it's likely a Linux machine; 128 suggests Windows; and 255 usually indicates a BSD-derived OS.
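Both tricks fit in a few lines. The helpers below are hypothetical sketches (the names and the OS mapping are illustrative, not [HD Moore]'s tooling): one converts a round-trip time into a maximum-distance bound using the ~186 miles-per-millisecond figure, the other maps an observed TTL back to the likely initial value and OS family.

```python
def max_distance_miles(rtt_ms: float) -> float:
    """Upper bound on host distance from a ping round-trip time: light
    covers ~186 miles per millisecond, and the signal travels there and
    back, so only half the RTT counts toward distance."""
    return (rtt_ms / 2.0) * 186.0

def likely_os(observed_ttl: int):
    """Map an observed TTL to the nearest common initial TTL (64 Linux,
    128 Windows, 255 BSD-derived) and the implied hop count."""
    for initial, os_name in ((64, "Linux"), (128, "Windows"), (255, "BSD-derived")):
        if observed_ttl <= initial:
            return os_name, initial - observed_ttl
    return "unknown", None

# A 40 ms round trip bounds the host at 3720 miles; a reply arriving with
# TTL 52 most likely left a Linux box 12 hops away.
```

Remember the distance figure is a ceiling: queuing and routing delays mean the host is usually a quarter to a half of that bound away.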

Receiving a destination host unreachable message is interesting in itself, and tells you about the router that should be able to reach the given IP. Then there's the broadcast IP, which sends the message to every IP in the subnet. Using something like Wireshark for packet capture is enlightening here: the command itself may only show one response, even though multiple devices may have responded. Each of those responses has a MAC address that can be looked up to identify the vendor. Another interesting trick is to spoof the source IP address of a ping packet, using a machine you control with a public IP address. Ping every device on the network, and many of them will send the response via their default gateway. You might find an Internet connection or VPN that isn't supposed to be there. Who knew you could learn so much from the humble ping?

Internet Explorer is Really, Truly, Dead. If you were under the impression, as I was, that Internet Explorer was retired years ago, then it may come as a surprise that it was only finally put to rest this past week. This month's Patch Tuesday was the last day IE was officially supported; from now on it's totally unsupported, and it's slated to eventually be automatically uninstalled from Windows 10 machines. Also arriving in this month's patch drop was, finally, the fix for Follina, along with a few other important fixes.

There's a new record for HTTPS DDoS attacks, set last week: Cloudflare mitigated an attack consisting of 26 million requests per second. HTTPS attacks are a one-two punch of raw data saturation and server resource exhaustion. The attack came from a botnet of VMs and servers, with the largest slice coming from Indonesia.

Running the free tier of Travis CI? Did you know that your logs are accessible to the whole world via a Travis API call? And on top of that, the whole history of runs since 2013 seems to be available. It might be time to go revoke some access keys. Travis makes an attempt to censor access tokens, but quite a few of them make it through the sieve anyway.

Ever wonder what the risk matrix looks like for TPM key sniffing on boot? It's not pretty. Researchers at Secura looked at six popular encryption and secure boot applications, and none of them used the parameter encryption features that would encrypt keys on the wire. The ironic conclusion? Discrete TPM chips are less secure than those built into the motherboard's firmware.

Here is the original post:
This Week In Security: Pacman, Hertzbleed, And The Death Of Internet Explorer - Hackaday

Read More..