
In Einstein's Footsteps and Beyond: New Insights Into the Foundations of Quantum Mechanics – SciTechDaily

By Harvard John A. Paulson School of Engineering and Applied Sciences, May 3, 2022

An illustration of a near-zero index metamaterial shows that when light travels through, it moves in a constant phase. Credit: Second Bay Studios/Harvard SEAS

Zero-index metamaterials offer new insights into the foundations of quantum mechanics.

In physics, as in life, it's always good to look at things from different perspectives.

Since the dawn of quantum physics, how light moves and interacts with matter around it has been primarily described and understood mathematically through the lens of its energy. Max Planck used energy to explain how light is emitted by heated objects in 1900, a seminal study in the foundation of quantum mechanics. Albert Einstein used energy when he introduced the concept of the photon in 1905.

But light has another, equally important quality known as momentum. And, as it turns out, when you take momentum away, light starts behaving in really interesting ways.

An international team of physicists is re-examining the foundations of quantum physics from the perspective of momentum and exploring what happens when the momentum of light is reduced to zero. The researchers are led by Michaël Lobet, a research associate at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), and Eric Mazur, the Balkanski Professor of Physics and Applied Physics at SEAS.

The research was published in the journal Light: Science & Applications on April 25, 2022.

Any object with mass and velocity, from atoms to bullets to asteroids, has momentum, and momentum can be transferred from one object to another. A gun recoils when a bullet is fired because the momentum of the bullet is transferred to the gun. At the microscopic scale, an atom recoils when it emits light because of the acquired momentum of the photon. Atomic recoil, first described by Einstein when he was writing the quantum theory of radiation, is a fundamental phenomenon that governs light emission.

But a century after Planck and Einstein, a new class of metamaterials is raising questions regarding these fundamental phenomena. These metamaterials have a refractive index close to zero, meaning that when light travels through them, it doesn't travel like a wave in phases of crests and troughs. Instead, the wave is stretched out to infinity, creating a constant phase. When that happens, many of the typical processes of quantum mechanics disappear, including atomic recoil.

Why? It all goes back to momentum. In these so-called near-zero index materials, the wave momentum of light becomes zero, and when the wave momentum is zero, odd things happen.
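
For readers who want the relation behind that statement, here is a minimal sketch using the textbook (Minkowski) expression for the momentum of light in a medium; it is an illustration, not a result quoted from the paper. A photon of angular frequency omega in a medium of refractive index n carries wave momentum

p = \hbar n k_0 = \frac{n \hbar \omega}{c}

so as the index n is engineered toward zero, the momentum the light can hand to a recoiling atom goes to zero with it.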

"Fundamental radiative processes are inhibited in three-dimensional near-zero index materials," says Lobet, who is currently a lecturer at the University of Namur in Belgium. "We realized that the momentum recoil of an atom is forbidden in near-zero index materials and that no momentum transfer is allowed between the electromagnetic field and the atom."

If breaking one of Einstein's rules wasn't enough, the researchers also broke perhaps the most well-known experiment in quantum physics: Young's double-slit experiment. This experiment is used in classrooms across the globe to demonstrate particle-wave duality in quantum physics, showing that light can display characteristics of both waves and particles.

In a typical material, light passing through two slits produces two coherent sources of waves that interfere to form a bright spot in the center of the screen with a pattern of light and dark fringes on either side, known as diffraction fringes.

In the double-slit experiment, light passing through two slits produces two coherent sources of waves that interfere to form a bright spot in the center of the screen with a pattern of light and dark fringes on either side, known as diffraction fringes. Credit: Harvard John A. Paulson School of Engineering and Applied Sciences

"When we modeled and numerically computed Young's double-slit experiment, it turned out that the diffraction fringes vanished when the refractive index was lowered," said co-author Larissa Vertchenko, of the Technical University of Denmark.
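
A back-of-the-envelope estimate (standard two-slit optics, not a calculation from the paper) shows why the fringes wash out. Inside a medium of index n the wavelength stretches to lambda = lambda_0 / n, so the bright-fringe spacing on a screen at distance L behind slits separated by d is

\Delta y = \frac{\lambda L}{d} = \frac{\lambda_0 L}{n d}

As n approaches zero the spacing diverges, and the screen is left with essentially constant-phase illumination instead of alternating light and dark bands.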

"As it can be seen, this work interrogates fundamental laws of quantum mechanics and probes the limits of wave-corpuscle duality," said co-author Iñigo Liberal, of the Public University of Navarre in Pamplona, Spain.

While some fundamental processes are inhibited in near-zero refractive index materials, others are enhanced. Take another famous quantum phenomenon: Heisenberg's uncertainty principle, more accurately known in physics as the Heisenberg inequality. This principle states that you cannot know both the position and speed of a particle with perfect accuracy; the more you know about one, the less you know about the other. But, in near-zero index materials, you know with 100% certainty that the momentum of a particle is zero, which means you have absolutely no idea where in the material the particle is at any given moment.
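
In symbols, the inequality reads

\Delta x \, \Delta p \geq \frac{\hbar}{2}

so driving the momentum uncertainty toward zero forces the position uncertainty to grow without bound, which is exactly the trade-off described above.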

"This material would make a really poor microscope, but it does enable us to cloak objects quite perfectly," Lobet said. "In some way, objects become invisible."

"These new theoretical results shed new light on near-zero refractive index photonics from a momentum perspective," said Mazur. "It provides insights into the understanding of light-matter interactions in systems with a low refractive index, which can be useful for lasing and quantum optics applications."

The research could also shed light on other applications, including quantum computing, light sources that emit a single photon at a time, the lossless propagation of light through a waveguide, and more.

The team next aims to revisit other foundational quantum experiments in these materials from a momentum perspective. After all, even though Einstein didn't predict near-zero refractive index materials, he did stress the importance of momentum. In his seminal 1916 paper on fundamental radiative processes, Einstein insisted that, from a theoretical point of view, energy and momentum should be considered on a completely equal footing since energy and momentum are linked in the closest possible way.

"As physicists, it's a dream to follow in the footsteps of giants like Einstein and push their ideas further," said Lobet. "We hope that we can provide a new tool that physicists can use and a new perspective, which might help us understand these fundamental processes and develop new applications."

Reference: "Momentum considerations inside near-zero index materials" by Michaël Lobet, Iñigo Liberal, Larissa Vertchenko, Andrei V. Lavrinenko, Nader Engheta and Eric Mazur, 25 April 2022, Light: Science & Applications. DOI: 10.1038/s41377-022-00790-z


2 Berkeley Lab Physicists Elected into the National Academy of Sciences – Lawrence Berkeley Lab (.gov)

Joel Moore, left, and Joseph W. Orenstein (Credit: UC Berkeley; courtesy of Joseph W. Orenstein)

Two Lawrence Berkeley National Laboratory (Berkeley Lab) physicists have been elected into the National Academy of Sciences (NAS) in recognition of their distinguished and continuing achievements in original research. Joel Moore and Joseph W. Orenstein join 120 scientists and engineers from the U.S. and 30 from across the world as new lifelong members and foreign associates.

Joel Moore is a senior faculty scientist in the Materials Sciences Division, professor of physics at UC Berkeley, and the director of the Center for Novel Pathways to Quantum Coherence in Materials. His theoretical work studies the properties of quantum materials, in which interactions between electrons yield new states of matter. He also investigates how quantum physics can lead to new devices for spin-based electronics and quantum sensing.

Before joining Berkeley Lab and UC Berkeley in 2002, Moore was a postdoc in the theoretical physics research group at Bell Labs. Moore received his bachelor's degree in physics from Princeton University in 1995 and spent a Fulbright year abroad before graduate studies at the Massachusetts Institute of Technology on a Hertz fellowship.

He is a fellow of the American Physical Society, a Simons Investigator, and Chern-Simons Professor of Physics at UC Berkeley.

Joseph W. Orenstein is a senior faculty scientist in the Materials Sciences Division and a professor of physics at UC Berkeley. He has led the development of advanced experimental techniques to investigate how new materials, such as high-temperature superconductors, multiferroics, topological materials, and frustrated magnets, interact with light.

Orenstein earned his Ph.D. in solid state physics from the Massachusetts Institute of Technology in 1980. Before joining Berkeley Lab and UC Berkeley in 1990, he was an IBM postdoctoral fellow and a distinguished member of the technical staff at AT&T Bell Laboratories.

He received the American Physical Society Isakson Prize for Optical Effects in Solids in 2008.

In 2020, he was honored by the Gordon and Betty Moore Foundation with an Experimental Investigators in Quantum Materials (EPIQs) award. He is a fellow of the American Physical Society.

In addition to Moore and Orenstein, Berkeley Lab Advisory Board members Young-Kee Kim and France Córdova were also elected into the 2022 class.

The NAS was founded in 1863 to provide the country with a non-partisan council of scientific and technological leaders who could lend expertise and advice to the government. Every year, a new class of 120-150 members are elected by existing members in recognition of distinguished achievement in their respective fields. There is now a total of 2,512 active American members and 517 international members.

###

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 14 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.


Searching for What Connects Us, Carlo Rovelli Explores Beyond Physics – The New York Times

Perhaps it's Rovelli's writing style, along with his facility with ideas, that sets him apart from other popular science writers. "For some readers," he said, "the writing in my books is what matters to them. And the truth is I use analogies, some poetical, but it's not coloring or embellishment. It's actually where I'm trying to go, trying to transmit some emotion, some sense of marvel, some sense of the core."

Simon Carnell, along with his late wife, Erica Segre, translated five of Rovelli's books, including his new one. He said in an email that he sees Rovelli's style as "highly compressed without ever becoming dry or airless." He added that Rovelli has "the scientific instinct to avoid and pare away every superfluous word (including of the translations of his work), but more importantly, a writerly ability to do so in the service of a style that is elegant, lively and above all engaging."

Beyond offering Rovelli's heady but lean synthesis of science and the humanities, his new book also features pieces dealing with politics, climate change and justice. Dean Rickles, a professor of history and philosophy of modern physics at the University of Sydney, said in an interview over Zoom that this larger project of Rovelli's, with its theme of interdependence, is particularly compelling.

"He's concerned now with justice and with peace and with climate. He has become a sort of very political scientist," he said. "I think you can boil it all down, actually, to sort of a quality, like a democracy in all things ... We're all interdependent."

Maybe the best way to think of Rovelli's worldview is through the work of Nāgārjuna, a second-century Indian Buddhist philosopher he admires. Author of "The Fundamental Wisdom of the Middle Way," Nāgārjuna taught that there is no unchanging, underlying, stable reality; that nothing is self-contained, that all is variable, interdependent. Reality, in short, is always something other than what it just was, or seemed to be, he argues. To define it is to misunderstand it.

In "Emptiness is Empty: Nāgārjuna," another piece from his new book, Rovelli writes about how the philosopher's conception of reality provokes a sense of awe, a sense of serenity, but without consolation: "To understand that we do not exist is something that may free us from attachments and from suffering; it is precisely on account of life's impermanence, the absence from it of every absolute, that life has meaning."

Before leaving Rovelli's home that day, I took another look at the concealing snow outside. Reality seemed at once more compelling and more mysterious. Hesitating, I asked him if he thought there was any grand, capital-T truth. He indulged me, then paused for a moment.


Here at Yale: Sounds from another realm – Yale News

Under an early evening dusk, made darker by rain clouds overhead, shades of red, blue, and rose flowed across the white façade of 17 Hillhouse Avenue as an electronic landscape of sounds pulsed from speakers.

This was the scene on a recent April evening, as a crowd gathered outside the home of the Yale Quantum Institute (YQI) to celebrate the end of Yale Quantum Week and the release of the album Quantum Sound, a collaboration between a trio of scientists and musicians: Spencer Topel, a sound artist and composer and the 2018-2019 YQI artist-in-residence, and two former Yale physics graduate students, Kyle Serniak '19 Ph.D. and Luke Burkhart '20 Ph.D. They were brought together by Florian Carle, YQI's manager and creator of the artist-in-residence program, and the producer of the album.

The album project, which transforms the measurements of superconducting quantum devices into sound, both reflects the raw data and shapes it into an audio narrative.
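
As a rough illustration of what such a data-to-sound transformation can look like, the short Python sketch below maps a series of normalized measurement values to audible tones and writes them to a WAV file. The readout values, pitch mapping, and file name are hypothetical stand-ins; this is not the pipeline the Quantum Sound collaborators actually used.

# Hypothetical sketch: turn normalized measurement values into a sequence of tones.
import numpy as np
import wave

rate = 44_100                                              # audio sample rate in Hz
readouts = np.random.default_rng(0).uniform(0, 1, 32)      # stand-in for normalized qubit readout data

tones = []
for r in readouts:
    freq = 220 + r * 660                                   # map each value to a pitch between 220 and 880 Hz
    t = np.linspace(0, 0.25, int(rate * 0.25), endpoint=False)
    tones.append(np.sin(2 * np.pi * freq * t))

audio = np.concatenate(tones)
pcm = (audio * 32767).astype(np.int16)                     # convert to 16-bit PCM samples

with wave.open("quantum_sound_sketch.wav", "wb") as f:     # hypothetical output file name
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(rate)
    f.writeframes(pcm.tobytes())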

"In the same way that some people are more receptive to music as opposed to visual art, they may be more receptive to music as a medium for conveying certain scientific concepts," said Serniak, who now works at MIT Lincoln Laboratory, a federally funded research and development center in Massachusetts.

Topel looks at the project in the other direction: How can quantum physics unlock new forms of musical and artistic expression?

"Eventually some of these [quantum] experiments will yield new devices that we can use to solve altogether different problems," he went on. "A portion of these problems will be art, dance, writing, and music."

Quantum Sound may not have the hooks to propel it up the Billboard charts, but attendees of the Yale event, about 150 physics-heads, electronic-music geeks, and assorted curiosity seekers, quickly snapped up the 100 vinyl copies of the album offered as a giveaway. The album is a single 32-minute track, split into two sides: "Noise" and "Tone." The light show, designed by Carle, used tones of blue and red to signify the movement of superconducting qubits from a grounded state to an excited one.

As the album played, soft sonic waves gave way to what sounded like rolls of thunder; fittingly, as the music rose to a crescendo, the skies opened up. Undeterred by the downpour, the intrepid audience sheltered under umbrellas and listened on.

The project, which was first performed live at the International Festival of Arts and Ideas in June 2019, has had a longer afterlife than even its creators anticipated. "At the beginning it just seemed like a fun idea," said Serniak. "I am pleasantly surprised that it's being pressed on vinyl and that we're still talking about it three years later!"

Quantum Sound was also presented at the first International Symposium on Quantum Computing and Musical Creativity, an online event held last fall. And a chapter detailing the science behind the art, from the data-generating technology the trio used to the musical motifs they surfaced, will be published in the forthcoming book Quantum Computer Music: Foundations, Methods and Advanced Concepts. And Topel is looking into releasing the instruments they used for Quantum Sound, three variations of quantum synthesizers, on a software platform that would allow students of all ages to experiment with the technology and get closer to the science.

"This project really changed how I think about the tools we use as artists to perceive the world," said Topel. "Superconducting systems, like the ones being studied at Yale, have great computational potential, but they also offer us a glimpse into an altogether different part of our universe, the quantum realm."


Quantum Future: Developing the Next Generation of Quantum Algorithms and Materials – SciTechDaily

Quantum computers are especially adept at simultaneously considering large numbers of possible combinations, but the instability of qubits in modern devices contributes to errors in calculations. Credit: Image by Timothy Holland | Pacific Northwest National Laboratory

Quantum computers are anticipated to revolutionize the way researchers address complex computing problems. These computers are being developed to address major challenges in fundamental scientific fields such as quantum chemistry. In its present state of development, quantum computing is very susceptible to noise and disruptive influences in the environment. This makes quantum computers noisy, since quantum bits, or qubits, lose information when they go out of sync, a process known as decoherence.

To address the constraints of current quantum computers, researchers at Pacific Northwest National Laboratory (PNNL) are constructing simulations that demonstrate how quantum computers work.

"When we try to directly observe the behavior of quantum systems, like qubits, their quantum states will collapse," explained PNNL Computer Scientist Ang Li. Li is also a researcher at the Quantum Science Center and the Co-Design Center for Quantum Advantage, two of the five Department of Energy National Quantum Information Science Research Centers. "To get around this, we use simulations to study qubits and their interaction with the environment."

Artist's rendering of a quantum computer. Credit: Image by Jeffrey London | Pacific Northwest National Laboratory

Li and collaborators at Oak Ridge National Laboratory and Microsoft employ high-speed computing to create simulators that imitate genuine quantum devices for executing sophisticated quantum circuits. They recently integrated two distinct kinds of simulations to produce the Northwest Quantum Simulator (NWQ-Sim), which is used to test quantum algorithms.

"Testing quantum algorithms on quantum devices is slow and costly. Also, some algorithms are too advanced for current quantum devices," said Li. "Our quantum simulators can help us look beyond the limitations of existing devices and test algorithms for more sophisticated systems."
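
To make the idea of classical simulation concrete, here is a toy sketch that builds the statevector of a two-qubit circuit (a Hadamard followed by a CNOT) with plain NumPy. Simulators such as NWQ-Sim do this at far larger scale and with noise models; the snippet below does not use NWQ-Sim's actual interface and is only meant to show the principle.

# Toy statevector simulation of a 2-qubit circuit: H on qubit 0, then CNOT(0 -> 1).
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                 # control = qubit 0, target = qubit 1

state = np.zeros(4)
state[0] = 1.0                                  # start in |00>
state = np.kron(H, I) @ state                   # apply H to qubit 0
state = CNOT @ state                            # entangle the two qubits

probs = np.abs(state) ** 2                      # measurement probabilities
print({f"{i:02b}": round(p, 3) for i, p in enumerate(probs)})
# -> {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}, the Bell-state statistics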

Nathan Wiebe, a PNNL joint appointee from the University of Toronto and an affiliate professor at the University of Washington, is taking a different approach to writing quantum computer code. Though being constrained by the capabilities of existing quantum devices might be irritating at times, Wiebe views this obstacle as an opportunity.

"Noisy quantum circuits produce errors in calculations," said Wiebe. "The more qubits that are needed for a calculation, the more error-prone it is."
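
A crude rule of thumb illustrates the point (a generic estimate, not a statement about any particular device): if every gate fails independently with probability p, a circuit of N gates succeeds with probability roughly

P_{\text{success}} \approx (1 - p)^{N}

With p = 10^{-3}, a 1,000-gate circuit already succeeds only about 37% of the time, and larger problems need more qubits and more gates.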

Wiebe and collaborators from the University of Washington developed novel algorithms to correct for these errors in certain types of simulations.

"This work provides a cheaper and faster way to perform quantum error correction. It potentially brings us closer to demonstrating a computationally useful example of a quantum simulation for quantum field theory on near-term quantum hardware," said Wiebe.

Quantum circuit simulation can reveal the impact of noise on intermediate-scale quantum devices. Credit: Composite image by Donald Jorgensen | Pacific Northwest National Laboratory

While Wiebe seeks to reduce the noise by developing error-correcting algorithms, physicist Ben Loer and his colleagues turn to the environment to manage external sources of noise. Loer employs his experience in creating ultra-low levels of natural radioactivity, which is required to search for experimental evidence of dark matter in the universe, to aid in the prevention of qubit decoherence.

"Radiation from the environment, such as gamma rays and X-rays, exists everywhere," said Loer. "Since qubits are so sensitive, we had an idea that this radiation may be interfering with their quantum states."

To test this, Loer, project lead Brent VanDevender, and colleague John Orrell teamed up with researchers at the Massachusetts Institute of Technology (MIT) and MIT's Lincoln Laboratory to use a lead shield to protect qubits from radiation. They designed the shield for use within a dilution refrigerator, a technology used to produce the just-above-absolute-zero temperature necessary for operating superconducting qubits. They saw that qubit decoherence decreased when the qubits were protected.

While this is the first step toward understanding how radiation affects quantum computing, Loer plans to look at how radiation disturbs circuits and substrates within a quantum system. "We can simulate and model these quantum interactions to help improve the design of quantum devices," said Loer.

Loer is taking his lead-shielded dilution refrigerator research underground in PNNL's Shallow Underground Laboratory with the help of PNNL Chemist Marvin Warner.

"If we develop a quantum device that doesn't perform as it should, we need to be able to pinpoint the problem," said Warner. "By shielding qubits from external radiation, we can start to characterize other potential sources of noise in the device."

Video: Pacific Northwest National Laboratory

PNNL supports a wide variety of quantum-related research, from quantum simulations and developing algorithms for quantum chemistry to the development of precision materials for quantum devices.

PNNL also partners with other institutions in the Pacific Northwest to accelerate quantum research and develop a quantum information science-trained workforce through the Northwest Quantum Nexus (NQN). Additionally, the NQN hosts a seminar series featuring leaders in quantum research. The NQN synergizes partnerships between companies, such as Microsoft and IonQ, as well as the University of Oregon, the University of Washington, and Washington State University.

"PNNL's cultivation of both industry and university collaborations are building a foundation for quantum computing in the Pacific Northwest that sets the stage for future hybrid classical-quantum computing," said James (Jim) Ang. Ang is the chief scientist for computing and PNNL's sector lead for the Department of Energy (DOE) Advanced Scientific Computing Research program.

Li's research was supported by the DOE Office of Science (SC), National Quantum Information Science Research Centers: Quantum Science Center and Co-Design Center for Quantum Advantage. He was also supported by the Quantum Science, Advanced Accelerator laboratory-directed research and development initiative at PNNL.

Wiebe's research was supported by the DOE, SC, Office of Nuclear Physics, Incubator for Quantum Simulation, and the DOE QuantISED program. Wiebe is also supported by DOE, SC, National Quantum Information Science Research Centers, Co-Design Center for Quantum Advantage, where he is the Software thrust leader.

Loer's research was supported by the DOE, SC, Office of Nuclear Physics and Office of High Energy Physics. Warner's research was supported by the DOE, SC, National Quantum Information Science Research Centers, Co-Design Center for Quantum Advantage.

References: "Impact of ionizing radiation on superconducting qubit coherence" by Antti P. Vepsäläinen, Amir H. Karamlou, John L. Orrell, Akshunna S. Dogra, Ben Loer, Francisca Vasconcelos, David K. Kim, Alexander J. Melville, Bethany M. Niedzielski, Jonilyn L. Yoder, Simon Gustavsson, Joseph A. Formaggio, Brent A. VanDevender, and William D. Oliver, 26 August 2020, Nature. DOI: 10.1038/s41586-020-2619-8

"Quantum Error Correction with Gauge Symmetries" by Abhishek Rajput, Alessandro Roggero and Nathan Wiebe, 9 December 2021, arXiv. DOI: 10.48550/arXiv.2112.05186


Outstanding Seniors in the College of Science: Justin Hink – University of Arizona News

This spring, each department in the University of Arizona's College of Science nominated an outstanding senior who went above and beyond during their time as a Wildcat. We are pleased to share their stories as they reflect on their time at UArizona. Next up in the senior spotlight series is Justin Hink.

Hometown: Marana, AZ

Degrees: Physics and Astronomy

College of Science: Why did you choose your area of study?

Justin: At Marana High School, I started learning physics sophomore year. My teacher, Mark Calton, taught me Newton's kinematic equations. I thought it was fascinating to learn so much about the motion of objects from a few initial conditions. I had started an engineering club with Mark where we created trebuchets, a duct tape water bottle, a duct tape boat, and many other projects. I used my introductory physics knowledge to know how much force our trebuchet was applying and the distance the golf ball would travel. I wanted to learn more. I went through two years of AP physics classes learning thermodynamics, optics, electromagnetism, and quantum mechanics, of course, all in a simplistic manner. I was able to take an Astronomy course with Mark as well. My physics classes and teacher got me out of my bubble and convinced me to put in the effort necessary to take on and achieve this degree.

COS: Tell us about a class or research project you really enjoyed.

Justin: The most memorable research project I have worked on throughout these four years of college was with the Thomas Jefferson National Accelerator Facility (JLab). I have worked with them since the summer after my junior year. This was the first time I had to search through textbooks and teach myself a topic for research. This experience gave me an abundance of opportunities, from seminars to writing papers, all the way to a poster presentation at Rice University. This internship even led me to learn more about medical physics and change the direction of my career.

COS: What is one specific memory from your time at UA that you'll cherish forever?

Justin: I will always remember going up to Mt. Lemmon with a group of astronomy friends. They had an 8-inch telescope so we could see the rings of Jupiter. It is a whole new experience to see the rings in person rather than a nice picture online. Surprisingly, it was my first time on Mt. Lemmon, even though I have lived here my whole life.

COS: What is next for you after graduation?

Justin: After graduation, I am working with the Thomas Jefferson National Accelerator Facility over the summer. Then, I am moving on to UCLA this coming Fall. I was accepted into their Department of Physics and Biology in Medicine to work towards a Medical Physics Ph.D.


Being a responsible CTO isn’t just about moving to the cloud – Information Age

Cloud migration alone won't be enough to tackle the climate change crisis.

Nick Westall, CTO of CSI, discusses why being a responsible CTO when it comes to sustainability isn't just about moving to the cloud

IT departments are being asked by their boards to demonstrate improved sustainability. Often, they attribute their move to the cloud as a way of showing commitment to the reduction of their carbon footprint and the green agenda. While it's a good example of carbon offset, is it just a way to pass the buck to the cloud provider rather than looking in more depth at being a responsible user of IT? How much real consideration goes into calculating and demonstrating responsible computing in business today?

In 2020, the IBM Academy of Technology (AoT) brought together experts in many fields to discuss responsible computing. They analysed the anxieties of over 100 CTOs and looked at all aspects of IT to answer questions such as, "Am I doing enough to be sustainable?"; "Are we being ethical in our use of data?"; and "Am I doing enough to ensure that the infrastructure we use is minimising its impact on the environment?" As a result, the initiative has created a framework which outlines important considerations of responsible computing.

From the more obvious topics associated with running your computing infrastructure to its wider impacts, the AoT framework summarised that responsible computing comprises six main domains:

The reasons for needing to be a responsible CTO are just as strong as the need to be a tech-savvy one if a company wants to thrive in a digital economy. There are many facets to being a responsible CTO, such as making sure that code is being written in a diverse way, and that citizen data is being used appropriately.

In a BCS webinar, IBM fellow and vice-president for technology in EMEA, Rashik Parmar, summarised that the three biggest forces driving unprecedented change today included post-pandemic work; digitalisation; and the climate emergency. With many organisations turning to technology to help solve some of the biggest challenges they're facing today, it's clear that there will need to be answers about how this tech-heavy economy will impact the environment. It makes sense that this is often the first place that a CTO will start when deciding how to drive a more responsible future.

When you consider that each of the six domains above will make a huge difference to how responsible an organisation is deemed, it's easy to see why it's about much more than a move to the cloud.

If we focus on the environmental considerations, it's becoming more commonly known that whilst a move to the cloud may be better for reducing an organisation's carbon emissions than running multiple on-premises systems, the initiative alone isn't going to spell good news for climate change. In fact, if everyone were to move to the cloud in droves, the internet would need to quickly find a way of being more sustainable.

A large part of this is the requirement for major cloud providers to switch most of their data centres to more renewable energy sources. This is an area that falls outside of a customer's control. But in fact, the issue of reducing emissions can be influenced far more greatly by the activities undertaken by the organisations themselves. It can all come down to understanding how their workloads are running and whether it is driving high levels of utilisation.

When it comes to running at high levels of utilisation, the answer isn't necessarily all cloud-driven. For example, the performance and scale of the new IBM Power server, the E1080, delivers the benefits of consolidation at levels far higher than is possible with x86-based alternatives.

When compared with the cloud hyperscalers, such as Azure and AWS, it offers a greater ability to scale. In a recent webinar, David Spurway, IBM systems technology architect, modelled IBM's Power10 against Azure and AWS, looking at the largest of their offerings which could then be filled up with workloads such as multiple containers*.

*In the model, David used the conservative performance metric of the IDC QPI, which gives relative performance across architectures but does not consider advantages for specific workloads. For example, the new Power10 cores can hold over four times more containers per core than x86, so the real values may be even higher.

This is how the performance of the virtual servers, bare metal server, and physical IBM Power servers looks when considering their maximum capacity:

By dividing the server performance rating by the number of cores, David then derives a rough estimate of performance per core. This shows that by doing much more per core, fewer cores are needed, which in turn means less energy is needed.
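
The arithmetic behind that comparison is easy to sketch. All figures in the snippet below are hypothetical placeholders rather than IBM's or any cloud provider's published ratings; the point is only how performance per core translates into the number of cores, and therefore the energy and per-core software licences, needed for a fixed amount of work.

# Hypothetical illustration of the performance-per-core comparison described above.
servers = {
    "large_cloud_vm":  {"perf_rating": 2_000, "cores": 128},   # placeholder figures
    "ibm_power_e1080": {"perf_rating": 6_000, "cores": 240},   # placeholder figures
}

workload_units = 10_000   # total performance units the consolidated workloads require (hypothetical)

for name, spec in servers.items():
    per_core = spec["perf_rating"] / spec["cores"]   # rough performance per core
    cores_needed = workload_units / per_core         # fewer cores if each core does more work
    print(f"{name}: {per_core:.1f} units/core, ~{cores_needed:.0f} cores for the workload")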

Whilst virtual servers in the cloud can run at high levels of utilisation, near the 90% mark, the average utilisation is much lower. This is because the number of workloads that smaller servers can hold is lower, and that means more servers are needed. With IBM's large E1080 server, average and peak utilisations demonstrate a more efficient use of the computational resources available. When utilisation is combined with scale and performance it provides a very powerful and efficient option.

If an organisation is on a software licensing model that adjusts depending on the level of utilisation and usage, it is also possible to achieve significant cost savings. The ability to turn applications on and off when not in use will reduce the number of processors needed. If the organisation is charged by the processor core, this could be as much as five times less expensive because of the software savings when running fewer cores.

There are detailed methodologies available that help organisations establish where they are today in terms of responsible computing. As sustainability is such a relatively new factor, its important for an organisation to understand where it is so that it can measure what improvements are being made in the form of Key Performance Indicators.

The three main considerations can be summarised as:

The responsible computing deliverables include methodologies for all six domains to help companies understand where they are today and demonstrate how they are improving and delivering against their KPIs. While it's true that cloud infrastructure does deliver carbon savings, organisations should also think about the environmental and cost-saving benefits in other on-premises measures.

Related:

A guide to responsible technology practices – taking a look at four areas of tech that look to create a more positively impactful future.

Everyone likes to talk sustainability, but who takes responsibility? Michiel Verhoeven, managing director at SAP UKI, explores who needs to take responsibility for sustainability initiatives in the organisation.


Data Management News for the Week of May 6; Updates from AtScale, Immuta, SingleStore, and More – Solutions Review

The editors at Solutions Review have curated this list of the most noteworthy data management news items for the week of May 6, 2022.

Keeping tabs on all the most relevant big data and data management news can be a time-consuming task. As a result, our editorial team aims to provide a summary of the top headlines from the last week, in this space. Solutions Review editors will curate vendor product news, mergers and acquisitions, venture capital funding, talent acquisition, and other noteworthy big data and data management news items.

This new release features enhanced interface support for the Amazon S3 REST API; security improvements for sensitive applications with strict encryption compliance and regulatory requirements; and strengthened automated data movement functionality across heterogeneous storage systems without the need to manually move or copy the data.

Read on for more.

Each episode of the series explores a different facet of modern cloud data analytics. Set in a dramatized cloud data warehouse, the main characters are personifications of common business data types including CRM, Finance, Support, and HR. A high-strung Query Engine attempts to corral the data sets into satisfying requests from The Business.

Read on for more.

Metadata Metrics scan existing query logs to automatically track key operational metrics, including the time since tables were last loaded, the number of rows inserted, and the number of read queries run on every dataset. Metadata Metrics take only minutes to set up, with zero manual configuration and almost no additional load to the warehouse.

Read on for more.

Companies searching Google's marketplace for cloud solutions to integrate and connect disparate systems (apps, servers, data, clouds, gadgets and other things) will now be able to partner with Digibee. Digibee's entrance to Google Cloud Marketplace follows the company's $25 million Series A and will support its international expansion.

Read on for more.

An integration plugin now ships with every Starburst Enterprise instance and features include scalable attribute-based access control (ABAC), sensitive data discovery and classification, data policy enforcement and advanced policy building, and dynamic data masking auditing.

Read on for more.

This subscription-based software, which can be deployed on prem, in private clouds, or in public clouds, is designed to enable customers to scale out as needed; enjoy the familiarity of a relational database while being able to handle complex queries for analytics; and process, query and serve near real-time and historic data using a single, multi-model database.

Read on for more.

Building on the platform's history of success in complex enterprise data migrations, the company is introducing the fully cloud-based version of Syniti Advanced Data Migration, now called Syniti Migrate. With Syniti Migrate, customers will have greater control of their data in a faster, more secure offering from any source to any target.

Read on for more.

The TDWI Data Management Maturity Model Assessment Guide provides a primer on what's driving the need for data management (including data fabrics, increasing diverse data types, hybrid cloud environments, AI and other modern analytics, and augmented intelligence). The model provides a framework for companies to understand where they are, where they've been, and where they still need to go to support strong data management.

Read on for more.

For consideration in future data analytics news roundups, send your announcements to tking@solutionsreview.com.

Tim is Solutions Review's Editorial Director and leads coverage on big data, business intelligence, and data analytics. A 2017 and 2018 Most Influential Business Journalist and 2021 "Who's Who" in data management and data integration, Tim is a recognized influencer and thought leader in enterprise business software. Reach him via tking at solutionsreview dot com.


Why Soon-To-Be Unsupported Windows 2012 and 2012 R2 Servers Pose Serious Corporate, Personal Risk for Execs – TechDecisions

Microsoft will officially end extended support for Windows Server 2012 and 2012 R2 in October of 2023, meaning the company will stop providing users with critical security updates and patches.

Organizations that continue to leverage Windows Server 2012 and 2012 R2 after this date will become increasingly vulnerable to cyber attack and compliance risks.

Any business that is still running Windows Server 2012 and 2012 R2 needs to institute a migration policy as soon as possible. Migrations can take months to years to complete depending on the number of servers and the size of the company.

IT execs without an upgrade path will soon find themselves at a critical point of no return that may leave their business and their executives personally liable for the risk caused by unsupported servers.

When Microsoft ended support for Windows 7 in January 2020, the US Federal Bureau of Investigation issued a warning to industry users that the platform had become unsafe.

"As time passes, Windows 7 becomes more vulnerable to exploitation due to lack of security updates and new vulnerabilities discovered," the FBI notice said. "With fewer customers able to maintain a patched Windows 7 system after its end of life, cybercriminals will continue to view Windows 7 as a soft target."

As expected, hackers thrive in attacking environments that no longer receive security support. These attacks do not just hit the technology product in question, but also serve as an entry point into your entire enterprise.

Related: What IT Pros Need to Know About Windows Server 2022

That was the case in 2018 when Zoll, a medical device vendor, sued Barracuda Networks. Zoll contended that Barracuda failed to manage a server migration properly, leaving the data of more than 275,000 of its users exposed.

As a result of those failures, Zoll is now liable for injury and damages incurred by its patients because of the breach. Failing to ensure all systems remain in compliance can put your company at risk and, for senior management, possibly even result in criminal liability in the case of a security breach on unsupported OSs.

Along with security challenges, there is also the loss of functionality. Your organization relies on Windows Server 2012 or 2012 R2 to run applications and manage data on a daily basis. Microsoft's Modern Lifecycle Policy calls for an organization to use the most current and updated applications. However, when those applications are updated, they are built without outdated servers in mind. This creates issues in performance, compatibility, and reliability.

If you find yourself behind on the migration for Windows Server 2012 and 2012 R2 there is still time to act. Microsoft offers four primary ways for users to transfer data and applications to a new server platform. These include:

Microsoft and other technology providers give users plenty of runway to prepare for end-of-life events. Following the decommission of Windows Server 2012, the countdown begins for both Windows Server 2016 and Windows Server 2019, along with their different versions.

While many organizations rely on extended support, organizations should look to migrate servers before the standard end-of-life date. Extended support costs more, and organizations that delay a migration could find themselves quickly migrating data up to the last minute, elevating their risk.

Take a proactive approach to migrating server data. While these migrations require a significant effort from technology teams, they are critical to maintaining operations and reducing security risk. Create a robust action plan for future migrations, and don't let end-of-life deadlines sneak up on you.

Paul Deur is co-founder of ReadyWorks, a digital platform conductor (DPC), which collects and aggregates data from IT and business systems and spreadsheets, then cleans and analyzes information about the entire IT estate, including endpoints, users, applications, servers, and all their interdependencies. The company identifies risk/what needs to be upgraded, defines the rules for change, uses artificial intelligence (AI) and intelligent automation to automate and orchestrate all human and system workflows, and reports on results. ReadyWorks provides up-to-date audit trails that can be used to demonstrate security compliance.


Upon 30th Anniversary of Chernobyl, ShelterZoom Announces Blockchain-based Solution For Management Of Nuclear Waste On A Global Scale – Digital…

Innovative partnership with impact initiative Unite For Italy and advisory firm Morichi Atelier drives sustainable solutions for nuclear waste management through a world's first use case for distributed ledger technology

NEW YORK, NY / ACCESSWIRE / May 5, 2022 / As the world observes the 30th anniversary of Chernobyl, ShelterZoom, Unite For Italy and Morichi Atelier announce the launch of the world's first nuclear waste management use case for blockchain-based distributed ledger technology. This endeavor, dubbed Hercules, developed as a part of the MICADO European project and funded by the European Union's Horizon 2020 research and innovation program, aims to protect and secure all digi-waste (digital nuclear waste) sensitive data and digi-waste management operations through a powerful multi-functional platform.

Left to right: Chao Cheng-Shorland, CEO of ShelterZoom and Giordano Morichi, CEO of Morichi Atelier / Unite for Italy interviewed at Nasdaq television studios.

This innovative idea stemmed from Unite for Italy (a non-profit impact initiative) and was facilitated by Morichi Atelier, a boutique advisory firm that identified a potential 30M USD market opportunity in the digi-waste sector. The Hyperledger blockchain Hercules collaboration with ShelterZoom aims to fix the outdated management and data protocols of the nuclear decommissioning and dismantling phases. The potential vulnerabilities caused by security breaches and/or human error can be largely prevented with this efficient digital solution. Until today, the physical data collected in analysis stages has not been fully secured through sufficient infrastructure ensuring data protection during waste characterization analysis. To solve the problem and eliminate any form of human error, the MICADO project developed a series of interconnected smart instrumentations to digitalize, process, and link all sensitive information to its protected internal cloud servers. These new systems are able to scan, measure, and track any form of radioactive elements (barrels, containers, bags, and any other radioactive material) in a much more secure and efficient manner. What was processed before via paper documents is now being immediately digitized through the use of technology, marking the beginning of a new age for the industry called digi-waste.

"At ShelterZoom, using our technology powered by blockchain, we are working on providing the highest level of security in a variety of fields. Our layered interconnected templates help categorize information, provide different levels of access, and tokenize data strings needed for waste characterization. We are thrilled to be part of this groundbreaking opportunity provided by Unite for Italy and are ready to improve the security level of data in the digi-waste field," comments Chao Cheng-Shorland, CEO of ShelterZoom.

"Companies must be built with the commitment to develop a better tomorrow for our global society and our planet following sustainable, impact-based business models. Sharing our same core values, ShelterZoom and their powerful technology follow these criteria and we believe they can add value in this specific sector. We are excited to be working with them and help guide business operations that will transform some of the complex problems in the field of nuclear energy into tangible solutions with a high level of impact," said Giordano Morichi, CEO & Founder of Morichi Atelier / Unite for Italy.

The innovative partnership between ShelterZoom, Unite For Italy and Morichi Atelier brings together a cutting-edge SaaS technology provider, a non-profit impact initiative aiming to drive advocacy for other impact initiatives, and a boutique sustainable development consulting firm to provide solutions for one of the most difficult issues the nuclear industry faces: data security in waste management. The team plans to develop a fully comprehensive roadmap to provide technology-oriented results that enhance the protection of nuclear data, sensitive information, and management procedures aimed at the resolution of the bigger issues at hand. By creating a more sustainable world, furthering a change in people's perspectives regarding the energy sector and paving the way for more companies to use their know-how to make a difference, ShelterZoom and Morichi Atelier have started building the first digitally secure and fully scalable model that can accommodate the dozens of players in the nuclear waste management landscape.

About ShelterZoom Corp:

ShelterZoom is a leading provider of enterprise-level blockchain-based Smart Documents, Smart Contracts and Blockchain API integration services. The blockchain-based SaaS software company was founded in 2017, servicing large enterprises, government agencies, law firms, non-profits, the publishing industry, academic institutions, real estate and small businesses with fully supported blockchain smart document applications, tokenization and digital asset solutions. As part of the company's commitment to improving the lives of people around the world, ShelterZoom is a member of Humanity 2.0, an international consortium of organizations supporting human flourishing, and, as a signatory of the United Nations Global Compact, has several tools to support the UN's Sustainable Development Goals.

About Morichi Atelier LLC:

Morichi Atelier LLC is built with the commitment to develop a better tomorrow, today. The company, overseeing business development, technological innovation, environmental, philanthropic, and impact investment horizons, helps to transform ideas of development into solid plans of actions for all clients in the areas of its expertise; this happens through a series of project ventures with external partners that share the business philosophy, circular economy ideas, and sustainable impact. At the heart of Morichi Atelier is the core belief that today's companies have to be built to create a better future for our global society and our planet following ESG principles, circular economy action plans, and sustainable business models based on impact. All projects pursued, developed singularly or through the joint effort of our partners, try to integrate as much as possible the UN SDG program; Morichi Atelier's vision translates those strong values into tangible applications to resolve some of the complex problems present in its operational fields while generating solutions with a high level of sustainable impact.

Contacts:

Strategic Partnership Contact: Matt Bird, CEO, CommPro Worldwide
C: +1 (646) 401-4499
E: [email protected]
See ESG News for more media coverage.

SOURCE: ShelterZoom

View source version on accesswire.com: https://www.accesswire.com/700326/Upon-30th-Anniversary-of-Chernobyl-ShelterZoom-Announces-Blockchain-based-Solution-For-Management-Of-Nuclear-Waste-On-A-Global-Scale
