
The Register just found 300-odd Itanium CPUs on eBay – The Register

Intel has stopped shipping the Itanium processor.

In January 2019, Chipzilla issued an advisory [PDF] warning that last orders for the CPU must be lodged by January 30, 2020, and that final shipments would head out the door on July 29, 2021.

Which was yesterday.

So concludes an odd story that started in the age of the minicomputer, when the likes of pre-split HP, Data General, Wang, and Prime dominated the server market with machines based on their own proprietary products.

By the mid-1990s, HP was worried that the minicomputer market was running out of steam because customers feared proprietary architectures would limit their software choices and lock them in to expensive ecosystems. But Sun was still in business, IBM never gave up on proprietary architectures, and DEC was telling anyone who would listen that its leap into the then-exotic realm of 64-bit CPUs with its Alpha platform represented a huge step forward.

HP therefore did a deal with Intel to use some of its research on a new architecture called Explicitly Parallel Instruction Computing, and its IA-64 instruction set, and turn it into a product. HP reckoned this tech would out-perform its few remaining minicomputer-derived rivals and Intel's own server-directed efforts. Intel liked the idea of having another product line, so agreed to the arrangement. Major OS vendors were signed up to port their wares to Itanium, to make it a viable platform for all concerned.

Intel and HP made the wrong calls. x86 performance improved, and went 64-bit and multi-core, competing against its Itanium cousin; customers really were almost totally over minicomputer-like systems; and Itanium's IA-64 architecture turned out to be hard to code for, while its speed didn't impress. Intel bailed on it in 2004.

Itanium quickly earned the nickname "Itanic", a pun suggesting it was a supposedly brilliant innovation that sank on its maiden voyage.

Itanium did score plenty of decent wins, but never set the world on fire. IBM and Sun/Oracle ended up with the only sustainable proprietary stacks.

The few customers Itanium did attract were often the sort of outfits that don't and can't casually migrate to new platforms. The Register knows of a stock exchange that bet on it and has run the platform for almost two decades. The fact that Itanium had been directed at those with unusually demanding applications also meant that plenty of users had tightly coupled software and hardware.

Even though Itanium development slowed, and new releases offered modest performance improvements, the platform lived on. Intel's Xeon, meanwhile, improved in leaps and bounds and outpaced Itanium on core count and clock speed. The server virtualization boom made Xeon-based systems more flexible, and Itanium became a curiosity.

The product spluttered along through new releases in 2012 and 2017, but its death notice came in 2019 and in February 2021 it was ejected from the Linux kernel.

And, as noted above, shipments of the silicon ceased yesterday.

Which brings us to our headline: The Register just found 300-plus Itanium CPUs on eBay. It's true! Some of the ads, like one depicting a brass rhinoceros, may not be entirely convincing. However, plenty of others depict working servers, CPUs, or other components an Itanium user may find handy.

So while Intel won't ship you a new Itanium anymore, plenty of others are happy to offload their old kit to help the few remaining users keep rumbling along.

Warning: check before buying this stuff. HPE has pledged to support Itanium until 2025, though it may not be so keen on you using second-hand kit.

What is quantum computing? Everything you need to know about the strange world of quantum computers – Texasnewstoday.com

Quantum computing exploits the puzzling behavior that scientists have been observing for decades in nature's smallest particles: think atoms, photons or electrons. At this scale, the classical laws of physics cease to apply, and instead we shift to quantum rules.

While researchers don't understand everything about the quantum world, what they do know is that quantum particles hold immense potential, in particular the ability to hold and process large amounts of information. Successfully bringing those particles under control in a quantum computer could trigger an explosion of compute power that would phenomenally advance innovation in many fields that require complex calculations, like drug discovery, climate modelling, financial optimization or logistics.

As Bob Sutor, chief quantum exponent at IBM, puts it: "Quantum computing is our way of emulating nature to solve extraordinarily difficult problems and make them tractable," he tells ZDNet.

Quantum computers come in various shapes and forms, but they are all built on the same principle: they host a quantum processor where quantum particles can be isolated for engineers to manipulate.

The nature of those quantum particles, as well as the method employed to control them, varies from one quantum computing approach to another. Some methods require the processor to be cooled down to freezing temperatures, others manipulate quantum particles with lasers, but all share the goal of finding out how best to exploit the value of quantum physics.

The systems we have been using since the 1940s in various shapes and forms (laptops, smartphones, cloud servers, supercomputers) are known as classical computers. Those are based on bits, a unit of information that powers every computation that happens in the device.

In a classical computer, each bit can take on either a value of one or zero to represent and transmit the information that is used to carry out computations. Using bits, developers can write programs, which are sets of instructions that are read and executed by the computer.

Classical computers have been indispensable tools in the last few decades, but the inflexibility of bits is limiting. As an analogy, if tasked with looking for a needle in a haystack, a classical computer would have to be programmed to look through every single piece of hay straw until it reached the needle.

There are still many large problems, therefore, that classical devices can't solve. "There are calculations that could be done on a classical system, but they might take millions of years or use more computer memory than exists in total on Earth," says Sutor. "These problems are intractable today."

At the heart of any quantum computer are qubits, also known as quantum bits, which can loosely be compared to the bits that process information in classical computers.

Qubits, however, have very different properties to bits, because they are made of the quantum particles found in nature: those same particles that have been obsessing scientists for many years.

One of the properties of quantum particles that is most useful for quantum computing is known as superposition, which allows quantum particles to exist in several states at the same time. The best way to imagine superposition is to compare it to tossing a coin: instead of being heads or tails, quantum particles are the coin while it is still spinning.

By controlling quantum particles, researchers can load them with data to create qubits and, thanks to superposition, a single qubit doesn't have to be either a one or a zero, but can be both at the same time. In other words, while a classical bit can only be heads or tails, a qubit can be, at once, heads and tails.
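
To make the coin-toss picture concrete, here is a minimal sketch of a single qubit in superposition, written with IBM's open-source Qiskit kit (mentioned later in this article). It assumes a recent Qiskit installation together with the qiskit-aer simulator package; the exact import paths can differ between Qiskit versions.

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# One qubit, one classical bit to record the measurement.
qc = QuantumCircuit(1, 1)
qc.h(0)           # Hadamard gate puts the qubit into an equal superposition of 0 and 1
qc.measure(0, 0)  # measuring collapses the superposition to a definite 0 or 1

sim = AerSimulator()
result = sim.run(transpile(qc, sim), shots=1000).result()
print(result.get_counts())  # roughly {'0': ~500, '1': ~500}
```

Measured many times, the qubit lands on 0 about half the time and on 1 the other half, which is exactly the behaviour the spinning-coin analogy describes.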

This means that, when asked to solve a problem, a quantum computer can use qubits to run several calculations at once to find an answer, exploring many different avenues in parallel.

So in the needle-in-a-haystack scenario above, unlike a classical machine, a quantum computer could in principle browse through all hay straws at the same time, finding the needle in a matter of seconds rather than looking for years, even centuries, before it found what it was searching for.

What's more, qubits can be physically linked together thanks to another quantum property called entanglement, meaning that with every qubit that is added to a system, the device's capabilities increase exponentially, whereas adding more bits only generates linear improvement.

Every time we use another qubit in a quantum computer, we double the amount of information and processing ability available for solving problems. So by the time we get to 275 qubits, we can compute with more pieces of information than there are atoms in the observable universe. And the compression of computing time that this could generate could have big implications in many use cases.
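
The scaling claim is easy to check with back-of-the-envelope arithmetic: n qubits span 2^n basis states, and the commonly cited estimate of about 10^80 atoms in the observable universe is overtaken well below 300 qubits. A small illustrative Python check:

```python
# Illustrative arithmetic only: n qubits span 2**n basis states.
ATOMS_IN_OBSERVABLE_UNIVERSE = 10**80  # commonly cited rough estimate

for n in (75, 275):
    states = 2**n
    print(f"{n} qubits -> {states:.3e} states, "
          f"exceeds atom count: {states > ATOMS_IN_OBSERVABLE_UNIVERSE}")

# 2**275 is about 6.1e82, which is why the 275-qubit figure quoted above
# already outstrips the number of atoms in the observable universe.
```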

"There are a number of cases where time is money. Being able to do things more quickly will have a material impact in business," Scott Buchholz, managing director at Deloitte Consulting, tells ZDNet.

The gains in time that researchers are anticipating as a result of quantum computing are not of the order of hours or even days. We're rather talking about potentially being capable of calculating, in just a few minutes, the answer to problems that today's most powerful supercomputers couldn't resolve in thousands of years, ranging from modelling hurricanes all the way to cracking the cryptography keys protecting the most sensitive government secrets.

And businesses have a lot to gain, too. According to recent research by Boston Consulting Group (BCG), the advances that quantum computing will enable could create value of up to $850 billion in the next 15 to 30 years, $5 to $10 billion of which will be generated in the next five years if key vendors deliver on the technology as they have promised.

Programmers write problems in the form of algorithms for classical computers to resolve, and similarly, quantum computers will carry out calculations based on quantum algorithms. Researchers have already identified that some quantum algorithms would be particularly suited to the enhanced capabilities of quantum computers.

For example, quantum systems could tackle optimization algorithms, which help identify the best solution among many feasible options, and could be applied in a wide range of scenarios ranging from supply chain administration to traffic management. ExxonMobil and IBM, for instance, are working together to find quantum algorithms that could one day manage the 50,000 merchant ships crossing the oceans each day to deliver goods, to reduce the distance and time traveled by fleets.

Quantum simulation algorithms are also expected to deliver unprecedented results, as qubits enable researchers to handle the simulation and prediction of complex interactions between molecules in larger systems, which could lead to faster breakthroughs in fields like materials science and drug discovery.

With quantum computers capable of handling and processing much larger datasets, AI and machine learning applications are set to benefit hugely, with faster training times and more capable algorithms. And researchers have also demonstrated that quantum algorithms have the potential to crack traditional cryptography keys, which for now are too mathematically difficult for classical computers to break.

To create qubits, which are the building blocks of quantum computers, scientists have to find and manipulate the smallest particles of nature, tiny parts of the universe that can be accessed through different mediums. This is why there are currently many types of quantum processors being developed by a range of companies.

One of the most advanced approaches consists of using superconducting qubits, which are made of electrons, and come in the form of the familiar chandelier-like quantum computers. Both IBM and Google have developed superconducting processors.

Another approach that is gaining momentum is trapped ions, which Honeywell and IonQ are leading the way on, and in which qubits are housed in arrays of ions that are trapped in electric fields and then controlled with lasers.

Major companies like Xanadu and PsiQuantum, for their part, are investing in yet another method that relies on quantum particles of light, called photons, to encode data and create qubits. Qubits can also be created out of silicon spin qubits (which Intel is focusing on), as well as cold atoms or even diamonds.

Quantum annealing, an approach that was chosen by D-Wave, is a different category of computing altogether. It doesn't rely on the same paradigm as other quantum processors, known as the gate model. Quantum annealing processors are much easier to control and operate, which is why D-Wave has already developed devices that can manipulate thousands of qubits, while virtually every other quantum hardware company is working with about 100 qubits or less. On the other hand, the annealing approach is only suitable for a specific set of optimization problems, which limits its capabilities.

What can you do with a quantum computer today?

Right now, with a mere 100 qubits being the state of the art, there is very little that can actually be done with quantum computers. For qubits to start carrying out meaningful calculations, they will have to be counted in the thousands, and even millions.

"While there is a tremendous amount of promise and excitement about what quantum computers can do one day, I think what they can do today is relatively underwhelming," says Buchholz.

Increasing the qubit count in gate-model processors, however, is incredibly challenging. This is because keeping the particles that make up qubits in their quantum state is difficult: a little bit like trying to keep a coin spinning without falling on one side or the other, except much harder.

Keeping qubits spinning requires isolating them from any environmental disturbance that might cause them to lose their quantum state. Google and IBM, for example, do this by placing their superconducting processors in temperatures that are colder than outer space, which in turn require sophisticated cryogenic technologies that are currently near-impossible to scale up.

In addition, the instability of qubits means that they are unreliable, and still likely to cause computation errors. This has given rise to a branch of quantum computing dedicated to developing error-correction methods.

Although research is advancing at pace, therefore, quantum computers are for now stuck in what is known as the NISQ era: noisy, intermediate-scale quantum computing. The end goal, however, is to build a fault-tolerant, universal quantum computer.

As Buchholz explains, it is hard to tell when this is likely to happen. "I would guess we are a handful of years from production use cases, but the real challenge is that this is a little like trying to predict research breakthroughs," he says. "It's hard to put a timeline on genius."

In 2019, Google claimed that its 54-qubit superconducting processor called Sycamore had achieved quantum supremacy: the point at which a quantum computer can solve a computational task that is impossible to run on a classical device in any realistic amount of time.

Google said that Sycamore had calculated, in only 200 seconds, the answer to a problem that would have taken the world's biggest supercomputers 10,000 years to complete.

More recently, researchers from the University of Science and Technology of China claimed a similar breakthrough, saying that their quantum processor had taken 200 seconds to achieve a task that would have taken 600 million years to complete with classical devices.

This is far from saying that either of those quantum computers is now capable of outstripping any classical computer at any task. In both cases, the devices were programmed to run very specific problems, with little usefulness aside from proving that they could compute the task significantly faster than classical systems.

Without a higher qubit count and better error correction, proving quantum supremacy for useful problems is still some way off.

Organizations that are investing in quantum resources see this as the preparation stage: their scientists are doing the groundwork to be ready for the day that a universal and fault-tolerant quantum computer is ready.

In practice, this means that they are trying to discover the quantum algorithms that are most likely to show an advantage over classical algorithms once they can be run on large-scale quantum systems. To do so, researchers typically try to prove that quantum algorithms perform comparably to classical ones on very small use cases, and theorize that as quantum hardware improves, and the size of the problem can be grown, the quantum approach will inevitably show some significant speed-ups.

For example, scientists at Japanese steel manufacturer Nippon Steel recently came up with a quantum optimization algorithm that could compete against its classical counterpart for a small problem that was run on a 10-qubit quantum computer. In principle, this means that the same algorithm equipped with thousands or millions of error-corrected qubits could eventually optimize the company's entire supply chain, complete with the management of dozens of raw materials, processes and tight deadlines, generating huge cost savings.

The work that quantum scientists are carrying out for businesses is therefore highly experimental, and so far there are fewer than 100 quantum algorithms that have been shown to compete against their classical equivalents, which only points to how emergent the field still is.

With most use cases requiring a fully error-corrected quantum computer, just who will deliver one first is the question on everyone's lips in the quantum industry, and it is impossible to know the exact answer.

All quantum hardware companies are keen to stress that their approach will be the first one to crack the quantum revolution, making it even harder to discern noise from reality. "The challenge at the moment is that it's like looking at a group of toddlers in a playground and trying to figure out which one of them is going to win the Nobel Prize," says Buchholz.

"I have seen the smartest people in the field say they're not really sure which one of these is the right answer. There are more than half a dozen different competing technologies and it's still not clear which one will wind up being the best, or if there will be a best one," he continues.

In general, experts agree that the technology will not reach its full potential until after 2030. The next five years, however, may start bringing some early use cases as error correction improves and qubit counts start reaching numbers that allow for small problems to be programmed.

IBM is one of the rare companies that has committed to a specific quantum roadmap, which defines the ultimate objective of realizing a million-qubit quantum computer. In the nearer term, Big Blue anticipates that it will release a 1,121-qubit system in 2023, which might mark the start of the first experimentations with real-world use cases.

Developing quantum hardware is a huge part of the challenge, and arguably the most significant bottleneck in the ecosystem. But even a universal fault-tolerant quantum computer would be of little use without the matching quantum software.

"Of course, none of these online facilities are much use without knowing how to speak quantum," Andrew Fearnside, senior associate specializing in quantum technologies at intellectual property firm Mewburn Ellis, tells ZDNet.

Creating quantum algorithms is not as easy as taking a classical algorithm and adapting it to the quantum world. Quantum computing, rather, requires a brand-new programming paradigm that can only be run on a brand-new software stack.

Of course, some hardware providers also develop software tools, the most established of which is IBM's open-source quantum software development kit Qiskit. But on top of that, the quantum ecosystem is expanding to include companies dedicated exclusively to creating quantum software. Familiar names include Zapata, QC Ware or 1QBit, which all specialize in providing businesses with the tools to understand the language of quantum.
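
As a flavour of what "speaking quantum" looks like in practice, here is a minimal sketch of the entanglement effect described earlier, again written with Qiskit and assuming the qiskit-aer simulator package is installed.

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# Build a two-qubit Bell state: superposition on qubit 0, then entangle qubit 1 with it.
bell = QuantumCircuit(2, 2)
bell.h(0)
bell.cx(0, 1)
bell.measure([0, 1], [0, 1])

sim = AerSimulator()
counts = sim.run(transpile(bell, sim), shots=1000).result().get_counts()
print(counts)  # only '00' and '11' appear: the two measurement outcomes are perfectly correlated
```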

And increasingly, promising partnerships are forming to bring together different parts of the ecosystem. For example, the recent alliance between Honeywell, which is building trapped-ion quantum computers, and quantum software company Cambridge Quantum Computing (CQC), has got analysts predicting that a new player could be taking a lead in the quantum race.

The complexity of building a quantum computer (think ultra-high vacuum chambers, cryogenic control systems and other exotic quantum instruments) means that the vast majority of quantum systems are currently firmly sitting in lab environments, rather than being sent out to customers' data centers.

To let users access the devices to start running their experiments, therefore, quantum companies have launched commercial quantum computing cloud services, making the technology accessible to a wider range of customers.

The four largest providers of public cloud computing services currently offer access to quantum computers on their platforms. IBM and Google have both put their own quantum processors on the cloud, while Microsoft's Azure Quantum and AWS's Braket service let customers access computers from third-party quantum hardware providers.

The jury remains out on which technology will win the race, if any at all, but one thing is for certain: the quantum computing industry is developing fast, and investors are generously funding the ecosystem. Equity investments in quantum computing nearly tripled in 2020, and according to BCG, they are set to rise even more in 2021 to reach $800 million.

Government investment is even more significant: the US has unlocked $1.2 billion for quantum information science over the next five years, while the EU announced a €1 billion ($1.20 billion) quantum flagship. The UK also recently reached the £1 billion ($1.37 billion) budget milestone for quantum technologies, and while official numbers are not known in China, the government has made no secret of its desire to aggressively compete in the quantum race.

This has caused the quantum ecosystem to flourish over the past few years, with new start-ups increasing from a handful in 2013 to nearly 200 in 2020. The appeal of quantum computing is also increasing among potential customers: according to analysis firm Gartner, while only 1% of companies were budgeting for quantum in 2018, 20% are expected to do so by 2023.

Although not all businesses need to be preparing themselves to keep up with quantum-ready competitors, there are some industries where quantum algorithms are expected to generate huge value, and where leading companies are already getting ready.

Goldman Sachs and JP Morgan are two examples of financial behemoths investing in quantum computing. That's because in banking, quantum optimization algorithms could give a boost to portfolio optimization, by better picking which stocks to buy and sell for maximum return.

In pharmaceuticals, where the drug discovery process is on average a $2 billion, ten-year-long deal that largely relies on trial and error, quantum simulation algorithms are also expected to make waves. This is also the case in materials science: companies like OTI Lumionics, for example, are exploring the use of quantum computers to design more efficient OLED displays.

Leading automotive companies including Volkswagen and BMW are also keeping a close eye on the technology, which could impact the sector in various ways, ranging from designing more efficient batteries to optimizing the supply chain, through to better management of traffic and mobility. Volkswagen, for example, pioneered the use of a quantum algorithm that optimized bus routes in real time by dodging traffic bottlenecks.

As the technology matures, however, it is unlikely that quantum computing will be limited to a select few. Rather, analysts anticipate that virtually all industries have the potential to benefit from the computational speedup that qubits will unlock.

Quantum computers are expected to be phenomenal at solving a certain class of problems, but that doesn't mean that they will be a better tool than classical computers for every single application. In particular, quantum systems aren't a good fit for fundamental computations like arithmetic, or for executing commands.

"Quantum computers are great constraint optimizers, but that's not what you need to run Microsoft Excel or Office," says Buchholz. "That's what classical technology is for: for doing lots of maths, calculations and sequential operations."

In other words, there will always be a place for the way that we compute today. It is unlikely, for example, that you will be streaming a Netflix series on a quantum computer anytime soon. Rather, the two technologies will be used in conjunction, with quantum computers being called for only where they can dramatically accelerate a specific calculation.

Buchholz predicts that, as classical and quantum computing start working alongside each other, access will look like a configuration option. Data scientists currently have a choice of using CPUs or GPUs when running their workloads, and it might be that quantum processing units (QPUs) join the list at some point. It will be up to researchers to decide which configuration to choose, based on the nature of their computation.
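
As a purely hypothetical illustration of what that "configuration option" might feel like, the sketch below dispatches a workload to a named backend. The backend names and the run_workload function are invented for illustration and do not refer to any real service or library.

```python
# Hypothetical sketch: picking a processing backend the way data scientists
# pick CPU vs. GPU today. Nothing here maps to a real quantum service.
AVAILABLE_BACKENDS = {"cpu", "gpu", "qpu"}

def run_workload(task: str, backend: str = "cpu") -> None:
    if backend not in AVAILABLE_BACKENDS:
        raise ValueError(f"unknown backend: {backend!r}")
    print(f"dispatching {task!r} to the {backend.upper()}")
    # Real dispatch logic (job submission, queuing, result retrieval) would go here.

run_workload("portfolio optimization", backend="qpu")
run_workload("spreadsheet recalculation", backend="cpu")
```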

Although the precise way that users will access quantum computing in the future remains to be defined, one thing is certain: they are unlikely to be required to understand the fundamental laws of quantum computing in order to use the technology.

"People get confused because the way we lead into quantum computing is by talking about technical details," says Buchholz. "But you don't need to understand how your cellphone works to use it."

"People sometimes forget that when you log into a server somewhere, you have no idea what physical location the server is in, or even if it exists physically at all anymore. The important question really becomes what it is going to look like to access it."

And as fascinating as qubits, superposition, entanglement and other quantum phenomena might be, for most of us this will come as welcome news.

So you want to migrate to the cloud? – ITWeb

It seems nearly impossible to avoid the cloud as a business these days, and for many companies, the benefits cloud computing offers are just too great to ignore for much longer. Because of this, you've already taken the first step and made the decision that you want to migrate to the cloud. But now what?

Luckily, with the plethora of tools created by cloud providers and those built by software vendors, kicking off your migration to the cloud has never been easier, whether you're looking to move onsite workloads or build cloud-native solutions from the start.

As cloud experts with experience advising, migrating, architecting, managing and optimising workloads in the cloud, BBD understands the nitty-gritty of what you need to consider before you take the plunge. There is, of course, quite a long list of things we can add here, but we know you also have work to do, so we will keep the rest of this to the point.

Although the right partner on this journey definitely makes your move to the cloud much more streamlined, there are multiple steps in the process. Over the next couple of weeks, this migration-focused series will unpack these steps and the processes you need to run through to ensure you ultimately deploy a secure, compliant, cost-effective and resilient environment.

Two of the most important aspects to consider from the start are security and compliance, because they often help establish whether your initial migration plan is viable or not, and if so, in which direction.

Security

Understanding your security goals and how you should be handling data will create a good foundation for you to know what services to use when architecting your environment.

Jaco Venter, head of BBD's managed cloud services team (MServ), says security should always be top of mind when planning your migration. "There are the 'how do I keep my customers' information secure?' and the 'how do I ensure my applications do not get compromised?' conversations. These are both important to unpack with your cloud solution partner."

Both these topics can be addressed by planning for and implementing an architecture that includes best practices. BBD has done well-architected reviews on customer environments and often finds that the basics are covered. That's a great start, but when looking at security, just the basics won't do, especially if it could lead to your environment being compromised.

As an example, AWS has created an Architecture Center on its website that provides reference architecture diagrams, vetted architecture solutions, well-architected best practices, patterns, icons and more. This easily accessible guidance was contributed to by AWS cloud architecture experts, including solutions architects, professional services consultants and partners.

For AWS migrations, Venter explains there is a shared responsibility model that pretty much goes like this: AWS is responsible for the security of the cloud. AWS will look after all things physical, from the security guards standing in front of their various data centres' doors, all the way through to the security and management of the infrastructure your services will be running on. You (or your cloud enablement partner), on the other hand, will be responsible for security in the cloud. This means you will still have to ensure your data is being protected and backed up.

There are, however, some AWS services that are fully managed, like RDS (relational database service), where AWS will manage and secure everything for you up until the DB table level.

Compliance

Understanding which compliance frameworks your organisation is subject to is a recommended starting point, as it will influence a lot of the architecture you'll need to devise before your cloud migration. An example of this would be when the customers you service are in a country with data residency restrictions/laws (such as GDPR, POPIA, PCI, ISO, etc). Here you need to plan for how you will handle and process those customers' data versus the data of your customers in other countries without those restrictions/laws.

When looking at data residency again as an example, AWS has a couple of tools, such as Control Tower, that allow you to manage how data is transferred between regions or if it even can be transferred to another region.
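
To make the data residency point concrete, one common pattern is an AWS Organizations service control policy that denies API calls outside approved regions; Control Tower's region controls build on the same idea. The sketch below is illustrative only: the approved regions and the list of exempted global services are placeholders you would tailor to your own compliance requirements.

```python
import json

# Sketch of an AWS Organizations service control policy (SCP) that denies requests
# made outside an approved set of regions. Region list and exemptions are placeholders.
region_deny_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyRequestsOutsideApprovedRegions",
            "Effect": "Deny",
            # Global services that are not region-scoped are exempted from the deny.
            "NotAction": ["iam:*", "organizations:*", "route53:*", "support:*"],
            "Resource": "*",
            "Condition": {
                "StringNotEquals": {"aws:RequestedRegion": ["af-south-1", "eu-west-1"]}
            },
        }
    ],
}

print(json.dumps(region_deny_policy, indent=2))
```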

On the whole, compliance will often dictate where you can or cannot deploy your workloads, and which services you can or cannot use. "The great thing about AWS having obtained various compliance framework certifications for their infrastructure is that it makes it so much easier for you to be compliant," says Venter. "But think about this in the same way as the shared responsibility model: AWS will make sure the infrastructure is compliant; you will need to make sure your applications also meet the compliance framework requirements."

Ultimately, it's worth understanding that the services you plan to leverage as part of your architecture can sometimes make it a bit easier to comply with the relevant compliance frameworks.

What else needs to be considered before finalising a cloud migration strategy?

It is always best to look at what migration tools the cloud provider you are migrating to has made available to you, often at no additional charge.

Venter explains this is exactly the case when looking at the tools made available by AWS. "AWS has made more than six tools available at no cost, and some of these tools are perfect for the planning phase, while others make the migration of your servers, applications and databases just so much easier."

One example of such a tool is the AWS Server Migration Service, an agentless service applicable when migrating virtual-only workloads from on-premises infrastructure, or from Microsoft Azure, to AWS. It allows you to automate, schedule and track incremental replications of live server volumes, making it easier to co-ordinate large-scale server migrations.
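
As a rough sketch of what tracking those replications might look like programmatically, the snippet below lists Server Migration Service replication jobs with boto3. It assumes AWS credentials are already configured and that the SMS connector has imported your server catalogue; the exact response fields shown here are an assumption and may differ in practice.

```python
import boto3

# Illustrative sketch: list AWS Server Migration Service replication jobs and their state.
# Assumes credentials are configured and the SMS connector has already imported servers.
sms = boto3.client("sms")

response = sms.get_replication_jobs()
for job in response.get("replicationJobList", []):
    print(job.get("replicationJobId"), job.get("state"), job.get("frequency"))
```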

There is a long list of other factors you'll need to consider before kicking off your migration, some more important than others, but each could have an impact on your final architecture and how you manage that environment in the long run. These will be discussed in more detail as this series unfolds.

BBD has helped various clients reach their security and compliance goals in preparation for their coming migration to the cloud, and understands the importance of tool selection to aid in an efficient migration to the cloud.

If you've made the decision and are looking for a cloud enablement partner to guide you through devising a relevant strategy, implementing the migration and optimising as your business grows, reach out to BBD at http://www.bbdsoftware.com.

AMD 3rd Gen Epyc CPUs Put Intel Xeon SPs On Ice In The Datacenter – The Next Platform

SPONSORED Sometimes, bad things turn into excellent opportunities that can utterly transform markets. Many years hence, when someone writes the history of the datacenter compute business, they will judge AMD tapping Taiwan Semiconductor Manufacturing Corp to etch the cores in its second and third generation Epyc server processors to be extremely fortuitous. This allowed AMD to leapfrog Intel a generation ago and set itself up for a sustainable process lead while AMD had a parallel architectural advantage over its server CPU arch-rival.

We have not seen Intel knocked down so hard in the datacenter since AMD's 64-bit Opterons, with their integrated memory controllers, multicore architecture, HyperTransport interconnect, and other advanced features, made the 32-bit Xeon server chips look ridiculous in the early 2000s. It wasn't until Intel cloned many of the elements of the Opteron designs with its Nehalem Xeon E5500 processors in 2009 that it could field a server CPU that was technically and economically competitive with the Opteron alternatives.

History is repeating itself with the third generation Epyc 7003 series processors (formerly codenamed "Milan"), which came out in March of this year. (Our initial analysis of the SKU stacks is at this link and our deep dive into the Epyc 7003 architecture is here.) While Intel's Ice Lake Xeon SP server processors, also the third generation of its most recent family, are a big improvement over their predecessors, they do not even come close to matching the Epyc 7003 series processors when it comes to single-core or total socket throughput performance. And when it comes to price/performance and compatibility with existing server designs, AMD is winning this matchup against Intel in datacenter compute hands down. As we have said, Intel has improved considerably with its Ice Lake chips compared to the Skylake and Cascade Lake predecessors in the Xeon SP line. But AMD is cleaning its clocks. And caches. And vector units. And so on.

And now, we are finally getting the data to do competitive analysis pitting the AMD 3rd Gen Epyc chips against the Intel Ice Lake chips, and given how AMD is running a clean sweep, it is no surprise that Intel has brought back Pat Gelsinger to try to reinvigorate the Xeon SP lineup and save the server CPU business. AMD has broken through the 10 percent server shipment share after seven years of research, development, and product rollouts and seems poised to double that share and maybe more because the company will have a sustainable architecture and manufacturing process advantage. (Our best guess is that about a year from now, AMD will have 25 percent server shipment share with some big error bars around that number to take into account macroeconomic factors and Intels pricing and bundling reactions.)

"We are very excited about the momentum we are seeing across our customer base," Ram Peddibhotla, corporate vice president of product management for datacenter products at AMD, tells The Next Platform. "And if you look at the kind of total cost of ownership savings possible from 3rd Gen Epyc versus Ice Lake, you can plough that into your core business and you are able to bring efficiencies to the business across the board. I have said this before, and I will say it again. The risk actually lies in not adopting Epyc. And if you don't adopt Epyc, I think you are actually at a severe competitive disadvantage."

It is hard to argue that point at the server CPU level, particularly after you look at the performance comparisons we are going to do. And then let's add in the fact that AMD is working with technology partners to bring Epyc chips to bear on particular software stacks and solutions that are relevant to the enterprise. This will significantly reduce friction in deals and drive enterprise adoption like we have already seen with HPC centers, public cloud builders, and hyperscalers.

First, let's look at some relevant performance matchups, and we will start with the SPEC CPU benchmarks that gauge integer and floating point performance. These are table stakes in the server CPU market; if you can't deliver decent SPEC numbers, you won't get hyperscalers, cloud builders, and OEMs to answer the phone when you call. If you look at the SPECspeed2017 and SPECrate2017 tests (which come in one-socket and two-socket versions, with both integer and floating point performance ratings), AMD's Epyc processors have the number one ranking in all 16 possible categories. (SPECspeed2017 measures the time for workloads to complete while SPECrate2017 measures throughput per unit of time, so they are slightly different in this regard.) And on power efficiency tests, AMD has swept the SPECpower2008 benchmarks and has the top ranking on all but one of the SPEC CPU 2017 energy efficiency benchmarks. This is unprecedented, but could be the new normal for the next several generations of X86 server CPUs, and maybe even across all classes of server CPUs. In many cases, the second generation Epyc 7002 series processors can beat Intel's third generation Ice Lake Xeon SPs, and then the Epyc 7003s open an even larger gap. And here is the stunning thing that must have Intel fuming: AMD has now delivered better per core performance as well as better throughput up and down the SKU stack.

Here is how the top-bin parts compare, with Ice Lake Xeon SPs on the left, Epyc 7002s in the center, and Epyc 7003s on the right, on the SPECrate2017 integer, floating point, and Java benchmarks for two-socket systems:

The gap between Ice Lake and Epyc 7002 is bad enough for these top-bin systems, but the gap between Ice Lake and Epyc 7003 is large. On the integer test, the advantage to AMD is 47.2 percent, on the floating point test it is 36.5 percent, and on the SPECjbb2015 test it is 49.8 percent.

So how does it look at a constant number of cores, say perhaps 32 cores? Still not good for Intel. Here are the SPECrate2017 tests for 32-core Epyc 7002, 32-core Ice Lake, and 32-core Epyc 7003 parts:

The Ice Lake core has a tiny bit more oomph than the Epyc 7002 core it was intended to compete against, but Intel didn't make it into the field in time to do that, and the Epyc 7003 core, based on the Zen 3 design, has quite a bit more performance. Therefore, a 32-core Epyc 7003 chip can do 34.2 percent more integer work and 30.6 percent more floating point work than the 32-core Ice Lake chip.

Even if you scale down the Intel Ice Lake and AMD Epyc 7003 chips, the situation is still not great for Intel, as you can see here in this comparison showing integer performance on the SPECrate2017 test:

The message here is that if Intel wants to maintain shipments of its Xeon SPs, it will have to cut CPU prices and bundle in motherboards, NICs, FPGAs, and anything else it can in the deal to try to keep the revenue stream flowing. And even if it does this, Intel's Data Center Group margins will take a big hit, as they did in the first quarter of 2021. This is just the beginning of a potential price war and sustained technology campaign in the X86 server CPU market.

Here is a chart that shows how the Epyc 7002 and Epyc 7003 SKU stacks compare against the most common SKUs in the Intel Ice Lake Xeon SP stack, which makes it easier to see the competitive positioning.

"AMD purposely designed the Epyc server platform to have longevity while steadily increasing the value delivered in each generation of the Epyc family of processors," explains Peddibhotla. "Many servers in the market will continue to support the second generation Epyc and the new third generation Epyc, allowing them to co-exist as the latest generation enhances performance per core even further and adds other core-count options to meet varying workload needs. The entry market with 8 to 16 cores will deliver great value with the Epyc 7002 series with TCO-optimized volume. Per-core or high-density performance needs can be filled with the Epyc 7003. And the second generation Epyc is a great price/performance value at all available core counts."

Intel, by contrast, is making customers move from the Purley platform for Skylake and Cascade Lake Xeon SPs to the Whitley platform for Ice Lake and then the Eagle Stream platform for the future Sapphire Rapids fourth generation Xeon SPs.

Although raw performance on the SPEC tests is an important thing that all enterprises consider, what they want to know is how much more oomph can they get if they are upgrading servers that are several generations back, perhaps four years old. There is always a consolidation factor, but this one is playing out in favor of AMD:

As is usually the case, it will take far fewer servers to meet the same capacity, or much more capacity will be available in the same number of physical servers. In this case, for just under 4,000 aggregate SPECrate2017 integer units of performance, you can replace 20 two-socket Broadwell Xeon E5 v4 servers with five Epyc 7003-series Epyc 7763 servers to get the same performance, or install 20 servers and get 4X the performance. Assuming that the Intel Ice Lake and AMD Epyc 7003 servers shown above cost about the same, for the same number of servers, you will get around 50 percent more performance, which means you can cut about a third of the server count to get the same performance and spend a third less money, too.
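
A quick sanity check of the consolidation figures quoted in that paragraph, using only the article's own numbers (illustrative arithmetic only):

```python
# Back-of-the-envelope check of the 20-to-5 consolidation claim above.
target_capacity = 4000              # aggregate SPECrate2017 integer units needed
old_servers, new_servers = 20, 5    # Broadwell Xeon E5 v4 vs. Epyc 7763 counts quoted above

per_old = target_capacity / old_servers   # ~200 units per old two-socket server
per_new = target_capacity / new_servers   # ~800 units per new two-socket server

print(f"per-server speedup: {per_new / per_old:.1f}x")      # ~4.0x
print(f"capacity from 20 new servers: {20 * per_new:.0f}")  # ~16,000 units, i.e. 4x the old fleet
```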

You can dice and slice this a lot of different ways, of course.

Here is a deep TCO analysis over three years that shows how this might play out for 10,000 SPECrate2017 integer units of performance, showing the cost of acquiring the machines, administering them, and paying for datacenter space, power, and cooling. It bears out what we just said above:

AMD has fought a long time to get back to this position. And datacenters the world over should be grateful. We really needed some competition here.

Sponsored by AMD

10 mind-boggling things you should know about quantum physics

1. The quantum world is lumpy

The quantum world has a lot in common with shoes. You can't just go to a shop and pick out sneakers that are an exact match for your feet. Instead, you're forced to choose between pairs that come in predetermined sizes.

The subatomic world is similar. Albert Einstein won a Nobel Prize for proving that energy is quantized. Just as you can only buy shoes in multiples of half a size, so energy only comes in multiples of the same "quanta", hence the name quantum physics.

The quantum here is the Planck constant, named after Max Planck, the godfather of quantum physics. He was trying to solve a problem with our understanding of hot objects like the sun. Our best theories couldn't match the observations of the energy they kick out. By proposing that energy is quantized, he was able to bring theory neatly into line with experiment.
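
In symbols, Planck's proposal is that light of frequency ν exchanges energy only in whole-number multiples of a fixed quantum set by his constant:

```latex
E = n h \nu, \qquad n = 1, 2, 3, \ldots, \qquad h \approx 6.626 \times 10^{-34}\ \mathrm{J\,s}
```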

J. J. Thomson won the Nobel Prize in 1906 for his discovery that electrons are particles. Yet his son George won the Nobel Prize in 1937 for showing that electrons are waves. Who was right? The answer is both of them. This so-called wave-particle duality is a cornerstone of quantum physics. It applies to light as well as electrons. Sometimes it pays to think about light as an electromagnetic wave, but at other times it's more useful to picture it in the form of particles called photons.

A telescope can focus light waves from distant stars, and also acts as a giant light bucket for collecting photons. It also means that light can exert pressure as photons slam into an object. This is something we already use to propel spacecraft with solar sails, and it may be possible to exploit it in order to maneuver a dangerous asteroid off a collision course with Earth, according to Rusty Schweickart, chairman of the B612 Foundation.

Wave-particle duality is an example of superposition. That is, a quantum object existing in multiple states at once. An electron, for example, is both here and there simultaneously. It's only once we do an experiment to find out where it is that it settles down into one or the other.

This makes quantum physics all about probabilities. We can only say which state an object is most likely to be in once we look. These odds are encapsulated into a mathematical entity called the wave function. Making an observation is said to collapse the wave function, destroying the superposition and forcing the object into just one of its many possible states.

This idea is behind the famous Schrödinger's cat thought experiment. A cat in a sealed box has its fate linked to a quantum device. As the device exists in both states until a measurement is made, the cat is simultaneously alive and dead until we look.

The idea that observation collapses the wave function and forces a quantum choice is known as the Copenhagen interpretation of quantum physics. However, it's not the only option on the table. Advocates of the many worlds interpretation argue that there is no choice involved at all. Instead, at the moment the measurement is made, reality fractures into two copies of itself: one in which we experience outcome A, and another where we see outcome B unfold. It gets around the thorny issue of needing an observer to make stuff happen (does a dog count as an observer, or a robot?).

Instead, as far as a quantum particle is concerned, there's just one very weird reality consisting of many tangled-up layers. As we zoom out towards the larger scales that we experience day to day, those layers untangle into the worlds of the many worlds theory. Physicists call this process decoherence.

Danish physicist Niels Bohr showed us that the orbits of electrons inside atoms are also quantized. They come in predetermined sizes called energy levels. When an electron drops from a higher energy level to a lower energy level, it spits out a photon with an energy equal to the size of the gap. Equally, an electron can absorb a particle of light and use its energy to leap up to a higher energy level.

Astronomers use this effect all the time. We know what stars are made of because when we break up their light into a rainbow-like spectrum, we see colors that are missing. Different chemical elements have different energy level spacings, so we can work out the constituents of the sun and other stars from the precise colors that are absent.
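
The bookkeeping behind those missing colors is a single relation: the photon emitted or absorbed when an electron jumps between two levels carries exactly the energy difference between them.

```latex
E_{\text{photon}} = E_{\text{upper}} - E_{\text{lower}} = h\nu = \frac{hc}{\lambda}
```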

The sun makes its energy through a process called nuclear fusion. It involves two protons (the positively charged particles in an atom) sticking together. However, their identical charges make them repel each other, just like two north poles of a magnet. Physicists call this the Coulomb barrier, and it's like a wall between the two protons.

Think of protons as particles and they just collide with the wall and move apart: no fusion, no sunlight. Yet think of them as waves, and it's a different story. When the wave's crest reaches the wall, the leading edge has already made it through. The wave's height represents where the proton is most likely to be. So although it is unlikely to be where the leading edge is, it is there sometimes. It's as if the proton has burrowed through the barrier, and fusion occurs. Physicists call this effect "quantum tunneling".
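
For a simple rectangular barrier of height V0 and width L, the standard textbook estimate says the tunneling probability for a particle of mass m and energy E below V0 falls off exponentially with the barrier's width and height:

```latex
T \;\approx\; e^{-2\kappa L}, \qquad \kappa = \frac{\sqrt{2m\,(V_0 - E)}}{\hbar}
```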

Eventually fusion in the sun will stop and our star will die. Gravity will win and the sun will collapse, but not indefinitely. The smaller it gets, the more material is crammed together. Eventually a rule of quantum physics called the Pauli exclusion principle comes into play. This says that it is forbidden for certain kinds of particles such as electrons to exist in the same quantum state. As gravity tries to do just that, it encounters a resistance that astronomers call degeneracy pressure. The collapse stops, and a new Earth-sized object called a white dwarf forms.

Degeneracy pressure can only put up so much resistance, however. If a white dwarf grows and approaches a mass equal to 1.4 suns, it triggers a wave of fusion that blasts it to bits. Astronomers call this explosion a Type Ia supernova, and it's bright enough to outshine an entire galaxy.

A quantum rule called the Heisenberg uncertainty principle says that it's impossible to perfectly know two properties of a system simultaneously. The more accurately you know one, the less precisely you know the other. This applies to momentum and position, and separately to energy and time.

It's a bit like taking out a loan. You can borrow a lot of money for a short amount of time, or a little cash for longer. This leads us to virtual particles. If enough energy is borrowed from nature then a pair of particles can fleetingly pop into existence, before rapidly disappearing so as not to default on the loan.
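
The two pairings mentioned above are usually written as the position-momentum and energy-time uncertainty relations; the second one is the "loan agreement" that virtual particles exploit.

```latex
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad \Delta E\,\Delta t \;\gtrsim\; \frac{\hbar}{2}
```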

Stephen Hawking imagined this process occurring at the boundary of a black hole, where one particle escapes (as Hawking radiation), but the other is swallowed. Over time the black hole slowly evaporates, as it's not paying back the full amount it has borrowed.

Our best theory of the universe's origin is the Big Bang. Yet it was modified in the 1980s to include another theory called inflation. In the first trillionth of a trillionth of a trillionth of a second, the cosmos ballooned from smaller than an atom to about the size of a grapefruit. That's a whopping 10^78 times bigger. Inflating a red blood cell by the same amount would make it larger than the entire observable universe today.

As it was initially smaller than an atom, the infant universe would have been dominated by quantum fluctuations linked to the Heisenberg uncertainty principle. Inflation caused the universe to grow rapidly before these fluctuations had a chance to fade away. This concentrated energy into some areas rather than others, something astronomers believe acted as seeds around which material could gather to form the clusters of galaxies we observe now.

As well as helping to prove that light is quantum, Einstein argued in favor of another effect that he dubbed "spooky action at a distance". Today we know that this quantum entanglement is real, but we still don't fully understand what's going on. Let's say that we bring two particles together in such a way that their quantum states are inexorably bound, or entangled. One is in state A, and the other in state B.

The Pauli exclusion principle says that they can't both be in the same state. If we change one, the other instantly changes to compensate. This happens even if we separate the two particles from each other on opposite sides of the universe. It's as if information about the change we've made has traveled between them faster than the speed of light, something Einstein said was impossible.

Rochester researchers join national initiative to advance quantum science – University of Rochester

July 30, 2021

Todd Krauss, chair of the Department of Chemistry at the University of Rochester, and his fellow researchers are joining a $73 million initiative, funded by the US Department of Energy, to advance quantum science and technology. Krauss's project, "Understanding coherence in light-matter interfaces for quantum science," is one of 29 projects intended to help scientists better understand and harness the quantum world in order to eventually benefit people and society.

"It's exciting to see the University recognized for its work in the emerging field of quantum information science," says Krauss.

The University has a long history in quantum science, dating back to physicist Leonard Mandel, considered a pioneer in quantum optics, in the 1960s. And Krauss says he and his colleagues are now building on the work of Mandel and other giants at Rochester, as well as leveraging the talents of the University's current crop of quantum researchers.

"Quantum science represents the next technological revolution and frontier in the Information Age, and America stands at the forefront," said Secretary of Energy Jennifer M. Granholm as part of the DOE's announcement of the funding. "At DOE, we're investing in the fundamental research, led by universities and our National Labs, that will enhance our resiliency in the face of growing cyber threats and climate disasters, paving the path to a cleaner, more secure future."

One of the principal challenges in this line of research, explains Krauss, is that quantum states of matter are typically stable only at temperatures below 10 degrees Kelvin; that's roughly minus 441 degrees Fahrenheit. By comparison, the coldest recorded temperature on Earth was minus 128.6 degrees Fahrenheit, at Russia's Vostok station in Antarctica in 1983. If stability can be achieved at room temperature, then the benefits of quantum applications can be realized on a broader scale.
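
The Kelvin-to-Fahrenheit arithmetic behind that figure is straightforward:

```latex
T_{\mathrm{F}} = \left(T_{\mathrm{K}} - 273.15\right)\times\tfrac{9}{5} + 32,
\qquad 10\ \mathrm{K} \;\longrightarrow\; (10 - 273.15)\times 1.8 + 32 \approx -441.7\ ^{\circ}\mathrm{F}
```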

More robust quantum states could yield exponentially faster computers, extremely responsive chemical or biological sensors, as well as more secure communication systems, an area that Krauss's project is focused on. "In quantum state communications, it will be possible to know when someone else is monitoring your messaging," says Krauss.

Krauss is being awarded $1.95 million over three years for his project on light-matter interfaces. "Basically," says Krauss, "we're sticking colloidal nanoparticles into optical cavities in order to interact the nanoparticles with the quantum light of the cavity." The work will be divided among four researchers.

"We are excited to be taking the field of quantum optics in completely new and uncharted directions with our studies of the quantum optics of nanoparticles," says Krauss.

Scientists create the world's thinnest magnet – University of California

The development of an ultrathin magnet that operates at room temperature could lead to new applications in computing and electronics such as high-density, compact spintronic memory devices and new tools for the study of quantum physics.

The ultrathin magnet, which was recently reported in the journal Nature Communications, could enable big advances in next-gen memory devices, computing, spintronics, and quantum physics. It was discovered by scientists at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley.

"We're the first to make a room-temperature 2D magnet that is chemically stable under ambient conditions," said senior author Jie Yao, a faculty scientist in Berkeley Lab's Materials Sciences Division and associate professor of materials science and engineering at UC Berkeley.

"This discovery is exciting because it not only makes 2D magnetism possible at room temperature, but it also uncovers a new mechanism to realize 2D magnetic materials," added Rui Chen, a UC Berkeley graduate student in the Yao Research Group and lead author on the study.

The magnetic component of today's memory devices is typically made of magnetic thin films. But at the atomic level, these materials are still three-dimensional, hundreds or thousands of atoms thick. For decades, researchers have searched for ways to make thinner and smaller 2D magnets and thus enable data to be stored at a much higher density.

Previous achievements in the field of 2D magnetic materials have brought promising results. But these early 2D magnets lose their magnetism and become chemically unstable at room temperature.

"State-of-the-art 2D magnets need very low temperatures to function. But for practical reasons, a data center needs to run at room temperature," Yao said. "Our 2D magnet is not only the first that operates at room temperature or higher, but it is also the first magnet to reach the true 2D limit: It's as thin as a single atom!"

The researchers say that their discovery will also enable new opportunities to study quantum physics. "It opens up every single atom for examination, which may reveal how quantum physics governs each single magnetic atom and the interactions between them," Yao said.

The researchers synthesized the new 2D magnet, called a cobalt-doped van der Waals zinc-oxide magnet, from a solution of graphene oxide, zinc, and cobalt.

Just a few hours of baking in a conventional lab oven transformed the mixture into a single atomic layer of zinc-oxide with a smattering of cobalt atoms sandwiched between layers of graphene.

In a final step, the graphene is burned away, leaving behind just a single atomic layer of cobalt-doped zinc-oxide.

"With our material, there are no major obstacles for industry to adopt our solution-based method," said Yao. "It's potentially scalable for mass production at lower costs."

To confirm that the resulting 2D film is just one atom thick, Yao and his team conducted scanning electron microscopy experiments at Berkeley Lab's Molecular Foundry to identify the material's morphology, and transmission electron microscopy (TEM) imaging to probe the material atom by atom.

X-ray experiments at Berkeley Lab's Advanced Light Source characterized the 2D material's magnetic parameters under high temperature.

Additional X-ray experiments at SLAC National Accelerator Laboratory's Stanford Synchrotron Radiation Lightsource verified the electronic and crystal structures of the synthesized 2D magnets. And at Argonne National Laboratory's Center for Nanoscale Materials, the researchers employed TEM to image the 2D material's crystal structure and chemical composition.

The researchers found that the graphene-zinc-oxide system becomes weakly magnetic with a 5 to 6 percent concentration of cobalt atoms. Increasing the concentration of cobalt atoms to about 12 percent results in a very strong magnet.

To their surprise, a concentration of cobalt atoms exceeding 15 percent shifts the 2D magnet into an exotic quantum state of frustration, whereby different magnetic states within the 2D system are in competition with each other.
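
As a rough illustration of those reported regimes, here is a toy Python lookup built only from the percentages quoted above. The cutoffs and labels paraphrase the article, the behaviour between roughly 6 and 12 percent is not spelled out in the text, and nothing here comes from the underlying paper:

    # Toy classifier for the doping regimes described above; illustrative only.

    def magnetic_regime(cobalt_percent: float) -> str:
        """Map a cobalt doping percentage to the regime described in the article."""
        if cobalt_percent > 15:
            return "frustrated: competing magnetic states"
        if cobalt_percent >= 12:
            return "strongly magnetic"
        if cobalt_percent >= 5:
            return "weakly to moderately magnetic"
        return "not described in the article"

    for concentration in (3, 5.5, 12, 16):
        print(concentration, "percent cobalt ->", magnetic_regime(concentration))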

And unlike previous 2D magnets, which lose their magnetism at room temperature or above, the researchers found that the new 2D magnet not only works at room temperature but also at 100 degrees Celsius (212 degrees Fahrenheit).

"Our 2D magnetic system shows a distinct mechanism compared to previous 2D magnets," said Chen. "And we think this unique mechanism is due to the free electrons in zinc oxide."

When you command your computer to save a file, that information is stored as a series of ones and zeroes in the computer's memory, such as a magnetic hard drive or flash memory.

And like all magnets, magnetic memory devices contain microscopic magnets with two poles, north and south, the orientations of which follow the direction of an external magnetic field. Data is written or encoded when these tiny magnets are flipped to the desired directions.
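
A cartoon of that encoding in Python may help; the "north"/"south" labels and function names are ours, invented for this sketch, and real devices of course store magnetization directions rather than strings:

    # Encode a bit string as magnet orientations and read it back; purely illustrative.

    def write_bits(bits: str) -> list:
        """Map each bit to the orientation of one tiny magnet: 1 -> north, 0 -> south."""
        return ["north" if b == "1" else "south" for b in bits]

    def read_bits(magnets: list) -> str:
        """Recover the bit string from the stored orientations."""
        return "".join("1" if m == "north" else "0" for m in magnets)

    stored = write_bits("01101000")   # the ASCII code for the letter 'h'
    assert read_bits(stored) == "01101000"
    print(stored)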

According to Chen, zinc oxide's free electrons could act as an intermediary that ensures the magnetic cobalt atoms in the new 2D device continue pointing in the same direction, and thus stay magnetic, even when the host, in this case the semiconductor zinc oxide, is a nonmagnetic material.

"Free electrons are constituents of electric currents. They move in the same direction to conduct electricity," Yao added, comparing the movement of free electrons in metals and semiconductors to the flow of water molecules in a stream of water.

The new material, which can be bent into almost any shape without breaking and is a million times thinner than a sheet of paper, could help advance the application of spin electronics or spintronics, a new technology that uses the orientation of an electron's spin rather than its charge to encode data. "Our 2D magnet may enable the formation of ultra-compact spintronic devices to engineer the spins of the electrons," Chen said.

"I believe that the discovery of this new, robust, truly two-dimensional magnet at room temperature is a genuine breakthrough," said co-author Robert Birgeneau, a faculty senior scientist in Berkeley Lab's Materials Sciences Division and professor of physics at UC Berkeley who co-led the study.

"Our results are even better than what we expected, which is really exciting. Most of the time in science, experiments can be very challenging," Yao said. "But when you finally realize something new, it's always very fulfilling."

Co-authors on the paper include researchers from Berkeley Lab, including Alpha N'Diaye and Padraic Shafer of the Advanced Light Source; UC Berkeley; UC Riverside; Argonne National Laboratory; and Nanjing University and the University of Electronic Science and Technology of China.

The Advanced Light Source and Molecular Foundry are DOE national user facilities at Berkeley Lab.

The Stanford Synchrotron Radiation Lightsource is a DOE national user facility at SLAC National Accelerator Laboratory.

The Center for Nanoscale Materials is a DOE national user facility at Argonne National Laboratory.

This work was funded by the DOE Office of Science, the Intel Corporation, and the Bakar Fellows Program at UC Berkeley.

Here is the original post:

News Scientists create the world's thinnest magnet - University of California

Read More..

Google’s ‘time crystals’ could be the greatest scientific achievement of our lifetimes – The Next Web

Eureka! A research team featuring dozens of scientists working in partnership with Google's quantum computing labs may have created the world's first time crystal inside a quantum computer.

This is the kind of news that makes me want to jump up and do a happy dance.

These scientists may have produced an entirely new phase of matter. I'm going to do my best to explain what that means and why I personally believe this is the most important scientific breakthrough in our lifetimes.

However, for the sake of clarity, there are two points I need to make first:

In colloquial terms, it's a big "screw you" to Sir Isaac Newton.

Time crystals are a new phase of matter. For the sake of simplicity, let's imagine a cube of ice.

When you put a cube of ice in a glass of water, you're introducing two separate entities (the ice cube and the liquid water) to each other at two different temperatures.

Everyone knows that the water will get colder (that's why we put the ice in there) and, over time, the ice will get warmer and turn into water. Eventually you'll just have a glass of room-temperature water.

We call this process thermal equilibrium.
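
As a toy illustration of that equilibration, here is a short Python sketch; the temperatures, exchange rate, and implicit equal heat capacities are invented for the example, and melting (latent heat) is ignored entirely, so this is a cartoon rather than real thermodynamics:

    # Two bodies at different temperatures exchange heat until they converge.

    water_temp = 22.0   # room-temperature water, degrees Celsius
    ice_temp = -5.0     # the ice cube, degrees Celsius
    rate = 0.05         # arbitrary heat-exchange coefficient per step

    for step in range(200):
        flow = rate * (water_temp - ice_temp)   # heat flows from the warmer body to the colder one
        water_temp -= flow
        ice_temp += flow

    print(round(water_temp, 2), round(ice_temp, 2))   # both settle near the same temperature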

Most people are familiar with Newton's first law of motion; it's the one that says an object at rest tends to stay at rest and an object in motion tends to stay in motion.

An important side-effect of this law of physics is that it means a perpetual motion machine is classically impossible.

According to classical physics, the universe is always moving toward greater entropy. In other words: if we isolate an ice cube and a room-temperature glass of water from all other external forces, the water will always melt the ice cube.

The entropy (roughly, a system's degree of disorder) of any system will remain the same if no processes occur, and it will always increase if processes do occur.

Since our universe has stars exploding, black holes sucking in matter, and people lighting things on fire (chemical processes), entropy is always increasing.

Except when it comes to time crystals. Time crystals don't give a damn what Newton or anyone else thinks. They're lawbreakers and heart takers. They can, theoretically, maintain entropy even when they're used in a process.

Think about a crystal you're familiar with, such as a snowflake. Snowflakes aren't just beautiful because each one is unique; they're also fascinating formations that nearly break the laws of physics themselves.

Crystalline structures form in the physical world because, for whatever fundamental scientific reason, the atoms within them "want" to exist at certain exact points.

"Want" is a really weird word to use when we're talking about atoms (I'm certainly not implying they're sentient), but it's hard to describe the tendency toward crystalline structures without reaching for abstractions such as "why."

A time crystal is a new phase of matter that, simplified, would be like having a snowflake that constantly cycled back and forth between two different configurations. It's a seven-pointed lattice one moment and a ten-pointed lattice the next, or whatever.

What's amazing about time crystals is that when they cycle back and forth between two different configurations, they don't lose or use any energy.

Time crystals can survive energy processes without falling victim to entropy. The reason they're called "time crystals" is because they can have their cake and eat it too.

They can be in a state of having eaten the whole cake, and then cycle right back to a state of still having the cake, and they can, theoretically, do this forever and ever.

Most importantly, they can do this inside of an isolated system. That means they can consume the cake and then magically make it reappear over and over again forever, without using any fuel or energy.
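
To make the cake metaphor concrete, here is a deliberately crude Python cartoon of a system that flips between two configurations on every tick while its energy ledger stays at zero. This is only an analogy in code, with made-up state names, and says nothing about the actual physics of the experiment:

    # A two-configuration system that cycles forever without spending energy; illustrative only.

    state = "lattice_A"
    energy_spent = 0.0

    for tick in range(6):
        state = "lattice_B" if state == "lattice_A" else "lattice_A"
        energy_spent += 0.0   # the strange, defining property: the cycling costs nothing
        print(tick, state, energy_spent)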

Literally everyone should care. As I wrote back in 2018, time crystals could be the miracle quantum computing needs.

Nearly every far-future tech humans can imagine, from teleportation to warp drives and from artificial food synthesizers to perpetual motion reactors capable of powering the world without burning fuels or harnessing energy, will require quantum computing systems.

Quantum computers can solve really hard problems. Unfortunately, they're brittle. It's hard to build them, hard to maintain them, hard to get them to do anything, and even harder to interpret the results they give. This is because of something called decoherence, which works a lot like entropy.

Computer bits in the quantum world, qubits, share a funky feature of quantum mechanics that makes them act differently when observed than when they're left alone. That sort of makes any direct measurements of qubit states (reading the computer's output) difficult.
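
Here is a minimal sketch, in plain Python rather than any quantum SDK, of why reading that output is statistical: measuring a qubit returns a random 0 or 1 drawn from its amplitudes, and the amplitudes chosen below are arbitrary example values:

    # Simulate measuring a single qubit in superposition; illustrative only.

    import random

    def measure(amp0: complex, amp1: complex) -> int:
        """Collapse the state amp0|0> + amp1|1> to a classical bit, 0 or 1."""
        p0 = abs(amp0) ** 2 / (abs(amp0) ** 2 + abs(amp1) ** 2)
        return 0 if random.random() < p0 else 1

    # An equal superposition: repeated runs give roughly half 0s and half 1s,
    # so the "answer" has to be estimated from many measurements.
    results = [measure(2 ** -0.5, 2 ** -0.5) for _ in range(1000)]
    print(sum(results) / len(results))   # close to 0.5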

But time crystals "want" to be coherent. So putting them inside a quantum computer and using them to conduct computer processes could potentially serve an incredibly important function: ensuring quantum coherence.

No. No, no, no, no no. Don't get me wrong. This is baby steps. This is infancy research. This is Antony van Leeuwenhoek becoming the first person to use a microscope to look at a drop of water under magnification.

What Google's done, potentially, is prove that humans can manufacture time crystals. In the words of the researchers themselves:

"These results establish a scalable approach to study non-equilibrium phases of matter on current quantum processors."

Basically they believe they've proven the concept, so now it's time to see what can be done with it.

Time crystals have always been theoretical. And by "always," I mean: since 2012, when they were first hypothesized.

If Google's actually created time crystals, it could accelerate the timeline for quantum computing breakthroughs from "maybe never" to "maybe within a few decades."

At the far-fetched, super-optimistic end of things we could see the creation of a working warp drive in our lifetimes. Imagine taking a trip to Mars or the edge of our solar system, and being back home on Earth in time to catch the evening news.

And, even on the conservative end with more realistic expectations, its not hard to imagine quantum computing-based chemical and drug discovery leading to universally-effective cancer treatments.

This could be the big eureka we've all been waiting for. I can't wait to see what happens in peer review.

If you want to know more, you can read Google's paper here. And if you're looking for a technical deep dive into the scientific specifics of what the researchers accomplished in the lab, this piece on Quanta Magazine by Natalie Wolchover is the bee's knees.

Read the original here:

Google's 'time crystals' could be the greatest scientific achievement of our lifetimes - The Next Web

Read More..

Nonprofits Get a New Type of Donation: Cryptocurrency – The New York Times

Still, she said, there are plenty of resources, like the Giving Block, that allow people to donate cryptocurrency and nonprofits to receive it safely and relatively easily.

Donor-advised funds, which allow people to make donations today for tax purposes and recommend charitable grants at a later date, have seen an increase in cryptocurrency donations. Among them are Fidelity Charitable, the largest donor-advised fund in the United States, with over $35 billion in assets, and its main competitor, Schwab Charitable, with over $17 billion.

So far this year, Fidelity Charitable has received $150 million in cryptocurrency, up from $28 million for all of 2020 and $13 million in 2019, said a spokesman, Stephen Austin. "The appreciated value of cryptocurrency is prompting more donors to use this asset to fund their charitable giving as well as increasing the average size of each contribution," he said.

What neither Fidelity Charitable nor Schwab Charitable does is manage the cryptocurrency, meaning that they sell it and put marketable securities or cash into the client's donor-advised fund.

"Generally, charities are conservative with how they want to manage assets," said Todd Eckler, executive director of Fiduciary Trust Charitable, a donor-advised fund that has about $250 million in assets and does not have cryptocurrency abilities. "You could see the value evaporate pretty quickly. It's highly volatile, and it's not a good fit for many charitable institutions."

For Mr. Zeller, who helped broker the Bitcoin donation at Penn, the ability to accept cryptocurrency is what matters most.

"It's very nice to have the capacity to do it when a donor says, 'I have some Bitcoin,'" he said. "We can accept it now without it grinding the university to a halt."

View post:
Nonprofits Get a New Type of Donation: Cryptocurrency - The New York Times

Read More..

Bitcoin-based scams mean the federal government now needs a crypto bank – Vox.com

Due to a surge of cryptocurrency-fueled crimes, federal law enforcement is seizing a lot of bitcoin. Now the US government is figuring out what to do with all of it.

This week, a small platform for safekeeping cryptocurrency called Anchorage Digital announced it had won a contract from the Department of Justice to store and liquidate digital assets that federal law enforcement seizes following criminal investigations. The government has essentially hired a bank to store and sell billions of dollars worth of forfeited cryptocurrency, including troves of bitcoin and ethereum. Anchorage Digital, which is based in San Francisco, is an obvious choice for a partner, as it's the first federally chartered bank for crypto.

"There's no traditional bank that actually offers these services because this is extremely complex from a technical perspective," Diogo Monica, Anchorage's co-founder and president, told Recode. "It's very hard to store these safely. In fact, there are many, many stories of people losing access to their bitcoin and other cryptocurrency wallets and just losing access completely to them without the ability to be recovered."

That the US Marshals Service needs to hire a cryptocurrency company for help is a reminder that, as these kinds of digital assets go mainstream, they're also becoming more popular with criminals. In fact, as law enforcement shuts down illegal cryptocurrency operations, from ransomware schemes to illegal online markets, it's clear that the US government could hold a very large amount of bitcoin, ethereum, and other cryptocurrency. Accordingly, Uncle Sam might even become a more significant player in the crypto marketplace in the months and years to come.

Since its creation, cryptocurrency has been popular with criminals because the accounts and transactions are difficult to trace back to any one person. Now crypto is at the center of a wide swath of illegal schemes, including blackmail scams, Covid-19 vaccine counterfeits, money laundering operations, and illicit sales on the darknet. In the first half of this year, people sent more than $2 million worth of cryptocurrency to Elon Musk impersonators following a grift on social media, according to the Federal Trade Commission (FTC). And earlier this month, a Swedish man was sentenced to 15 years in prison after he pleaded guilty to orchestrating one of the largest cryptocurrency-based Ponzi schemes the US government has ever prosecuted. The man had tricked people into sending him bitcoin, as well as other digital payments, under the guise of a (fake) gold-backed investment opportunity.

"Cryptocurrency is not government currency, so it's very international in scope, which is why it has become even more popular with transnational organized crime, as well as terrorism," said Suzanne Lynch, a Utica College professor who focuses on economic crime.

Through investigating these crimes and prosecuting the perpetrators, federal law enforcement has acquired a sizable cache of cryptocurrency. In June, the DOJ seized about $2.3 million worth of bitcoin the FBI had obtained after tracking the movement of a ransom payment associated with the Colonial Pipeline cyberattack earlier this summer. This was after the agency seized about $1 billion in cryptocurrency that once belonged to Ross Ulbricht, creator of the online black market Silk Road, which federal officials shut down in 2013. Ulbricht was arrested that year and convicted in 2015 of distributing narcotics and money laundering.

"There's no differentiation here between crypto and an oil tanker, for lack of a better example, or car or fiat [currency], when it comes to how it will ultimately be used in an asset forfeiture regime," said Ari Redbord, a former prosecutor and the head of government affairs at TRM, a cryptocurrency fraud detection startup.

The US Marshals Service is the agency in charge of holding and auctioning off many seized assets, including art, rare collectibles, and real estate, from disgraced pharmaceuticals CEO Martin Shkreli's Wu-Tang album to Bernie Madoff's apartments. Since at least 2014, the DOJ's asset forfeiture program, which is run by the marshals, has taken the same approach with cryptocurrency and opened up the stores of crypto it seizes to bids from the public. But the Marshals Service announced in 2019 that it was looking for more help managing all these digital assets.

"Pricing, how to price them, how to evaluate it, how to liquidate it, how to safekeep it: people are being forced to deal with the asset class because it's so prevalent now," Monica, of Anchorage, told Recode. To do that well can be especially tricky since cryptocurrency markets can be extremely volatile.

As the DOJ moves forward with its plan to manage digital assets, calls for tighter regulations on cryptocurrency are coming from higher and higher up. Sen. Elizabeth Warren (D-MA), for instance, said this month that cryptocurrencies should face tighter rules, while some senators recently proposed taxing cryptocurrency transactions to fund President Joe Biden's infrastructure plan. Earlier this month, Federal Reserve Chair Jerome Powell even suggested that the federal government could launch a digital version of the US dollar as an alternative to cryptocurrencies, though he's still undecided on whether that's a good idea.

Despite lawmakers' and regulators' growing concern about cryptocurrencies, their popularity is forcing the government to adapt. One recent survey from NORC, a research institute at the University of Chicago, found that 13 percent of people in the US bought or traded crypto in the past year alone, compared to the estimated half of US households that have invested in the stock market, according to Pew.

This all serves as a reminder that cryptocurrencies are only becoming more prevalent, which means that crypto scammers aren't going away anytime soon. So beware of demands for cryptocurrency payments from fishy romantic prospects, too-good-to-be-true investment opportunities, supposed blackmailers, and people claiming to be Elon Musk. If you're not careful, your bitcoin might end up in the federal government's new crypto bank.

See the original post:
Bitcoin-based scams mean the federal government now needs a crypto bank - Vox.com

Read More..