
Measure to attract more cryptocurrency mining facilities in Kentucky passes the House – The Lane Report

FRANKFORT, Ky. The House of Representatives passed a measure Wednesday that positions Kentucky as an attractive location for future economic investment by companies engaged in cryptocurrency mining. The bill, HB 230, would allow cryptocurrency businesses to qualify for tax exemptions, particularly from the sales tax levied on electricity.

House Majority Floor Leader Steven Rudy of Paducah is the bill's primary sponsor, and Representative Chris Freeland of Benton serves as the primary co-sponsor.

"Cryptocurrency is a new, interesting and highly sophisticated industry that's getting a lot of international attention," Rudy said. "Mining for cryptocurrency is highly technical, and it is a highly sophisticated industry. Available jobs in this industry can be lucrative and are increasing rapidly. As we look to building our economy, we have to be in the right position to welcome the jobs of tomorrow."

Essentially, cryptocurrency is a digital currency that is exchanged between peers without the need for a third party. It enables consumers to connect directly through a transparent digital process that shows the financial amount, but not the identities of the people conducting the transaction. The network consists of a chain of computers, which are all required to approve a cryptocurrency exchange and prevent duplication of the same transaction. Because of its transparency, this type of transaction has the potential to reduce fraud.

The cryptocurrency procedure uses digital safeguards to ensure the security of transactions. In addition, each transaction must be confirmed in a digital public ledger, called a blockchain, through a process known as mining. Cryptocurrency exchange is somewhat similar to online payment systems such as PayPal and Venmo, except the currency being exchanged is not traditional money.

In 2019, Core Scientific, one of the largest blockchain hosting providers in the United States, opened a facility in Calvert City. "The facility in Calvert City brought an industry with new innovation and jobs to my district," Representative Chris Freeland, R-Benton, added. "As this industry continues to grow, this effort has great potential for our commonwealth."

HB 230 now moves to the Senate for consideration. To review co-sponsors and specific details of the measure, please visit the Legislative Research Commission website or follow the link here.

Read the rest here:
Measure to attract more cryptocurrency mining facilities in Kentucky passes the House - The Lane Report

Read More..

Nvidia Wins Lawsuit Over $1 Billion in Cryptocurrency Mining-Related Sales – Tom’s Hardware

On Tuesday, U.S. District Court Judge Haywood Gilliam dismissed a lawsuit alleging that Nvidia misled investors over $1 billion in sales to cryptocurrency miners.

The lawsuit claimed that roughly 60-70% of Nvidia's sales in China, its largest market, were to miners in 2017 and 2018. That alone might not have been an issue, but the company was accused of keeping the extent of the mining industry's influence on its success a secret from investors by attributing those sales to its Gaming division.

Nvidia didn't share information specifically related to cryptocurrency mining until the first quarter of 2018, and that was to warn investors that it expected those sales to decline by 66% the following quarter, largely because of the crypto market bust. The disclosure caused a 7.85% drop in the company's share price despite record profits.

It's not hard to see why some Nvidia shareholders were upset about the news. But it wasn't exactly a secret that GeForce-branded graphics cards were popular with miners, either, despite the fact that they were originally developed for PC gaming. That appears to be why Gilliam sided with Nvidia by dismissing the lawsuit.

Gilliam essentially said in the filing that the plaintiffs failed to provide adequate evidence that Nvidia misled investors throughout 2017 and 2018. The company acknowledged that some of the sales of GeForce products were to miners, even if it didn't provide exact figures, and that appears to have satisfied Gilliam.

Mining remains a lucrative business for Nvidia: the company estimated that between $100 million and $300 million of its Q4 2020 revenues were from sales to miners. That variance shows two things. The first is that Nvidia still can't determine exactly how much of its sales can be attributed to people mining cryptocurrency.

The second is that mining remains a relatively small aspect of Nvidia's business. The company reported $5 billion in revenues, $2.5 billion of which came from the Gaming division, last quarter. Even if the $300 million attributed to miners is a conservative estimate, the vast majority of Nvidia's revenues came from elsewhere.

That probably won't be particularly comforting to enthusiasts competing with cryptocurrency miners over the short supply of available graphics cards (and gaming notebooks) for their builds. It should help Nvidia shareholders understand the mining industry's effect on the company, though, so it's still a win of sorts.

Read the rest here:
Nvidia Wins Lawsuit Over $1 Billion in Cryptocurrency Mining-Related Sales - Tom's Hardware

Read More..

What is Machine Learning? | IBM

Machine learning focuses on applications that learn from experience and improve their decision-making or predictive accuracy over time.

Machine learning is a branch of artificial intelligence (AI) focused on building applications that learn from data and improve their accuracy over time without being programmed to do so.

In data science, an algorithm is a sequence of statistical processing steps. In machine learning, algorithms are 'trained' to find patterns and features in massive amounts of data in order to make decisions and predictions based on new data. The better the algorithm, the more accurate the decisions and predictions will become as it processes more data.

Today, examples of machine learning are all around us. Digital assistants search the web and play music in response to our voice commands. Websites recommend products and movies and songs based on what we bought, watched, or listened to before. Robots vacuum our floors while we do . . . something better with our time. Spam detectors stop unwanted emails from reaching our inboxes. Medical image analysis systems help doctors spot tumors they might have missed. And the first self-driving cars are hitting the road.

We can expect more. As big data keeps getting bigger, as computing becomes more powerful and affordable, and as data scientists keep developing more capable algorithms, machine learning will drive greater and greater efficiency in our personal and work lives.

There are four basic steps for building a machine learning application (or model). These are typically performed by data scientists working closely with the business professionals for whom the model is being developed.

Training data is a data set representative of the data the machine learning model will ingest to solve the problem it's designed to solve. In some cases, the training data is labeled data, tagged to call out features and classifications the model will need to identify. Other data is unlabeled, and the model will need to extract those features and assign classifications on its own.

In either case, the training data needs to be properly prepared: randomized, de-duped, and checked for imbalances or biases that could impact the training. It should also be divided into two subsets: the training subset, which will be used to train the application, and the evaluation subset, used to test and refine it.
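To make that preparation step concrete, here is a minimal Python sketch; the feature values, the 80/20 split ratio, and the function name are illustrative assumptions rather than anything prescribed by the article.

```python
import numpy as np

def prepare(features, labels, eval_fraction=0.2, seed=42):
    # De-duplicate identical rows (a simple stand-in for real data cleaning).
    data = np.unique(np.column_stack([features, labels]), axis=0)
    # Randomize the order so the split is not biased by how the data was collected.
    rng = np.random.default_rng(seed)
    rng.shuffle(data)
    # Hold out a fraction of the rows for evaluation.
    n_eval = int(len(data) * eval_fraction)
    eval_set, train_set = data[:n_eval], data[n_eval:]
    return train_set, eval_set

X = np.random.rand(100, 3)                               # 100 examples, 3 features (invented)
y = (X.sum(axis=1) > 1.5).astype(float).reshape(-1, 1)   # a toy label
train_set, eval_set = prepare(X, y)
print(len(train_set), "training rows,", len(eval_set), "evaluation rows")
```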

Again, an algorithm is a set of statistical processing steps. The type of algorithm depends on the type (labeled or unlabeled) and amount of data in the training data set and on the type of problem to be solved.

Common types of machine learning algorithms for use with labeled data include regression and classification algorithms; algorithms for use with unlabeled data include clustering and association algorithms.

Training the algorithm is an iterative process: it involves running variables through the algorithm, comparing the output with the results it should have produced, adjusting weights and biases within the algorithm that might yield a more accurate result, and running the variables again until the algorithm returns the correct result most of the time. The resulting trained, accurate algorithm is the machine learning model, an important distinction to note, because 'algorithm' and 'model' are incorrectly used interchangeably, even by machine learning mavens.
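For readers who want to see that loop in miniature, here is a rough Python sketch of iterative training for a one-variable linear model; the data, learning rate, and step count are invented for illustration.

```python
import numpy as np

# Toy data: the "correct results" the algorithm should reproduce (y = 2x + 1 plus noise).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = 2.0 * x + 1.0 + rng.normal(0, 0.05, 200)

w, b = 0.0, 0.0          # the weight and bias that training will adjust
lr = 0.1                 # learning rate: how large each adjustment is

for step in range(500):
    pred = w * x + b                     # run the variables through the model
    error = pred - y                     # compare output with the expected results
    w -= lr * np.mean(error * x)         # adjust the weight toward a more accurate result
    b -= lr * np.mean(error)             # adjust the bias the same way

print(f"learned w={w:.2f}, b={b:.2f}")   # should end up close to 2 and 1
```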

The final step is to use the model with new data and, in the best case, for it to improve in accuracy and effectiveness over time. Where the new data comes from will depend on the problem being solved. For example, a machine learning model designed to identify spam will ingest email messages, whereas a machine learning model that drives a robot vacuum cleaner will ingest data resulting from real-world interaction with moved furniture or new objects in the room.

Machine learning methods (also called machine learning styles) fall into three primary categories.

Supervised machine learning trains itself on a labeled dataset. That is, the data is labeled with the information the machine learning model is being built to determine, and it may even be classified in the ways the model is supposed to classify new data. For example, a computer vision model designed to identify purebred German Shepherd dogs might be trained on a data set of various labeled dog images.

Supervised machine learning requires less training data than other machine learning methods and makes training easier because the results of the model can be compared to actual labeled results. But properly labeled data is expensive to prepare, and there's the danger of overfitting: creating a model so closely tied and biased to the training data that it doesn't handle variations in new data accurately.
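A minimal supervised-learning sketch, assuming scikit-learn and a synthetic dataset (neither is named in the article), shows both sides of that trade-off: the model scores almost perfectly on the data it memorized and noticeably worse on held-out data.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic labeled data standing in for a real labeled dataset.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# An unconstrained tree can memorize the training labels; high training accuracy
# combined with weaker accuracy on held-out data is the overfitting symptom.
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("train accuracy:", model.score(X_train, y_train))
print("test accuracy: ", model.score(X_test, y_test))
```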

Learn more about supervised learning.

Unsupervised machine learning ingests unlabeled data, lots and lots of it, and uses algorithms to extract meaningful features needed to label, sort, and classify the data in real time, without human intervention. Unsupervised learning is less about automating decisions and predictions, and more about identifying patterns and relationships in data that humans would miss. Take spam detection, for example: people generate more email than a team of data scientists could ever hope to label or classify in their lifetimes. An unsupervised learning algorithm can analyze huge volumes of emails and uncover the features and patterns that indicate spam (and keep getting better at flagging spam over time).
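As a rough illustration of that idea, the sketch below clusters invented "email" feature vectors with k-means; the features, the library, and the two-cluster assumption are all illustrative choices, not details from the article.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Two features per message: fraction of ALL-CAPS words, number of links (invented).
normal_mail = rng.normal([0.05, 1], [0.03, 1.0], size=(200, 2))
spammy_mail = rng.normal([0.40, 8], [0.10, 2.0], size=(40, 2))
messages = np.vstack([normal_mail, spammy_mail])

# No labels are provided; k-means finds two groups purely from the data itself.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(messages)
print(np.bincount(clusters))   # sizes of the two discovered groups
```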

Learn more about unsupervised learning.

Semi-supervised learning offers a happy medium between supervised and unsupervised learning. During training, it uses a smaller labeled dataset to guide classification and feature extraction from a larger, unlabeled data set. Semi-supervised learning can solve the problem of having not enough labeled data (or not being able to afford to label enough data) to train a supervised learning algorithm.

Reinforcement machine learning is a behavioral machine learning model that is similar to supervised learning, but the algorithm isn't trained using sample data. This model learns as it goes by using trial and error. A sequence of successful outcomes will be reinforced to develop the best recommendation or policy for a given problem.
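The trial-and-error idea can be shown with a toy example; the sketch below is a simple multi-armed bandit agent, not the method used by any product named here, and the payout probabilities are invented.

```python
import random

true_payout = [0.2, 0.5, 0.8]        # hidden reward probabilities (assumed for the demo)
estimates = [0.0, 0.0, 0.0]          # the agent's learned value for each action
counts = [0, 0, 0]

for step in range(5000):
    # Mostly exploit the best-looking action, occasionally explore a random one.
    if random.random() < 0.1:
        action = random.randrange(3)
    else:
        action = max(range(3), key=lambda a: estimates[a])
    reward = 1.0 if random.random() < true_payout[action] else 0.0
    counts[action] += 1
    # Reinforce: nudge the estimate toward the observed outcome.
    estimates[action] += (reward - estimates[action]) / counts[action]

print([round(e, 2) for e in estimates])   # should approach 0.2, 0.5, 0.8
```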

The IBM Watson system that won the Jeopardy! challenge in 2011 makes a good example. The system used reinforcement learning to decide whether to attempt an answer (or question, as it were), which square to select on the board, and how much to wager, especially on daily doubles.

Learn more about reinforcement learning.

Deep learning is a subset of machine learning (all deep learning is machine learning, but not all machine learning is deep learning). Deep learning algorithms define an artificial neural network that is designed to learn the way the human brain learns. Deep learning models require large amounts of data that pass through multiple layers of calculations, applying weights and biases in each successive layer to continually adjust and improve the outcomes.

Deep learning models are typically unsupervised or semi-supervised. Reinforcement learning models can also be deep learning models. Certain types of deep learning models, including convolutional neural networks (CNNs) and recurrent neural networks (RNNs), are driving progress in areas such as computer vision, natural language processing (including speech recognition), and self-driving cars.
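To picture those successive layers, here is a bare-bones forward pass through two layers in NumPy; the shapes, activation choice, and random weights are illustrative assumptions, not a real trained network.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                        # a batch of 4 inputs with 8 features

W1, b1 = rng.normal(size=(8, 16)), np.zeros(16)    # first layer's weights and biases
W2, b2 = rng.normal(size=(16, 3)), np.zeros(3)     # second layer's weights and biases

hidden = np.maximum(0, x @ W1 + b1)                # layer 1: linear step plus ReLU nonlinearity
logits = hidden @ W2 + b2                          # layer 2: produces 3 output scores per input
print(logits.shape)                                # (4, 3)
```

Training such a network adjusts every W and b across all layers, which is why deep learning needs so much data and compute.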

See the blog post AI vs. Machine Learning vs. Deep Learning vs. Neural Networks: What's the Difference? for a closer look at how the different concepts relate.

Learn more about deep learning.

As noted at the outset, machine learning is everywhere, from digital assistants and recommendation engines to spam filters, medical image analysis, and self-driving cars.

IBM Watson Machine Learning supports the machine learning lifecycle end to end. It is available in a range of offerings that let you build machine learning models wherever your data lives and deploy them anywhere in your hybrid multicloud environment.

IBM Watson Machine Learning on IBM Cloud Pak for Data helps enterprise data science and AI teams speed AI development and deployment anywhere, on a cloud native data and AI platform. IBM Watson Machine Learning Cloud, a managed service in the IBM Cloud environment, is the fastest way to move models from experimentation on the desktop to deployment for production workloads. For smaller teams looking to scale machine learning deployments, IBM Watson Machine Learning Server offers simple installation on any private or public cloud.

To get started, sign up for an IBMid and create your IBM Cloud account.

Read this article:
What is Machine Learning? | IBM

Read More..

Next Raspberry Pi CPU Will Have Machine Learning Built In – Tom’s Hardware

At the recent tinyML Summit 2021, Raspberry Pi co-founder Eben Upton teased the future of 'Pi Silicon', and it looks like machine learning could see a massive improvement thanks to Raspberry Pi's new in-house chip development team.

It is safe to say that the Raspberry Pi Pico and its RP2040 SoC have been popular. The Pico has only been on the market for a few weeks, but it has already sold 250,000 units with 750,000 on back order. There is demand for more boards powered by the RP2040, and partners such as Adafruit, Pimoroni, and SparkFun are releasing their own hardware, many with features not found on the Pico.

Raspberry Pi's in-house application-specific integrated circuit (ASIC) team is working on the next iteration, and it seems to be focused on lightweight accelerators for ultra-low-power machine learning applications.

About 40 minutes into Upton's talk, the slide changes to "Future Directions," a slide that shows three current-generation 'Pi Silicon' boards, two of which are from board partners: SparkFun's MicroMod RP2040 and Arduino's Nano RP2040 Connect. The third is from ArduCam, which is working on the ArduCam Pico4ML, a board that incorporates machine learning, a camera, a microphone and a screen into the Pico package.

The last bullet point hints at what the future silicon could be. It may come in the form of lightweight accelerators, possibly providing 4-8 multiply-accumulate operations (MACs) per clock cycle. In his talk, Upton says it is "overwhelmingly likely that there will be some other piece of silicon from Raspberry Pi".

Want to learn more about the Raspberry Pi Pico? We have just the page for you.

More here:
Next Raspberry Pi CPU Will Have Machine Learning Built In - Tom's Hardware

Read More..

Introducing Helios, The New Corrugated AI/Machine Learning Platform – WhatTheyThink

The OEM-Agnostic Industrial Internet of Things (IIoT) solution from SUN Automation Group maximizes profitability for corrugated manufacturers through actionable data and insight

Glen Arm, Md. Helios, the new AI and machine learning platform tailored specifically to the corrugated converting industry, launched today. The platform is OEM-agnostic and engineered to provide corrugated manufacturers access to robust, actionable insights into the performance of their machines, enabling minimized downtime, optimized maintenance schedules, and maximized profit. Helios is a product of SUN Automation Group.

"IIoT makes every bit of data actionable," says Helios Director of Technology Matthew C. Miller. "So many corrugated plants rely on human intuition and experience to drive their decisions. With Helios, anomalies that are imperceptible to even the most well-trained operators can be detected in real time and acted upon. And the machine learning capabilities will mean that the platform only gets smarter the more data and user reactions that it is able to process."

The new platform is designed to minimize downtime, maximize profitability, and decrease the opportunity cost associated with only taking machines offline for preventative maintenance (as opposed to for major malfunctions). These high-level benefits manifest themselves in specific cost and resource-optimizing operational benefits and actionable insights.

Some of the most beneficial insights are preventative/proactive parts ordering, knowledge about the exact time and cost of parts replacements, the ability for operators to pinpoint the source of slowdowns and other issues, and operator-efficiency training to help machine operators learn and adapt to best practices.

"We understand that data is only as powerful as the actionable insights it can provide, says Chris Kyger, President, SUN Automation Group. That's why we are so excited to bring Helios to the corrugated industry. This incredible technology will help box plants increase productivity and efficiency while reducing costs and downtime.

Helios provides core insights from an accessible, user-friendly dashboard enabling three key benefits: remote monitoring, predictive maintenance, and anomaly detection.

Remote monitoring provides deep insights into current and historical machine operation and performance that can be seen and accessed in real time from any device. Predictive maintenance optimizes machine maintenance intervals using artificial intelligence that adapts based on the machine operation and usage. Anomaly detection notifies users about abnormal machine states that allow operators to react to a potential issue before the failure occurs. More robust predictive analytics will be phased into the platform over time.
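The article does not describe how Helios detects anomalies internally; as a generic illustration of the concept, the sketch below flags sensor readings that drift far from their recent rolling average.

```python
import numpy as np

def flag_anomalies(readings, window=50, threshold=4.0):
    readings = np.asarray(readings, dtype=float)
    flags = np.zeros(len(readings), dtype=bool)
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mean, std = recent.mean(), recent.std() + 1e-9
        # A reading many standard deviations away from recent behavior is "abnormal".
        flags[i] = abs(readings[i] - mean) / std > threshold
    return flags

rng = np.random.default_rng(0)
vibration = rng.normal(1.0, 0.05, 500)           # invented vibration sensor data
vibration[400] = 2.5                             # a sudden spike, e.g. a failing bearing
print(np.where(flag_anomalies(vibration))[0])    # flags index 400 for this seed
```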

Corrugated manufacturers will have access to a free Helios demo, allowing them to experience the platform. The site also provides a Return On Investment calculator that can showcase the benefits that Helios can offer to operations of all sizes and scales.

SUN Automation Group is a corrugated converting industry leader in innovative solutions and aftermarket support and services. Since 1986, SUN has led the way for automated production solutions in the corrugated manufacturing space.

Read the original post:
Introducing Helios, The New Corrugated AI/Machine Learning Platform - WhatTheyThink

Read More..

Ann Coulter: Attack of the woke teen career killers – Today’s News-Herald

I was a mere 70 pages into Donald McNeil's brief about his firing from The New York Times when I emailed a dozen of my friends to demand they read it immediately. But they don't have my perseverance, so here are the highlights.

Two years after McNeil chaperoned a group of high schoolers on a trip to Peru to learn about rural health care, The Daily Beast published an article detailing the students' list of denunciations against him, including the career-ending claim that he'd used the N-word.

Days later, it came out that he had used the word in response to a student's question about a high school girl who'd been suspended from school for using the infamous word. He repeated it in order to ask how she'd said it.

This paragraph, particularly the parenthetical, is all you need to know about McNeil's misadventure in Peru:

At some point, a student took issue with my having said the U.S. wasn't a colonial power, saying something like: "Don't you realize what the CIA has done? Don't you realize that the United Fruit Company interfered in Central America to protect its banana monopoly?" ... (This student herself was white, from Greenwich, CT, and went to Andover but mentioned multiple times over the week that she had a Latino boyfriend and he had opened her eyes to a different view of the world ...)

None of the students on this resume-padding trip were black. There was one Asian, and the rest were white, dripping with white privilege. (Who else goes on a Princeton-bait trip to Peru in high school to learn about rural health care?) Twenty of the 22 students were girls. All appear to be complete idiots.

McNeil went on the exact same trip and gave the same lectures to a different group of high school students the summer before and got rave reviews. But the 2019 batch were in the advanced Spotting Racism class.

During McNeil's struggle sessions with his interrogators at the Times, he was accused of an array of crimes against political correctness. Here's a sampling:

Charlotte (Behrendt, associate managing editor for employee relations): Did you say the word n****r on this trip?

McNeil: Yes, I did. [Explains context.]

Charlotte: Did you say there's no such thing as white privilege?

McNeil: No. That's ridiculous ...

Charlotte: So you didn't say there was no such thing?

McNeil: No. Absolutely not. That doesn't even make any sense.

Charlotte: Did you say there is no such thing as institutional racism?

McNeil: No, I didn't ...

Charlotte: Did you say it was OK to wear blackface?

McNeil: No, I didn't.

Charlotte: Did you say climate change didn't matter because it only killed poor people?

McNeil: What? No, of course not.

Charlotte: Did you make fun of a student's hometown?

McNeil: I don't think so. What hometown?

McNeil's unprovoked attack on someone's hometown consisted of his hearing that one student was from Boston, and saying, "Nice town ... except for that baseball team." [Yankees-Red Sox rivalry ensues.]

Charlotte: Did you tell a joke about a doctor and a Jewish mother?

McNeil: A doctor and a Jewish mother ...? I don't think so ... Do you know the joke?

McNeil later remembered that he'd used a stock joke from his usual speech to doctors:

"I was pre-med for a year, but when I told my mother what I was thinking, she laughed and said: 'Donald, you're never going to be a doctor. You don't have the patience to get through medical school.'

"So, if any of you are wondering what it's like to NOT be raised by a Jewish mother, that's pretty much it: You say you want to be a doctor, she laughs at you and says, 'It'll never happen.'"

The endless questioning of McNeil's jokes and comments feels like a weird, stressful dream. But the little Nazi block watchers held a trump card: They'd asked him about the N-word and ... HE RESPONDED!

Fired.

McNeil's story goes far beyond him, a crotchety leftist, angry about people walking in parks during the COVID shutdowns. Way too much of his response consists of his submission to the woke overlords, admitting that maybe he IS a racist and denouncing his grandfather as an anti-Semite. So forget McNeil. It's Iran-Iraq.

Nonetheless, his story gives readers a terrifying glimpse of the next generation of grim conformists being pumped out by the nation's education establishment.

These holy terrors are tormenting newsrooms across New York City at New York magazine, The New Yorker and The New York Times. They are true believers, not original thinkers: race-obsessed, gender-obsessed, anti-white, anti-American, and much, much stupider than reporters used to be. Just tell me what I'm supposed to think and I'll think it. These are the sort of people who ought to be office managers ordering staples and mousepads, not people who report news.

These sourpuss zealots are in such a mad race to show their wokeness, they are useless as conduits for the news. What they do isn't reporting. It's terrorism.

Excerpt from:
Ann Coulter: Attack of the woke teen career killers - Today's News-Herald

Read More..

What is Quantum Computing | Microsoft Azure

It's the use of quantum mechanics to run calculations on specialized hardware.

To fully define quantum computing, we need to define some key terms first.

The quantum in "quantum computing" refers to the quantum mechanics that the system uses to calculate outputs. In physics, a quantum is the smallest possible discrete unit of any physical property. It usually refers to properties of atomic or subatomic particles, such as electrons, neutrinos, and photons.

A qubit is the basic unit of information in quantum computing. Qubits play a similar role in quantum computing as bits play in classical computing, but they behave very differently. Classical bits are binary and can hold only a position of 0 or 1, but qubits can hold a superposition of all possible states.

Quantum computers harness the unique behavior of quantum physics, such as superposition, entanglement, and quantum interference, and apply it to computing. This introduces new concepts to traditional programming methods.

In superposition, quantum particles are a combination of all possible states. They fluctuate until they're observed and measured. One way to picture the difference between binary position and superposition is to imagine a coin. Classical bits are measured by "flipping the coin" and getting heads or tails. However, if you were able to look at a coin and see both heads and tails at the same time, as well as every state in between, the coin would be in superposition.

Entanglement is the ability of quantum particles to correlate their measurement results with each other. When qubits are entangled, they form a single system and influence each other. We can use the measurements from one qubit to draw conclusions about the others. By adding and entangling more qubits in a system, quantum computers can calculate exponentially more information and solve more complicated problems.

Quantum interference is the intrinsic behavior of a qubit, due to superposition, to influence the probability of it collapsing one way or another. Quantum computers are designed and built to reduce interference as much as possible and ensure the most accurate results. To this end, Microsoft uses topological qubits, which are stabilized by manipulating their structure and surrounding them with chemical compounds that protect them from outside interference.
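Those behaviors can be illustrated with a tiny state-vector simulation in plain NumPy (an illustrative sketch, not Azure Quantum code): a Hadamard gate puts one qubit into superposition, and a controlled-NOT then entangles it with a second qubit.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate: creates superposition
CNOT = np.array([[1, 0, 0, 0],                    # controlled-NOT: entangles two qubits
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
I = np.eye(2)

state = np.array([1, 0, 0, 0], dtype=float)       # both qubits start in |00>
state = np.kron(H, I) @ state                     # put the first qubit in superposition
state = CNOT @ state                              # entangle it with the second qubit

# The amplitudes are 1/sqrt(2) on |00> and |11>: measuring either qubit as 0 or 1
# immediately tells you the other qubit's result.
print(np.round(state, 3))                         # [0.707 0.    0.    0.707]
```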

A quantum computer has three primary parts: hardware that houses the qubits, a method for transferring signals to the qubits, and a classical computer that runs a program and sends instructions.

For some methods of qubit storage, the unit that houses the qubits is kept at a temperature just above absolute zero to maximize their coherence and reduce interference. Other types of qubit housing use a vacuum chamber to help minimize vibrations and stabilize the qubits.

Signals can be sent to the qubits using a variety of methods, including microwaves, lasers, and voltage.

Quantum computer uses and application areas

A quantum computer can't do everything faster than a classical computer, but there are a few areas where quantum computers have the potential to make a big impact.

Quantum computers work exceptionally well for modeling other quantum systems because they use quantum phenomena in their computation. This means that they can handle the complexity and ambiguity of systems that would overload classical computers. Examples of quantum systems that we can model include photosynthesis, superconductivity, and complex molecular formations.

Classical cryptography, such as the Rivest-Shamir-Adleman (RSA) algorithm that's widely used to secure data transmission, relies on the intractability of problems such as integer factorization or discrete logarithms. Many of these problems can be solved more efficiently using quantum computers.

Optimization is the process of finding the best solution to a problem given its desired outcome and constraints. In science and industry, critical decisions are made based on factors such as cost, quality, and production time, all of which can be optimized. By running quantum-inspired optimization algorithms on classical computers, we can find solutions that were previously impossible. This helps us find better ways to manage complex systems such as traffic flows, airplane gate assignments, package deliveries, and energy storage.
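As a loose illustration of heuristic optimization of this kind, the sketch below balances a set of package weights across two delivery loads using plain simulated annealing; this is a classical stand-in chosen for illustration, not Microsoft's quantum-inspired solvers, and the weights are invented.

```python
import math
import random

weights = [7, 3, 9, 4, 6, 8, 2, 5]

def imbalance(assign):
    # Cost: absolute difference between the two loads' total weight.
    a = sum(w for w, s in zip(weights, assign) if s)
    return abs(a - (sum(weights) - a))

random.seed(0)
assign = [random.random() < 0.5 for _ in weights]
temp = 5.0
while temp > 0.01:
    i = random.randrange(len(weights))
    candidate = assign[:]
    candidate[i] = not candidate[i]                      # move one item to the other load
    delta = imbalance(candidate) - imbalance(assign)
    if delta <= 0 or random.random() < math.exp(-delta / temp):
        assign = candidate                               # accept improvements, plus some worse moves
    temp *= 0.99                                         # gradually reduce the "temperature"

print(assign, "imbalance:", imbalance(assign))
```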

Machine learning on classical computers is revolutionizing the world of science and business. However, training machine learning models comes with a high computational cost, and that has hindered the scope and development of the field. To speed up progress in this area, we're exploring ways to devise and implement quantum software that enables faster machine learning.

A quantum algorithm developed in 1996 dramatically sped up the solution to unstructured data searches, running the search in fewer steps than any classical algorithm could.
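That 1996 algorithm is Grover's search, which needs on the order of the square root of N steps to find an item among N unsorted entries, versus up to N classical checks; the snippet below is only a back-of-the-envelope comparison of those step counts.

```python
import math

# Rough step-count comparison for unstructured search: a classical scan needs on the
# order of N checks, while Grover's algorithm needs about (pi/4) * sqrt(N) iterations.
for n in (1_000, 1_000_000, 1_000_000_000):
    grover_steps = math.ceil(math.pi / 4 * math.sqrt(n))
    print(f"N = {n:>13,}  classical ~ {n:>13,}  Grover ~ {grover_steps:>7,}")
```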

Azure Quantum resources

Build quantum solutions today as an early adopter of Azure Quantum Preview, a full-stack open cloud ecosystem. Access software, hardware, and pre-built solutions and start developing on a trusted, scalable, and secure platform.

View post:
What is Quantum Computing | Microsoft Azure

Read More..

Explainer: What is a quantum computer? | MIT Technology Review

This is the first in a series of explainers on quantum technology. The other two are on quantum communication and post-quantum cryptography.

A quantum computer harnesses some of the almost-mystical phenomena of quantum mechanics to deliver huge leaps forward in processing power. Quantum machines promise to outstrip even the most capable of today's, and tomorrow's, supercomputers.

They won't wipe out conventional computers, though. Using a classical machine will still be the easiest and most economical solution for tackling most problems. But quantum computers promise to power exciting advances in various fields, from materials science to pharmaceuticals research. Companies are already experimenting with them to develop things like lighter and more powerful batteries for electric cars, and to help create novel drugs.

The secret to a quantum computer's power lies in its ability to generate and manipulate quantum bits, or qubits.

Today's computers use bits: a stream of electrical or optical pulses representing 1s or 0s. Everything from your tweets and e-mails to your iTunes songs and YouTube videos is essentially a long string of these binary digits.

Quantum computers, on the other hand, use qubits, which are typically subatomic particles such as electrons or photons. Generating and managing qubits is a scientific and engineering challenge. Some companies, such as IBM, Google, and Rigetti Computing, use superconducting circuits cooled to temperatures colder than deep space. Others, like IonQ, trap individual atoms in electromagnetic fields on a silicon chip in ultra-high-vacuum chambers. In both cases, the goal is to isolate the qubits in a controlled quantum state.

Qubits have some quirky quantum properties that mean a connected group of them can provide way more processing power than the same number of binary bits. One of those properties is known as superposition and another is called entanglement.

Qubits can represent numerous possible combinations of 1 and 0 at the same time. This ability to simultaneously be in multiple states is called superposition. To put qubits into superposition, researchers manipulate them using precision lasers or microwave beams.

Thanks to this counterintuitive phenomenon, a quantum computer with several qubits in superposition can crunch through a vast number of potential outcomes simultaneously. The final result of a calculation emerges only once the qubits are measured, which immediately causes their quantum state to collapse to either 1 or 0.

Researchers can generate pairs of qubits that are entangled, which means the two members of a pair exist in a single quantum state. Changing the state of one of the qubits will instantaneously change the state of the other one in a predictable way. This happens even if they are separated by very long distances.

Nobody really knows quite how or why entanglement works. It even baffled Einstein, who famously described it as "spooky action at a distance." But it's key to the power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power. But thanks to entanglement, adding extra qubits to a quantum machine produces an exponential increase in its number-crunching ability.
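One way to see why that scaling matters: fully describing the state of n entangled qubits on a classical machine takes 2^n amplitudes, so the bookkeeping doubles with every qubit added. A small illustration:

```python
# The number of complex amplitudes needed to describe n entangled qubits classically.
for n in (1, 2, 10, 30, 50):
    amplitudes = 2 ** n
    print(f"{n:>2} qubits -> {amplitudes:,} amplitudes")
```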

Quantum computers harness entangled qubits in a kind of quantum daisy chain to work their magic. The machines' ability to speed up calculations using specially designed quantum algorithms is why there's so much buzz about their potential.

That's the good news. The bad news is that quantum machines are way more error-prone than classical computers because of decoherence.

The interaction of qubits with their environment in ways that cause their quantum behavior to decay and ultimately disappear is called decoherence. Their quantum state is extremely fragile. The slightest vibration or change in temperature, disturbances known as "noise" in quantum-speak, can cause them to tumble out of superposition before their job has been properly done. That's why researchers do their best to protect qubits from the outside world in those supercooled fridges and vacuum chambers.

But despite their efforts, noise still causes lots of errors to creep into calculations. Smart quantum algorithms can compensate for some of these, and adding more qubits also helps. However, it will likely take thousands of standard qubits to create a single, highly reliable one, known as a logical qubit. This will sap a lot of a quantum computer's computational capacity.

And there's the rub: so far, researchers haven't been able to generate more than 128 standard qubits (see our qubit counter here). So we're still many years away from getting quantum computers that will be broadly useful.

That hasn't dented pioneers' hopes of being the first to demonstrate quantum supremacy.

It's the point at which a quantum computer can complete a mathematical calculation that is demonstrably beyond the reach of even the most powerful supercomputer.

It's still unclear exactly how many qubits will be needed to achieve this because researchers keep finding new algorithms to boost the performance of classical machines, and supercomputing hardware keeps getting better. But researchers and companies are working hard to claim the title, running tests against some of the world's most powerful supercomputers.

There's plenty of debate in the research world about just how significant achieving this milestone will be. Rather than wait for supremacy to be declared, companies are already starting to experiment with quantum computers made by companies like IBM, Rigetti, and D-Wave, a Canadian firm. Chinese firms like Alibaba are also offering access to quantum machines. Some businesses are buying quantum computers, while others are using ones made available through cloud computing services.

One of the most promising applications of quantum computers is for simulating the behavior of matter down to the molecular level. Auto manufacturers like Volkswagen and Daimler are using quantum computers to simulate the chemical composition of electric-vehicle batteries to help find new ways to improve their performance. And pharmaceutical companies are leveraging them to analyze and compare compounds that could lead to the creation of new drugs.

The machines are also great for optimization problems because they can crunch through vast numbers of potential solutions extremely fast. Airbus, for instance, is using them to help calculate the most fuel-efficient ascent and descent paths for aircraft. And Volkswagen has unveiled a service that calculates the optimal routes for buses and taxis in cities in order to minimize congestion. Some researchers also think the machines could be used to accelerate artificial intelligence.

It could take quite a few years for quantum computers to achieve their full potential. Universities and businesses working on them are facing a shortage of skilled researchers in the field and a lack of suppliers of some key components. But if these exotic new computing machines live up to their promise, they could transform entire industries and turbocharge global innovation.

Follow this link:
Explainer: What is a quantum computer? | MIT Technology Review

Read More..

Cambridge Quantum Computing’s entanglements are at the heart of a new technological era – Cambridge Independent

Cambridge Quantum Computing is developing a leadership position in four quantum domains: quantum cybersecurity, quantum chemistry, quantum machine learning and quantum finance.

Founded in 2014, the company was initiated by Ilyas Khan, the founding chairman of The Stephen Hawking Foundation and a fellow at St Edmund's College.

"I was one of three founders and the sole original founding investor of the Accelerate Cambridge programme, which is run from Cambridge Judge Business School," Ilyas says of the genesis of one of the world's key quantum technology companies from its butterfly cocoon. "Cambridge Quantum Computing emerged from the idea that Cambridge could produce a successful deep science company and, when this company was founded in 2014, there were three motivating factors."

"Firstly, the experience at Accelerate Cambridge was very exciting, and secondly, the emergence of quantum computing hardware, which had until then been an aspiration."

"Thirdly, Google and IBM were by then involved, and so it shifted from a subject within academia to business in the private sector."

Indeed, the UK National Quantum Technologies programme had started in 2013, with quantum engineers and technologists meeting the entrepreneurial sector for the first time. The goal, a mere aspiration back then, was to develop products and services which made use of quantum superposition and quantum entanglement. The results are now starting to bear fruit.

"Cambridge Quantum Computing is a result of the success of the National Quantum Technologies programme," Ilyas notes. "An analogy would be to say that it would not be dissimilar to someone setting up a business to focus on the internet in 1996 or '97. Early in 2014 the themes were coming together. At that time I thought the business might be viable by 2024, and obviously since then it's been far faster."

Indeed, just this year Cambridge Quantum Computing (CQC) announced a collaboration with Roche to design and implement noisy intermediate-scale quantum (NISQ) algorithms for early-stage drug discovery and development. The partnership will employ CQC's leading quantum chemistry platform, EUMEN, to augment Roche's Alzheimer's disease research efforts.

And last week Crown Bioscience, JSR Life Sciences and CQC announced a partnership agreement, with the initial approach being to focus on identifying cancer treatment biomarkers and driving the next generation of bioinformatics.

The upsurge coincides with a move from the Cambridge Union Society building on Bridge Street to Station Road, says Ilyas.

"We outgrew the space at the Cambridge Union and decided to look around last summer; we'll have between 50 and 60 people there."

There are other sites, in London, Chessington, San Francisco and Washington DC in the US, and Tokyo.

"The company as a whole has more than 130 people now," Ilyas says. "We're very science-heavy, with more than 100 scientists, more than 60 with PhDs, with a very strong business development team, a very strong legal and finance team."

"The quantum sector divides into three areas: quantum technologies, which is quantum clocks and metrology, and we're not in that. Second is quantum computers, the hardware, and we're not there either. Third is applications, algorithms and software; we're very active in that area."

So what are the possible applications? CQC develops specific products and platforms: EUMEN for quantum chemistry, and t|ket>, an architecture-agnostic quantum software stack and best-in-class compiler that translates machine-independent algorithms into executable circuits, optimising for physical qubit layout.
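As a hedged sketch of what that workflow looks like in practice, the snippet below uses pytket, the publicly available Python interface to the t|ket> stack; the specific calls shown are typical-usage assumptions rather than anything quoted from CQC.

```python
from pytket import Circuit
from pytket.passes import FullPeepholeOptimise

# Build a small, machine-independent two-qubit circuit.
circ = Circuit(2, 2)        # two qubits, two classical bits
circ.H(0)                   # superposition on qubit 0
circ.CX(0, 1)               # entangle qubits 0 and 1

# An optimisation pass rewrites the circuit; backend-specific passes would then
# map it onto a particular device's physical qubit layout and native gate set.
FullPeepholeOptimise().apply(circ)
circ.measure_all()
print(circ.n_gates, "gates after optimisation")
```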

And with its IronBridge quantum encryption technology, CQC has developed methods to provide current and post-quantum cybersecurity by solving the most fundamental vulnerabilities in cryptographic protocols and procedures.

One thing that is rarely mentioned in the same breath as quantum is autonomous driving: why is that?

"There is no informed consensus on whether machine learning will be capable of having a day-to-day impact on autonomous driving any time soon," Ilyas replies.

"My view is that some way in the future, however theoretical, machine learning is a very exciting area for the development of quantum computing."

"Machine learning is here and, at Cambridge Quantum Computing, is an area of AI we've been most interested in, and we are without question a global leader in meaning-aware language processing, meaning the ability of a computer or device to handle not just word or speech recognition, as in Alexa, for example, but full sentences, paragraphs and full conversations."

"There are technical reasons why a quantum computer will ultimately be able to do something a classical computer will not. For example, quantum chemistry is one area where a quantum computer can do something a classical computer will not. The other area is meaning-aware language processing, and I'd say this is an extremely powerful and global area for quantum computing."

"So that's drug discovery, linguistic processing and cybersecurity from a defensive standpoint."

"It's difficult to predict when: it could be one or two years, or seven to 10. In other areas the jury is out."

And any sign of an operating system on the way?

"We're many years away from an operating system for quantum computers," Ilyas answers. "There will be operating systems, but at the moment anybody trying to say they're working on an operating system is like me saying I'm practising living on Mars because one day I want to be there."

All this fits with an overarching goal: the introduction of quantum computing to as many areas of business and science as possible.

"As we've entered 2021," continues Ilyas, "an increasing number of large global corporations, from pharma to banks to logistics to petrochemicals, are already users of high-performance computers, and in 2021 a larger number of corporations are starting to budget for quantum computing for one of two different reasons."

"Either they believe a quantum computer has a credible chance of delivering a result, or they want to experiment for themselves with what a quantum computer can do."

"People are on a journey, starting to learn, but some organisations are already on that journey, as Microsoft has been for 20 years, IBM has been for decades, and Google has for ten years." CQC is a member of partnership organisations for all three.

It looks like a win-win-win situation for Cambridge Quantum Computing.

Link:
Cambridge Quantum Computing's entanglements are at the heart of a new technological era - Cambridge Independent

Read More..

UK Research and Innovation Initiative to Invest £153M in Quantum Tech which will Significantly Impact Financial Services – Crowdfund Insider

Quantum tech is expected to have a major impact on the financial services sector.

This is notably part of a larger investment in the United Kingdom's National Quantum Technologies Programme, which is set to provide £1 billion worth of investment over a 10-year period.

Large banking institutions, insurance service providers and regulatory agencies are currently assessing the different opportunities and advising their clients on quantum computers for quantitative finance, asset pricing and effective portfolio management.

Precise quantum clocks for accurately timestamping digital transactions, advanced high-frequency trading, and various quantum security solutions to protect financial data are also being developed.

The Commercializing Quantum Technologies Challenge, via UKRI's Industrial Strategy Challenge Fund (ISCF), has awarded £90 million across 42 initiatives in order to realize the potential of the latest quantum technologies.

A project led by Rigetti UK in partnership with Standard Chartered Bank, Oxford Instruments, Phasecraft and the University of Edinburgh has received £6.4 million in funding to support the commercialization of quantum computing in the United Kingdom. The 3-year initiative will focus on creating a sophisticated commercial quantum computer which will be accessible via the cloud and will develop practical applications in machine learning, materials simulation and finance.

Roger McKinlay, Challenge Director, stated:

"Quantum technologies are expected to have a huge impact on the financial services industry. Banks, insurance providers and regulators are already thinking ahead to the implications this technology will have on businesses, the economy and society. We are looking to fund the best teams of UK companies and research organizations to help them develop their ideas for innovation and commercialization."

The challenge will be launched via a 3-phased approach, allocating a share of £7 million in funding for conducting feasibility studies, £1 million for germinator initiatives and £47 million for large projects requiring extensive collaboration.

Follow this link:
UK Research and Innovation Initiative to Invest 153M in Quantum Tech which will Significantly Impact Financial Services - Crowdfund Insider

Read More..