
Tips on where and when to use a quantum computer – TechHQ

Where and when to use a quantum computer? It's one of the most common questions that experts, such as Kirk Bresniker, Chief Architect at Hewlett Packard Labs, get asked by business leaders. Enterprises want to know where in the IT portfolio quantum computers will bring the most significant rewards, and when is the right time for firms to invest in solutions.

For decades, quantum computing developers have been promising big things from quantum computers, which is understandable. Quantum computers are costly to develop, and being modest about the technology isn't going to win over investors. However, it's important to note that quantum computers aren't universal computing devices.

"Quantum computing promises transformational gains for solving some problems, but little or none for others," write MIT Sloan School of Management researchers in a paper dubbed "The Quantum Tortoise and the Classical Hare" submitted to arXiv.

The team, led by Neil Thompson, whose career includes appointments at Lawrence Livermore National Laboratory, Bain and Company, the United Nations, the World Bank, and the Canadian Parliament, has come up with a simple framework for understanding which problems quantum computing will accelerate (and which it will not).

Quantum computers open the door to probabilistic computing, with quantum gates adding a twist to each of the qubits in the calculation. As the system evolves, the qubits interact and point to the most likely solution to the problem that they've been arranged to describe.

Prodding a bit further, if we consider classical machines as mapping business questions onto maths (a perspective shared by Scott Buchholz, Global Quantum Lead at Deloitte Consulting, at this year's D-Wave Qubits conference), then quantum computers give us the chance to use physics instead.

It turns out that some questions are easier to map onto physics than others, and this gets to one of the key considerations in the MIT framework on where and when to use a quantum computer.

Much of the talk on progress in quantum computing surrounds the number of qubits. Systems are notoriously noisy, which adds to the number of physical qubits that are required to facilitate error correction on logical qubits. On top of this, there are multiple ways of engineering the superposition of ones and zeros through the use of superconducting, trapped ion, photonic, or silicon spin qubits.

Each quantum computing developer has its own preferred approach, and as you walk down the path of trying to understand how quantum computing works, the discussion becomes one about the technology. And this is fine. Large companies can engage their R&D teams and have conversations with hardware developers.

However, just as you don't need to understand what's happening inside a CPU to benefit from a laptop, companies can focus their attention on the kinds of problems that quantum computers can help with, rather than getting bogged down with the numbers and types of qubits.

In their decision-making framework, Thompson and his colleagues identify two determinants in understanding when to use a quantum computer: the efficiency of the algorithm and the scale of the problem that needs to be solved.

"The problem size matters because the benefit of an algorithmic advantage is larger for larger problems," explains the team. "This means that if a problem is too small, the classical computer will have already completed the problem by the time the quantum computer's algorithmic benefit kicks in."

Quantum computers are often mentioned in terms of being able to tackle problems that are effectively impossible with classical machines. But the researchers want to guide enterprises on other opportunities too, where a quantum economic advantage exists.

Their analysis also considers technology roadmaps so that companies can assess when the window for using a quantum computer could open up for them.

Problems that become exponentially harder to solve as the size of the problem increases are interesting candidates when thinking about alternatives to using classical computing machines. And Thompson and his co-authors Sukwoong Choi and William Moses provide a useful rule of thumb.

"If a classical algorithm takes exponential time and there exists a polynomial quantum algorithm, you're likely to get a speedup," they comment when discussing their framework on when to use a quantum computer.
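As a back-of-the-envelope illustration of that rule of thumb, the crossover point can be found by comparing the two runtime curves directly. The cost models and constants in this sketch are illustrative assumptions, not figures from the MIT paper:

```python
def classical_time(n, c=1e-9):
    """Hypothetical classical runtime: exponential in problem size n."""
    return c * 2 ** n

def quantum_time(n, c=1e-3):
    """Hypothetical quantum runtime: polynomial (cubic) in n, with a much
    larger per-step constant to reflect slower quantum hardware."""
    return c * n ** 3

def crossover_size(max_n=200):
    """Smallest problem size at which the quantum algorithm wins, if any."""
    for n in range(1, max_n + 1):
        if quantum_time(n) < classical_time(n):
            return n
    return None

# With these toy constants the crossover lands at n = 36: below that,
# the classical machine simply finishes first.
print(crossover_size())
```

The shape of the result is the point: even with a million-fold speed handicap per step, the polynomial quantum algorithm overtakes the exponential classical one once the problem is large enough, exactly the "tortoise and hare" dynamic the researchers describe.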

It's worth pointing out that companies don't have to invest in bare metal hardware. For most customers, their first experience of what qubits are capable of will be via the cloud using one of a number of QCaaS providers.

Amazon Braket makes it straightforward for firms to work with different types of quantum computers and circuit simulators. Amazon advertises that Braket comes with one free hour of simulation time per month, lowering the cost barrier to getting started.

QCaaS hardware associated with Braket includes gate-based superconducting processors from Rigetti and OQC, neutral atom-based quantum processors from QuEra, and IonQ's gate-based ion-trap processors.

Microsoft's Azure Quantum cloud service is another option for firms. Here, users get access to systems from Quantinuum, QCI, and PASQAL, as well as the quantum computing hardware mentioned above.

And companies can also access quantum computing solutions in the cloud using QCaaS platforms operated by developers such as IBM, Google, and D-Wave.

There's no shortage of options, and with frameworks to guide enterprises on where and when to use a quantum computer, now is a good time to think about the types of algorithms supporting your operations and whether qubits can provide an economic advantage to the bottom line.


The Psychology of Success in Data Science Contest Design … – University of Waterloo

In today's data-driven world, holding data science competitions is a popular way to address real-world problems. Companies leverage these competitions to crowdsource solutions and strategically attract potential employees. Recent research from the University of Waterloo highlights the importance of motivating participants in these competitions through the appropriate contest structure and incentives to achieve success.

Dr. Keehyung Kim, a professor of Emerging Technologies from the School of Accounting and Finance, has endeavoured to understand what makes a data science competition truly motivating. "In our study, we investigate the common design structures used in data science competitions and examine how a contest organizer can maximize the effort level exerted by contestants," says Kim. "We want to know if the contest structure matters, and more specifically, if the contest should include one or two stages." His research stands out as one of the few studies examining a crucial aspect frequently ignored in the design of data science contests: the psychological and behavioral dynamics of participants.

In his study, Kim uses principles rooted in behavioral economics to investigate one- and two-stage contest design scenarios. Behavioral economics explores how psychological factors impact decision-making. Surprisingly, Kim's findings reveal that contestants exert significantly more effort in both stages of a two-stage contest compared to a one-stage contest.

"Our study uses a behavioural model that provides the psychological explanation behind these new findings," explains Kim. "Contestants exhibit a psychological aversion to being eliminated early. Having a second stage makes the separation of winning and losing more salient compared to the one-stage contest. Thus, to avoid falling behind, contestants in a two-stage contest are more likely to exert a high level of effort in the first stage."

Kim also determined that allocating most of the prize money to the winner of a two-stage contest is more effective in motivating contestants. In contrast, the prize allocation in a one-stage contest does not significantly affect the level of a contestant's effort. These findings indicate that financial incentives alone may not be sufficient to motivate contestants and that psychological factors must be considered in contest design.

This study offers immediate implications that contest organizers can implement in data science contest design. "Organizers should adopt a multi-stage contest whenever possible to encourage maximum effort," says Kim. "Next, contest organizers can underscore winning and losing more prominently, such as announcing the contest results publicly. All in all, it is crucial that organizers factor in nonmonetary factors like psychological motivations to maximize contestant behaviour, decision-making, and results."

This research examines an important aspect that every contest organizer faces when they design an open competition and offers a fresh perspective on how the design of contests can inspire excellence and innovation.

The paper, "Designing contests for data science competitions: Number of stages and the prize structures," was published on September 8th, 2023, in the premier journal Production and Operations Management.


New Collaborations Designed to Increase Access to Data Science … – University of California, Merced

UC Merced is part of several new initiatives aimed at increasing the accessibility and inclusivity of data science studies and opening new opportunities for historically underserved students after graduation.

New grants from the National Science Foundation (NSF), the Department of Energy (DOE) and the California Learning Lab are funding collaborations with a sister campus and several community colleges as well as the Joint Genome Institute (JGI) to accomplish these goals.

Department of Applied Mathematics Chair Professor Suzanne Sindi, the principal investigator for UC Merced on these projects, said the emerging field of data science offers a wide variety of options for students to learn new skills that will benefit them in almost any field they choose to follow.

The NSF grant funds a partnership between UC Merced, UC Berkeley and Tuskegee University, bringing together a Historically Black University, a Hispanic-Serving Institution and the combined strength of three important institutions. They have received a $1.8 million, three-year grant to create an introductory interdisciplinary computing and social science course.

Students won't need coding experience before they take the class (a common barrier to entry for students at the college level, because not all California high schools offer coding training). The class will also teach computing through a data lens, illustrating how it can help address societal issues.

"Diversifying data scientists could expand the views represented in the development of, and problems solved by, technology that has the power to shape and change the world," UC Berkeley said.

Sindi and fellow UC Merced professors Heather Bortfeld, Roummel Marcia, Juan Meza, Erica Rutter and Teaching Professor Rosemarie Bongers are part of the collaborative that will develop and pilot-teach individual modules or parts of the course. In the second year, the team will assess what was effective, put together the full class and develop related materials such as a free online textbook.

In the third year, the three institutions will jointly teach the hybrid course. Through virtual sessions, students from all three schools will learn together, but the in-person seminars will be shaped for students from each university.

The team is also working on parts of the course with stakeholders from community colleges and plans to work with high school teachers in the future. The class will also be geared toward those audiences and could be used to further increase access to data science after the grant ends.

Another way to increase representation in the data science fields is by giving students a pathway through community college to complete the coursework they will need to prepare them to finish their undergraduate degrees at a university.

Through the California Learning Lab's support, UC Merced, UC Berkeley, Berkeley City College, the City College of San Francisco and Laney City College will collaborate to increase pathways for students from two- to four-year colleges by creating an introductory, interdisciplinary curriculum that is scalable across the state.

"Berkeley has really led the effort throughout the state to standardize education at the community college level, and we are really glad to be part of this," Sindi said. "It's not easy to build a curriculum, especially one that is scalable and will benefit all of the state's community colleges, the CSUs and the UCs."

The collaborators intend to provide baseline training in core computing, statistics and quantitative social science concepts. They will work in teams to develop course modules, including a coordinated collection of materials: coding notebooks, discussion guides, assessments and out-of-class assignments. They said foundations in data science and introductory computing courses are critical for student success, and they intend to create curriculum that is relevant to students from all backgrounds and helps students integrate data science into both their academic and social identities.

Through the DOE grant, Sindi, fellow UC Merced professors Carolin Frank, Tomas Rube and Fred Wolf, and Zhong Wang from Lawrence Berkeley National Laboratory (LBNL) and the JGI, will build on an established internship program for graduate students.

The summer program matches students with projects and mentor scientists at JGI, where they get hands-on experience in genome research and computational tools to solve biology and genomics challenges.

The DOE aims to support research by historically underrepresented groups in science, technology, engineering and mathematics (STEM) and diversify leadership in the physical sciences.

Yumary Vasquez said her experience in the summer internship helped her immensely in her research to understand the diversity and functional capabilities of new symbiotic lineages.

"My experience in data gathering was minimal before JGI, although I did have some background in analyses. Before JGI I worked on small datasets (fewer than 20 genomes were sequenced by the lab I was in). But at JGI, I was working with more than 1,000 genomes," she said. "I used some skills I already had from UC Merced and applied them at a larger scale and with much more complexity."

When she was a graduate student at UC Merced, Vasquez knew she didn't want a career in academia but wasn't sure what other opportunities there would be for her.

Since the internship last year, Vasquez graduated from UC Merced with her Ph.D. and is now a postdoctoral researcher at JGI.

"The skills I learned in my internship are vital for my current position. Much of the work I am doing in my position builds on the work I started last year," she said. "My experience in the program was amazing. I had a great mentor, Juan Villada, who taught me a lot about working with large amounts of data."

The internship afforded her the opportunity to meet a lot of other mentors from JGI, network with them and talk to them about their experience working in a government laboratory. Now she hopes to get a permanent job in a national lab. Her husband, Oscar Davalos, also did his graduate studies at UC Merced, went through the JGI internship and is a postdoc at Lawrence Berkeley National Lab.

Since 2014, the program has supported 60 students who have contributed to approximately 40 JGI projects.

This UC Merced-JGI training program will be expanded through the DOE's Reaching a New Energy Sciences Workforce (RENEW) program in four directions:

The RENEW program gave out more than $70 million in grants to support such programs at 65 institutions, including 40 higher-learning institutions that serve minority populations.

"Ensuring America's best and brightest students have pathways to STEM fields will be key to leading the world's energy transition and achieving President Biden's ambitious energy and climate goals," the DOE said.


Is Quantum Computing Stock IonQ a Buy? – The Motley Fool

In the rapidly evolving world of technology, artificial intelligence (AI) burst onto the scene as a hot topic in late 2022, and quantum computing may be next. Quantum computers could become a key component of AI's evolution since they have the potential to be far more powerful than today's biggest supercomputers.

Quantum computers will be able to process complex computations so quickly, using techniques unusable by standard computers, that they'd be able to easily crack most cybersecurity protections in use today. This kind of power could make a substantial impact on existing AI technology since AI models must process enormous amounts of data to correctly execute tasks.

One company focused solely on the quantum computing field is IonQ (IONQ -0.48%). It has created a unique technology that uses trapped ions to form its qubits (quantum bits). The company believes its solution will enable it to build practical quantum computers for use in industries such as finance and medicine.

The question for investors is whether IonQ's technology has found success in the market, enabling the company's growth.

IonQ was founded in 2015 and went public in 2021. The young organization's focus on quantum technology enabled rapid revenue growth thanks to the potency of quantum computers. Traditional computers store and process data in binary form -- each bit of information is represented by either a zero or a one.

Quantum computers are in a sense binary as well, but they process data very differently, with qubits. These are made by isolating charged particles or ions that can occupy a quantum superposition of states -- both one and zero and a cloud of values in between -- until they are measured and the solution to the calculation the computer was tasked with is determined.

Beyond that, qubits can be entangled with each other -- their states quantum mechanically connected. These (admittedly hard-for-the-layperson-to-grasp) qualities are expected to allow quantum computers to rapidly handle complex tasks that classical computers cannot.
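For a couple of qubits, the superposition and entanglement described above can be mimicked on a classical machine with nothing more than linear algebra. This sketch uses plain NumPy (not IonQ's hardware or tooling) to build the textbook two-qubit entangled state:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate: creates superposition
I2 = np.eye(2)                                 # identity (leave a qubit untouched)
CNOT = np.array([[1, 0, 0, 0],                 # controlled-NOT gate: entangles qubits
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1.0, 0, 0, 0])               # two qubits, both starting in |0>
state = np.kron(H, I2) @ state                 # superpose the first qubit
state = CNOT @ state                           # entangle the second with the first

probs = state ** 2                             # measurement probabilities
print(probs)                                   # only |00> and |11> remain: [0.5 0. 0. 0.5]
```

The final probabilities show the entanglement: the two qubits are always measured in the same state, and the outcome is only determined when the measurement happens. Real quantum hardware matters because this simulation doubles in size with every qubit added.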

IonQ generates revenue by selling access to its quantum computers to various organizations, such as research institutes and government agencies. So far, the company's technology is winning customers. In the third quarter, IonQ generated revenue of $6.1 million, a whopping 122% increase from the prior-year period's $2.8 million. That also exceeded its forecast range for revenue of $4.8 million to $5.2 million.

Its success was no fluke. IonQ's revenue over the first three quarters of 2023 was $15.9 million, more than double the $7.3 million it made in the same period of 2022.

[Table omitted in this version. Data source: IonQ. YOY = year over year.]

Those sales were strong enough for management to raise its 2023 revenue forecast to a minimum of $21.2 million, up from its original forecast of at least $18.4 million.

While its revenue growth was spectacular, its losses have exploded as well. The company booked a net loss of $44.8 million in Q3, compared to the prior-year period's net loss of $24 million. Through the first three quarters of 2023, IonQ's net loss reached $115.9 million, nearly 4 times its net loss of $29.9 million over the same period of 2022.

These rising net losses are understandable. IonQ's biggest expense through those three quarters was the $60.7 million it spent on research and development, a necessary and important cost given that the company is attempting to develop potentially revolutionary new technology. It's also common for young tech companies to run at a loss for years as they build the foundations of their businesses.

And IonQ's efforts have netted some impressive clients, including Fidelity Investments, the U.S. Defense Advanced Research Projects Agency (DARPA), and a $25.5 million deal with the U.S. Air Force Research Lab.

The company is battling against more established names in the tech sector, such as IBM, which has built its own quantum computing solutions. IonQ claims it possesses a superior technology because it's been able to scale up the capacity of its quantum computers quickly while maintaining a high level of accuracy.

This isn't an easy feat because qubits must be kept extremely isolated from any external disturbances or they'll break down. Usually, this means keeping qubits inside nested chambers kept at temperatures near absolute zero -- something that IonQ claims it has mastered.

Certainly, its revenue growth to date has been impressive, and that suggests that IonQ's technology is as good as it claims. But because IonQ is such a young company, it could be years before it turns a profit, and the race for leadership in the quantum computing space has just begun.

Consequently, this stock is likely only appropriate for investors with high risk tolerances -- and a lot of patience -- since quantum computing industry experts estimate it could take until 2040 before quantum computers are scalable enough to tackle growing demand. Buying shares of IonQ at this stage is more of a speculative investment.


Communication’s Ai4Ai Lounge: The future of communication is … – Marquette Today

There's something unusual happening in the basement of Johnston Hall. Behind most doors of the historic building are typical classrooms. But in the quiet lower hallway, there's a learning space that screams unconventional.

Warm oranges and reds are splashed on plush mid-century modern chairs scattered throughout the 1,100-sq.-ft. collaborative workspace. Corner lamps with bright red shades illuminate the white walls, and six high-performance workstations are topped with computer monitors. There are three 86-inch 4K display screens that can be wheeled around the room, on which students are able to project content from their personal devices (be it phones or laptops) for shared endeavors.

The Diederich College of Communication's Artificial Intelligence for Analytics and Insights Lounge, or Ai4Ai Lounge, was designed by Dr. Larry Zhiming Xu, assistant professor of strategic communication.

"Inaugurated in 2022, Ai4Ai operates under the joint sponsorship of the Diederich College of Communication and Northwestern Mutual Data Science Institute," Xu says. "The lounge is designed to foster a synergistic environment where students can learn how to effectively transform data analytics into actionable insights."

Data science is a new force in the communication field, and it shows no signs of stopping. Xu teaches his students about the applications of data analytics in various contexts, including sentiment analysis for gauging public opinion on social media, network analysis to map community structures and to pinpoint influencers, and predictive analytics to guide strategic planning. Most recently, he explored the capabilities of large language models for generating and curating content.
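As a concrete taste of the network-analysis use case Xu describes, likely influencers in a small social graph can be surfaced by simply counting connections (degree centrality). The accounts and edges below are made up for illustration:

```python
from collections import Counter

# A toy follower graph: each pair is a connection between two accounts.
edges = [
    ("ana", "ben"), ("ana", "cai"), ("ana", "dia"),
    ("ben", "cai"), ("dia", "eli"),
]

# Degree centrality in its simplest form: count each account's connections.
degree = Counter()
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

# Rank accounts by connectedness; the top entries are influencer candidates.
influencers = [name for name, _ in degree.most_common()]
print(influencers[0])  # "ana" touches the most accounts in this toy graph
```

Real campaigns would use richer measures (betweenness, PageRank) over much larger graphs, but the idea of pinpointing influencers by structural position is the same.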

"A cornerstone of my teaching approach is equipping students with the skills to translate data analytics into visually engaging dashboards and logically coherent narratives. In alignment with this focus, the Ai4Ai has been specially designed to offer advanced visual capabilities," Xu says.

Gwen Viegut is in her last semester of graduate school and is studying digital communication strategy. One of Xu's former students, Viegut points out that data science and communication work together seamlessly, especially through AI.

"It's like trying to teach an alien how to be human. You have to be able to understand communication and break it down to its most elementary form to be able to input and teach an AI model," Viegut says. "Machine learning is trying to imitate the process of the human brain. AI is often trying to find the most efficient ways of processing and communicating data, but it can't do that unless it learns how to communicate properly."

Abdallah Alqaoud, Grad '23, earned his master's degree in communication and believes that artificial intelligence is the future, not just for communication but for all industries.

"As we slowly move toward a more virtual world with increased developments in virtual and augmented reality technologies and content streaming, now is a crucial time for students to understand AI and how it can be implemented. This technology serves as a great tool for communication students and professionals to help with data collection and calculations," Alqaoud says.

The Ai4Ai Lounge is helping to dispel the common stereotype that communication students are typically math averse.

"The reality is that data informs communication strategy and decision making. Communication professionals and researchers rely on data to understand the impact of messages and the behaviors of stakeholders," says Dr. Sarah Feldner, dean of the Diederich College of Communication. "At the same time, numbers do not speak for themselves. Communication plays a key role in extracting insights from data and making it actionable."

The constant evolution of communication makes the college's partnership with data science essential; it is the reason the Ai4Ai Lounge was created.

"Because our professions and practitioners are increasingly looking to data for developing strategy, we wanted to make sure our students had access to the tools, perspectives and technology," Feldner adds. "Having a space where students can work collaboratively and work with students from across areas creates opportunities for students."

Peering into the future, Xu says that change in the AI and communication fields will be constant. He says technological development is happening so rapidly that it's hard to predict even what next month looks like, and that science and communication will be forever intertwined.

"Over time, I expect AI to become so ingrained in our daily lives that we may cease to explicitly label it as such, much in the same way that we have stopped labeling every computer-powered technology as computer-based," Xu says. "I think one thing that won't be easily affected (or determined) by AI would be our moral compass that helps us tell what is ethically right or wrong."


Xanadu hardware CTO shares views on why silicon photonics is the … – DIGITIMES

Xanadu's X8 photonic quantum computing chip. Credit: Xanadu

Silicon photonics (SiPh), the manufacturing of integrated photonics on a CMOS platform, has been a buzzword in the last two years, given the technology's promise to deliver a faster, more secure and more efficient solution for data centers increasingly burdened by the ever-growing transmission demands of AI. However, the potential of silicon photonics is not confined to the realm of conventional computing and communication.

Xanadu, a quantum computing company founded in 2016 and headquartered in Toronto, Canada, has been building fault-tolerant computers based on silicon photonics chips. Using photons as qubits, Xanadu believes that silicon photonics will be the quickest path to achieve a fault-tolerant quantum computer able to operate at room temperature. Zachary Vernon, Chief Technology Officer of Xanadu responsible for hardware, talked to DIGITIMES Asia about opportunities brought by photonics to the realm of quantum computing, as he visited Taiwan in November to attend the 2023 Asia Pacific Executive Forum hosted by the Global Semiconductor Alliance.

The race to achieve fault tolerance

Currently, there are several types of quantum computers based on different principles, including superconducting qubits, quantum dots, ion traps and photonics. "At the present moment, there is certainly tough competition between all of them, and you see a lot of approaches that are distinguished by what type of hardware they use," remarked Vernon, noting that as these various approaches are still at prototyping phase, different types of quantum computers are suitable for different near-term problems, and all these near-term problems are a bit short of real business applications. Ideally, if all approaches of building a quantum computer turn out successful, they should all be equivalent and able to address the same problems, according to Vernon.

"In order to get to that point, we need to achieve fault tolerance and error correction, and we think photonics will be the first to get there and the fastest to scale," the CTO explained, emphasizing that scaling and performance are both very important to achieve fault tolerance. "You need lots of qubits to encode error correction, but you also need high performance qubits."

Photonics enables one to network different chips with optical fiber in various patterns, obtaining better connectivity than one would usually be able to access in a superconducting approach. As a result of the better connectivity, Vernon indicated, one can access better codes, especially quantum low-density parity-check (LDPC) codes.

"Photonics is really the only approach that can access it, since the other approaches are very constrained in the connectivity between their qubits, whereas photonics can leverage optical fibers to route qubits wherever you want," said Vernon. As the number of qubits - now usually measured in millions - has come to be synonymous with the global quantum race underway, the Xanadu CTO pointed out that photonics will also need millions of qubits to deliver an advantage. Due to Xanadu's ability to use better LDPC, it can access ten to one hundred times more logical qubits than competing approaches.

"We think it's very important for anyone working in the manufacturing of silicon photonics to pay attention to the quantum computing industry," emphasized Vernon, pointing to the two major advantages offered by silicon photonics: scalability and quantum computing at room temperature. "All of our actual computation happens on room-temperature devices."

Photonics will be the fastest path to scale

According to Xanadu's projection, once it reaches fault tolerance and begins to scale up, adding hundreds of logical qubits per year, Xanadu alone would require hundreds of thousands of 300mm wafers per year, as a quantum computer is like a data center that literally takes thousands or millions of chips to build. "It's a very significant market opportunity that in a few years will essentially directly compare with silicon photonics wafer volume in the present day," observed Vernon. In the coming years, he said, Xanadu hopes to achieve fault tolerance and scale up to 1,000 error-corrected logical qubits. "That would look like a data center with about 10,000 racks," he said, indicating that the company's main priority now is to develop the hardware needed to deliver a cloud-deployed, fault-tolerant computer.

In the long run, the use of silicon photonics has the potential to deploy quantum computers closer to the edge. "In principle, there's no fundamental reason why a quantum computer that uses photonics can't be inside a consumer device," explained Vernon. "There are certain technologies that need to be developed for that, but fundamentally, that capability is there because they can all work at room temperature in principle." Whether such an application will prove useful, however, will take more time to study.

In terms of immediate engagement with customers, Xanadu's software library for programming quantum computers, PennyLane, is the main product offering of the company. Xanadu partnered with Amazon Web Services in its development, in addition to cooperating with Nvidia. "PennyLane is one of the leading software APIs for developing algorithms for quantum computers," remarked Vernon. "It started out specializing in machine learning applications - quantum machine learning - but a community grew around it to take hold of quite a significant portion of the market for algorithm development." The Xanadu CTO also highlighted PennyLane's hardware-agnostic character: it is not limited to photonic quantum computers or Xanadu's own hardware; it can be used on different platforms, and Xanadu has partnered with multiple hardware providers to enable that. In one example, Xanadu cooperated with automakers such as Volkswagen, which leverage PennyLane to develop quantum algorithms for battery simulation.

The risk of missing out in a global competition

On the eve of an AI revolution, the Xanadu hardware CTO pointed out that a great deal of work has been undertaken in quantum machine learning, even though the field is still in its early days. "There's a lot of algorithmic development going on, and it does seem that quantum computers will be able to have the ability to address certain machine learning tasks in a very different way," said Vernon. Nevertheless, large-scale fault-tolerant quantum computers are still needed before the implications can be fully assessed. "Once these things have scaled up to very substantial sizes, then it can address conventional machine learning basic operations - such as matrix operations - more efficiently," according to Vernon.

Rather than replacing data centers, Vernon believes that quantum computers will augment them, noting that quantum computing does not address the computational problems and applications currently handled by edge clusters. "The types of algorithms that quantum computing across all approaches address are completely different, and they are completely inaccessible by ordinary classical computers as a result of the mathematical structure of the problems to be solved," he pointed out, adding that quantum computing development is not an incremental gain of a slight edge over pre-existing technology. "A good example is the most recent cloud-deployed machine that we built, Borealis, which was able to beat the world's most powerful supercomputer - benchmarked against Fugaku - by many orders of magnitude."

Fundamentally, quantum computing tackles a completely different set of problems that cannot be addressed merely by scaling up data centers. "It opens up application markets that are simply out of reach and will always be out of reach with present day technology," said Vernon.

Given quantum computing's strategic significance, a global race is underway. Regarding Canada's advantage, the Xanadu CTO observed that the country punches above its weight in the ecosystem, especially in workforce development, and a number of stellar physicists and engineers coming out of Canadian universities have been hired directly by Xanadu. As for Taiwan's advantage, Vernon believes that Taiwan is "the Mecca of semiconductor" and will therefore also become a hub of silicon photonics one day, playing a critical role in Xanadu's future supply chain. However, he stressed that the Taiwanese ecosystem has to pay attention to the customization and optimization "that'd be better to happen earlier than later."

For the photonics ecosystems in Taiwan and elsewhere, Vernon believes that silicon nitride and lithium niobate are two extremely important emerging platforms, and Xanadu has been working on them for quite some time. With the help of the Canadian Trade Office in Taipei, Xanadu has been building relationships with multiple foundries and OSATs based in Taiwan, especially those that have been involved in manufacturing Xanadu's devices. "We think Taiwan is perfectly positioned to be a dominant supplier, and what's needed right now is process customization and optimization to ensure the compatibility of the silicon photonics processes, both on the fabrication and packaging sides, with the requirements of photonic quantum computing," indicated Vernon.

"There's a risk of missing out on it if it's not active," he warned, noting that the US and Europe have gained a slight edge in photonic quantum computing, having spent more time and attention on the relevant requirements over the last couple of years. "The time to act is now to make sure Taiwan stays competitive, and there's no better place in the world that has the sorts of existing infrastructure to support this."

Zachary Vernon, CTO Hardware at Xanadu.

Read this article:
Xanadu hardware CTO shares views on why silicon photonics is the ... - DIGITIMES


5 Reasons Why Data Scientists Are Better Paid Than Medical Doctors – DataDrivenInvestor

5 reasons why a full-blown career in data science brings more money than what most doctors make

Photo by National Cancer Institute on Unsplash

During my last semester at the University, I thought a lot about what to do next.

After all, I realized that pure economics is probably as interesting as my latest bank statements.

(Mostly done in spreadsheets and Excel.)

These questions started occupying my mind:

Now, when I look back at it, it was a no-brainer

I liked mathematics, statistics (especially probability), and econometrics.

Also, there were a few programming subjects that I really loved.

All the rest were off the table!

So I got my answer: I liked analyzing data

Step by step, data science appeared to be a natural choice (given these interests).

And yes, I am finally happy where I am.

(No regrets.)

I can tell you now that data science is a mix of a well-paid and interesting occupation.

It's definitely worth considering if you are still making up your mind or thinking of a career shift.

Here are several reasons that can help you make the right choice.

Check out my NEW Gumroad product Learn From a $140k/y Engineer: Full Path To A Well-Paid Tech Career For Beginners

Have you ever dreamed of having a job that everyone else wishes they had?

Well, for years, the data scientist occupation has been amongst the most desired positions

But why?

Companies must continuously improve their businesses to stay competitive.

And how do they do it?

Well, they have to understand and explore the market's demand. (So, they must hire a data scientist.)

Data scientists are necessary to:

Data science is a profession of the future and as such in high demand. According to the U.S. Bureau of Labor Statistics, employment in data science occupations is projected to grow 35 percent from 2022 to 2032.

The market isn't flooded (yet), so it's an excellent environment for beginners (unlike other IT industries).

The demand for data scientists far exceeds the supply, which allows you to find a job even if you are not yet an expert in everything.


Once you learn the data science basics (e.g., Python, R, SQL), you open doors in many industries
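As a taste of those basics, a few lines of plain Python already cover the bread-and-butter descriptive statistics (the salary figures below are made up purely for illustration):

```python
import statistics

# Hypothetical annual salaries (USD) from an imaginary survey - illustrative only.
salaries = [95_000, 103_500, 88_000, 120_000, 99_000]

mean = statistics.mean(salaries)      # average of the sample
median = statistics.median(salaries)  # middle value, robust to outliers
spread = statistics.stdev(salaries)   # sample standard deviation

print(f"mean={mean:.0f} median={median:.0f} stdev={spread:.0f}")
```

The same three summaries are one-liners in R (`mean`, `median`, `sd`) and SQL (`AVG`, `PERCENTILE_CONT`, `STDDEV`), which is why these languages are usually named together as the entry-level toolkit.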

Data science is crucial in various sectors for making good decisions.

So, you can end up working in any of the markets below (a single pass valid for all).

(Although some industry-specific knowledge is required.)

You will surely learn lots of new things and work with experts from other fields.


You will never worry again about whether there is an opening for your occupation. As a data scientist, you can work in any of these jobs.

The median wage in May 2022 was $103,500 per year, $49.76 per hour (U.S. Bureau of Labor Statistics).
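The two BLS figures are consistent with a standard 2,080-hour work year (52 weeks of 40 hours), as a quick check shows:

```python
annual = 103_500          # BLS median annual wage, May 2022
hours_per_year = 52 * 40  # standard full-time year: 2,080 hours

hourly = annual / hours_per_year
print(round(hourly, 2))   # 49.76, matching the quoted hourly figure
```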

Data science is a lucrative career. The more time you invest in studying, the faster your salary will increase to this level.

You can easily find a fully remote job.

How about work and travel? (It's easy to afford with a salary this high.)

Data is a crucial resource today.

And yes, we generate tons of it every second, making it hard not to throw the baby out with the bathwater.

Hence, there is a huge demand for experts who can make sense of these enormous data sets.

If you decide to go down this path

Data science guarantees you a high salary, career growth, and professional development.

With this post, I wanted to tell you all the important advantages of being a data scientist.

So that you can make the best possible decision.

If you start learning now, you can be a competent data scientist much sooner than you think (a matter of months).

Just dont wait!

The sooner you start, the better.

Write a comment if you have any questions here.

Data science rocks!

P.S. This post was written by my girlfriend, a data scientist.




How Coherent Ising Machines work part4(Artificial Intelligence + … – Medium

Authors: Sam Reifenstein, Satoshi Kako, Farad Khoyratee, Timothe Leleu, Yoshihisa Yamamoto

Abstract: We propose a network of open-dissipative quantum oscillators with optical error correction circuits. In the proposed network, the squeezed/anti-squeezed vacuum states of the constituent optical parametric oscillators below the threshold establish quantum correlations through optical mutual coupling, while collective symmetry breaking is induced above the threshold as a decision-making process. This initial search process is followed by a chaotic solution search step facilitated by the optical error correction feedback. As an optical hardware technology, the proposed coherent Ising machine (CIM) has several unique features, such as programmable all-to-all Ising coupling in the optical domain, directional coupling (Jij ≠ Jji) induced chaotic behavior, and low power operation at room temperature. We study the performance of the proposed CIMs and investigate how the performance scales with different problem sizes. The quantum theory of the proposed CIMs can be used as a heuristic algorithm and efficiently implemented on existing digital platforms. This particular algorithm is derived from the truncated Wigner stochastic differential equation. We find that the various CIMs discussed are effective at solving many problem types; however, the optimal algorithm differs depending on the instance. We also find that the proposed optical implementations have the potential for low energy consumption when implemented optically on a thin film LiNbO3 platform.
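The oscillator picture in this abstract can be caricatured classically: each spin is an analog amplitude that bifurcates to a positive or negative value as the pump passes threshold, with the Ising couplings biasing which branch wins. A deliberately simplified, purely classical sketch of that idea (not the paper's quantum model; the pump, coupling, and step values are arbitrary illustrative choices):

```python
import random

def simulate_cim(J, steps=2000, dt=0.01, pump=1.5, coupling=0.6, seed=0):
    """Toy mean-field Ising solver: each oscillator amplitude x_i sits in a
    pitchfork bifurcation (linear gain minus cubic saturation) and is nudged
    by the Ising couplings J[i][j]; sign(x_i) at the end is read as the spin."""
    rng = random.Random(seed)
    n = len(J)
    x = [rng.uniform(-0.01, 0.01) for _ in range(n)]  # tiny random seed amplitudes
    for _ in range(steps):
        # synchronous Euler update: gain/saturation term plus coupling term
        x = [xi + dt * ((pump - 1.0 - xi * xi) * xi
                        + coupling * sum(J[i][j] * x[j] for j in range(n)))
             for i, xi in enumerate(x)]
    return [1 if xi >= 0 else -1 for xi in x]

# Ferromagnetic couplings: all four spins should align.
ferro = [[0, 1, 1, 1], [1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 0]]
print(simulate_cim(ferro))

# Antiferromagnetic pair: the two spins should anti-align.
print(simulate_cim([[0, -1], [-1, 0]]))
```

The aligned (or anti-aligned) configuration emerges because the coupling term gives the energy-minimizing mode the largest gain, which is the intuition behind CIM solution search; the quantum versions studied in the paper add squeezed-state correlations and error-correcting feedback on top of this basic mechanism.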

2. Phase-space simulations of feedback coherent Ising machines (arXiv)

Authors: Simon Kiesewetter, Peter D. Drummond

Abstract: A new technique is demonstrated for carrying out exact positive-P phase-space simulations of the coherent Ising machine quantum computer. By suitable design of the coupling matrix, general hard optimization problems can be solved. Here, computational quantum simulations of a feedback type of photonic parametric network are carried out, which is the implementation of the coherent Ising machine. Results for success rates are obtained using this scalable phase-space algorithm for quantum simulations of quantum feedback devices.



Q-STAR: Advocating quantum technology in the business world – Innovation News Network

Q-STAR (Quantum STrategic industry Alliance for Revolution) was established in Japan in September 2021 to create new industries and business opportunities based on quantum technology. Its members come from various business sectors, including startups, small and medium-sized enterprises, large corporations, and academic institutions.

Q-STAR proactively collaborates with organisations in diverse fields globally, transcending industry and corporate boundaries to develop the quantum technology-related business of the future.

We have set five missions to achieve our goal:

We place more emphasis on creating business with quantum than on academically researching quantum technology.

Our focus is not simply on the implementation of quantum technology but on its creation and the creation of a path to a future that includes peripheral industries. We believe that we can promote the social implementation of quantum by broadening the scope of our activities to include quantum-inspired technologies and hybrid environments with existing computer technologies.

Currently, Q-STAR has six subcommittees and eight working groups. The subcommittees develop use cases, while the working groups supply them with helpful information. Q-STAR now has 87 members (as of November 2023), many of which are user companies. This is Q-STAR's most important feature: these user companies proactively participate in the subcommittees alongside vendor companies to develop use cases.

To date, Q-STAR has discussed over 50 quantum-technology use cases, selected 16 of them to follow up on as the next step, and drawn up an industry roadmap for them.

We use the quantum reference architecture model for industrialisation (QRAMI) as a tool for the standardisation of use cases. QRAMI was inspired by RAMI 4.0 as a model for viewing quantum business. It envisions the entire quantum-related domain on the three axes of domain, architecture, and technology, helping to develop quantum use cases. We aim to make it a global tool, a platform for common understanding, not only within Q-STAR but also with other industrial consortiums around the globe.
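The three QRAMI axes named above (domain, architecture, technology) can be pictured as coordinates for positioning a use case. A loose sketch, with all axis values invented for illustration since the article does not enumerate the model's actual layers:

```python
from dataclasses import dataclass

@dataclass
class QramiUseCase:
    """A use case positioned on QRAMI's three axes (illustrative only)."""
    name: str
    domain: str        # which industry the problem lives in
    architecture: str  # which layer of the stack is concerned
    technology: str    # which computing approach is assumed

# Hypothetical example: a logistics problem targeted at an annealing machine.
routing = QramiUseCase(
    name="Delivery route optimisation",
    domain="logistics",
    architecture="application",
    technology="Ising-type (annealing) machine",
)
print(routing.domain, routing.technology)
```

Two use cases that share coordinates on some axes can then be compared or standardised together, which is the "platform for common understanding" role the consortium describes.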

In this way, putting use cases into practice is the key to bringing quantum technology to society faster. We expect to expand the technology to various industries, such as healthcare, finance, logistics, factories, transportation, etc.

The future we seek is one in which quantum technology solves the issues society faces today, and we are working to find ways to apply it to these problems. Moreover, we aim to lead the world in quantum industry businesses and want to bring quantum technology into the lives of 5-10% of Japanese citizens in the near future.

We consider the following as the main challenges to implementing the technology socially:

Q-STAR emphasises collaboration among sectors such as government, academia, and industry. Recently, Q-STAR member companies, representatives of academia, and national institutes outside the council have been discussing an open software platform that is not dependent on the type of quantum computer. The plan here is to visualise a hierarchy extending from customer issues to various calculation methods, and to build a hypothesis for practical use as a platform.

Moreover, Q-STAR signed MoUs with three overseas consortiums, namely QED-C (USA), QuIC (Europe), and QIC (Canada), to build collaborative relationships.

The quantum industry is still in its infancy. For gate-type quantum computers, it may take years for social implementation to be realised. However, Ising-type computers (also called annealing-type computers) and quantum-inspired computers have already been shifting to the demonstration phase, and cases of their application are beginning to emerge.

Many technologies can be combined with quantum technology. It is Q-STARs role to take these combined technologies and create business opportunities with them.

Please note, this article will also appear in the sixteenth edition of our quarterly publication.



IIScs Foundation for Science Innovation and Development launches Centre of Data for Public Good – The Hindu

In an initiative touted as leveraging data for social good, the Foundation for Science Innovation and Development (FSID) within the Indian Institute of Science (IISc) announced the launch of the Centre of Data for Public Good (CDPG).

The Centre is dedicated to advancing research, innovation, collaboration, and best practices in the realm of data science, analytics, and policy to address critical societal challenges.

According to a release, CDPG will serve as a hub for multidisciplinary research, bringing together experts from academia, industry, and government to harness the power of data to benefit the public. With a focus on ethical data use, privacy, and responsible AI, the centre aims to develop solutions that positively impact areas such as smart cities, agriculture, logistics, geospatial, environmental sustainability, and so on.

Emphasising collaboration and innovation, the centre is set to bring under its umbrella learnings from pioneering projects such as the India Urban Data Exchange (IUDX) and the Agricultural Data Exchange (ADeX).

These projects, with their focus on the urban and agricultural sectors, align seamlessly with the centre's mission. By incorporating these initiatives, the CDPG will leverage the expertise and resources of IUDX and ADeX, creating a collaborative environment that will accelerate the development and implementation of data-centric solutions.

This amalgamation of efforts reflects the Centre's commitment to harnessing the power of data in addressing real-world issues and advancing the field of data science for societal benefit, added the release.

To mark the launch of the centre, IISc hosted the Symposium on Data for Public Good, a flagship event that brought together thought leaders, researchers, and practitioners in the field. Distinguished speakers at the event included Kris Gopalakrishnan, Chairman, Axilor Ventures, Co-founder, Infosys, and President, Infosys Science Foundation; J. Satyanarayana, Chief Advisor, C4IR India, World Economic Forum; Rajendra Kumar, Chief Postmaster General, Karnataka; Kunal Kumar, Joint Secretary and Mission Director, Smart Cities Mission; and Pramod Varma, CTO of Ekstep Foundation.

In addition to panel discussions on urban data, data governance, and agricultural and geospatial data, the event culminated with the announcement of a hackathon focused on a transportation demand prediction problem for specific bus routes in Surat and an air quality prediction problem for certain road segments of Bengaluru.

