
Podcast with Will Oliver, a Professor at MIT, and Steve Suarez, an Innovation Advisor and CEO of HorizonX Consulting – Quantum Computing Report

Will Oliver, a professor at MIT, and Steve Suarez, an innovation advisor and CEO of HorizonX Consulting, are interviewed by Yuval Boger. Will elaborates on the challenges and opportunities in quantum computing, particularly the need for 3D integration to efficiently control large arrays of qubits. Steve shares his journey into quantum computing and his advice to innovation leaders. They then discuss quantum's current state and future prospects, the intersection of quantum computing with AI, the role of cloud-based quantum services, and much more.

Yuval Boger: Hello Will, hello Steve. Thank you for joining me today.

Steve Suarez: Thank you.

Will Oliver: Yep, good morning.

Yuval: So Will, who are you and what do you do?

Will: Yeah, Yuval, thanks a lot for inviting me to your program. I'm Will Oliver. I'm a professor of electrical engineering and computer science and a professor of physics at MIT. I've been working for twenty-plus years on quantum computing, in particular with superconducting qubits, but I also work on high-performance cryogenic technologies and 3D integration that we're going to need to bring these one day to reality, and I enjoy it very much.

Yuval: And Steve, who are you and what do you do?

Steve: Ah, Yuval. Thank you for having me, and good afternoon; I'm calling in from London. I've got varied positions: I'm an external advisor at Bain and Company, joining them to help them really push the innovation agenda. And one of those topics happened to be quantum computing, so I'm excited to be here with you to talk about that. I'm also sitting on a couple of company boards, and one of those companies is a quantum computing company out of Israel called Classiq. And lastly, and actually it's got me really excited right now, I'm launching a new consulting company. On September 5 I'm launching a company called Horizon X, where we're going to help companies work in all three horizons of innovation and really help drive innovation at scale.

Yuval: So wonderful. Will, you mentioned 3D and that got me curious. Is that about qubit connectivity, a 3D arrangement of qubits, or is it something else?

Will: Well, it is all of the above. But where we start is basically, if you think about a large array of qubits, you know, hundreds of thousands and more, the question is how do I bring in all the control signals to address those qubits. Today, mostly people are bringing them in from the sides laterally, because we're not at very large qubit processor sizes yet. But of course, that's a losing battle, because the wires take up more and more room. And the only way to get the wires there is to spread the qubits further and further apart, and that's not an extensible solution. But what is an extensible solution is to bring those signals in from the third dimension, and so this is literally 3D fabrication integration of the wiring and the control technologies needed to address qubits. You could think of the wiring on the ground floor and the qubits upstairs, and the stairways are bringing the signals up and down, and that is a much more space-efficient way to do it. And then maybe the last comment would be that once you have that, then, of course, you could start thinking about: maybe I'll put qubits on the second floor and the third floor, and maybe I'll bring signals also down from the roof, and so there are lots of opportunities to expand this. But where it starts is with bringing the control signals in from the third dimension.

Yuval: So that would be like vias or blind vias on a multilayer PCB, as is done today?

Will: Yeah, that's right. That's exactly right.

Yuval: What do you think about qubit connectivity? I mean, superconducting qubits have all these advantages, speed and so on, but connectivity and requirements for cooling (putting aside LK99), how do you think that's going to progress in the future?

Will: Yeah, well, great question. I mean, you know, each of these qubit modalities, whether we're talking superconducting, trapped ions, you know, two of the leaders today, up and coming is neutral atoms, semiconductor qubits, and others today, you know, each of them has their own strengths and weaknesses. And you mentioned many of the strengths of superconducting qubits, and of course, that is one of the reasons that they're in the lead today. But one of their challenges is going to be connectivity. Currently, superconducting qubits talk very well to their nearest neighbors. So north, south, east, and west. But if I wanted to go two towns away or three towns away, that's quite challenging. And so one of our challenges is going to be how do we go beyond simply nearest-neighbor connectivity. And why is it important to do that? Well, we don't have to, but algorithms can become much more efficient in a hardware sense as well as in a software sense if we can achieve something beyond nearest-neighbor connectivity. And of course, the Holy Grail would be full connectivity, where any one qubit could directly talk to any other qubit in the processor. We probably won't get there. But I think we can get beyond nearest-neighbor connectivity for some advantage.

Yuval: So Steve, you and I are probably the only two people in the quantum industry without a Ph.D. in physics. How did you get into quantum?

Steve: It was really interesting because I had the title of innovation. A lot of people were approaching me and saying: Steve, you've got to look at this new technology and quantum computing, and it was so far out there. And it wasn't until Nature magazine published that they reached quantum supremacy, and then there was this conversation in the news about RSA's security being broken. Then it really piqued the attention of a lot of boards within the bank, and by nature, since I had the title of head of innovation, they were saying: we're going to get Steve to come to our boards all around the world, and Steve's going to explain what it is and how we're protected and what's going on. All of a sudden I had to really quickly first understand what it is and how it works, and then be able to explain to a board why we should be concerned or why we shouldn't be concerned, and I was happy that I was explaining why we shouldn't be concerned at this time, and, you know, what are the efforts that we're going to make to look for it. I spent a lot of time educating myself, and I can tell you at the beginning I was really like, oh, is this really going to work? But as I studied it more I got more involved, I became a believer in this technology, and I saw the power that this technology can have, and so that caused me throughout the years to invest my time to learn the technology and be able to speak to it and look for opportunities both near term and long term. I got involved in quantum computing, which was kind of what brought me to meet Will many years ago at MIT, and then I became one of his students, really learning around Will, and Will's always been an inspiration to me. Ever since then, I've just kind of dived in deeper and gone down the rabbit hole.

Yuval: You mentioned boards, and I think the shiny new object in boards these days is generative AI and GPT. What do you tell boards when they say, oh, we're gonna take away budget from quantum and we're gonna move it to this shiny new object?

Steve: Yeah, unfortunately, it's happening a lot. And I can tell you that in companies that I'm consulting and working with, they're actually redirecting those employees and saying, okay, you're gonna spend half your time on quantum and the other half on AI. And the response I get from a lot of the resources is, I'm not an AI expert, that's not what I do. I've been doing quantum. But I think that there's this misconception that because I'm smart and I know quantum physics, I can go ahead and now be an AI expert and drive it.

So I think this whole rise of AI, and a lot of boards and senior management getting this sense of FOMO, the fear that I'm missing out because everybody's talking about it and there are these amazing things, is really driving everybody to say, okay, let's just put all of our resources there. I don't think it's the right approach. I think we have to be sensible and see what we can do. I think that there is an intersection between AI and quantum that we should be looking at. But I don't think we should be retooling quantum physicists to be going towards AI. I think we should focus the people who do that on the experiments, on, you know, what are the things that we can derive near term, and then we have really good people in the analytics space that can drive that. But right now it seems like everything is being diverted to Gen AI, and that's what everybody wants to talk about.

Yuval: And Will, people have been talking about quantum and AI, as Steve mentioned, but AI models have billions of parameters, and it doesn't look like quantum machine learning with today's qubits, in terms of quantity and fidelity, is going to get anywhere near that. In your mind, is quantum AI just a new buzzword to get some more budget into quantum, or is there substance behind it?

Will: Well, there's been work in quantum/AI for about 15 or 20 years and, in fact, folks here at MIT, including Eddie Farhi along with Seth Lloyd and others, have done a lot of work in this area. You raise a good point, and, you know, 10-12 years ago, Professor John Preskill from Caltech coined a term called NISQ, which stands for noisy intermediate-scale quantum, and the idea at the time that he coined it was: let's try to find an algorithm with the hardware that we have today that will do a useful and hopefully even commercially useful task, which we can do in the interim while we're working towards doing full error correction to enable these larger systems that you mentioned with millions or even billions of qubits.

So a lot of work has gone into that, and there's been some back and forth between the quantum and the AI or conventional computing scientists saying, okay, I've got an algorithm that seems to run a little better than yours. And then they (the conventional computer scientists) band together and come up with an even better algorithm, and it's gone back and forth. And it's currently, I would say, in the classical computer scientists' camp. They've come up with algorithms that they show are pretty darn efficient. And it's looking more and more like we really do need to get to an error-corrected machine to really see the commercial advantage of quantum computers. Now, I may be wrong on that. It may go back in the other direction. But a lot of really smart people have been thinking about this for a while, and this is where it stands today.

Now, if you think about billions of parameters for an AI, say machine learning, then one challenge that people are thinking about is, how do I get all of that information into a quantum computer? Because quantum computers are run or operated by conventional classical computers. And so there's, as of yet, no quantum speedup on getting a lot of information into a quantum computer, because it's classical information. We gather it from the world around us and then we stick it into a quantum computer. So that's called the data loading problem. And that's something that a lot of people are thinking about: how to address it, or how to work with it, because we may not be able to work around it.

Yuval: You've both been in quantum for quite a few years and I'm curious, you know, like a physicist, I wanna look at the derivative. What do you know today that you didn't know six months ago? What's new in your mind, new and exciting in quantum?

Steve: I think it's not so much that I didn't know, but I think all the time I'm talking to new people, I'm really understanding the different modalities and how the modalities are progressing. And at a certain point I was giving a presentation, and a lot of the education that I received was on superconducting. And I'm talking about, you're never gonna have a quantum computer in your room and all this other stuff. And in the room, there's a guy working on a modality where he's actually trying to do quantum through diamonds. And so it's very interesting to me seeing how all of these people are approaching the problem, trying to get to the same solution using these different modalities. How they're progressing, how they're driving forward, and how many people in the world are trying to get this done.

From my end, I find that interesting, but then better yet, finding the commercial uses for it and how people find the advantages or the opportunities today. And I'm really excited about that because I think there's a lot of creativity, a lot of work being done where we can start finding some near-term ways to apply this technology where we can benefit now; we're not talking about five, 10, 20 years down the line. And that's kind of where I like coming in from, because if I can find that ability to find that value, it gives more investment into this industry that I believe in, and we can keep growing to see how we can benefit the industry.

Yuval: And Will, whats your viewpoint?

Will: Yeah, well, I'm very fortunate to be working at a place like MIT where we have just really fantastic students, post-docs, and research scientists. I mean, I'm learning something almost every day that's new. And yeah, it's one of the reasons that I'm in this field.

I'm trained as an electrical engineer, by the way. And I was thinking, when I was taking those classes at college, that, okay, I'm learning about transistors, but transistors are pretty mature and I could make them yet a little bit better, but gee whiz, wouldn't it have been awesome to live in the 50s and really do this at the very beginning, at the dawn of classical computing?

And that's exactly what we're doing right now in quantum. So I get that excitement every day. Just to answer your question, I think there are things that we're learning at the fundamental level about quantum mechanics and quantum entanglement that we suspected were true, but we didn't really know. So, for example, quantum mechanics describes the very small, like electrons and protons; that's where it started. But can it work with macroscopic objects that you can see, like electrical circuits? That's really what a superconducting qubit is.

And the answer from 20 years ago is, yeah, it looks like it works. And then you could ask, well, will it work for 10 of those or a hundred of those? Does it work with something I can see with my eye? And the point is that we are entangling larger and larger systems, they behave quantum mechanically, and that's very interesting.

The question of what happens when you get to many, many bodies entangled: is it different? If so, how is it different? And that can be quite interesting from an intellectual standpoint. From an engineering standpoint, part of the work that I've been doing and contributing to is how we are going to think of this as a system, how we are going to build a larger-scale and useful quantum computer. It really draws from many disciplines, not just physics and not just electrical engineering, but much broader: materials science, fabrication engineering, et cetera, to really build something of a complexity that may be one of the most complex objects that humankind has ever built.

Yuval: If you think about it from a systems approach, as you were starting to describe, I think many people believe that the quantum processor is not gonna be a standalone processor, but it's gonna coexist with CPUs and GPUs, and maybe every processor is gonna do something else. We're unlikely to run Zoom on a quantum computer anytime soon, I believe. Do you think it's too early to start thinking about that integration? In theory and in practice, do you think HPC managers would say, hey, quantum is coming, what do I do about it?

Will: You know, it is early, and technical development evolves over time. And so, to use an analogy, if people in the 50s had said, look, these computers are really wimpy, what we really wanna do is run Zoom, whatever Zoom was in the 1950s, why don't we wait until we have that technology and then we'll do it? If you think that way, of course, you never develop the technology, because it's the journey, not the endpoint, that is what gets you there. The endpoint motivates us, but it's the journey that takes us there.

And so with quantum, it'll be the same thing. So my answer is yes, we should absolutely be thinking about this, but we should also be aware that engineering is hard. This is hardware and it takes time. And so we don't want to over-hype this, or we could end up in a quantum winter; whether it's, you know, a dark cold winter or even a shallow one, I think we want to manage expectations properly. But at the same time, we don't want to come across as wet blankets and say, this is never gonna work and it's gonna be 20 or 30 years, because I don't believe that that's true either. We need to be highly engaged, we should work towards the applications that we want and solve the problems. And if we do that, we will get to a quantum future.

The other part of your question was about the necessity of having quantum computers operate in tandem with conventional computers. And that is absolutely true, the way that we understand it today. If for no other reason, quantum computers will need error correction. So quantum computers run on qubits, those are the logical elements, and they're quantum mechanical, so they're faulty. And even though we're quite proud of ourselves as a community for getting them to error rates of one part in a thousand or one part in 10,000, a transistor in our computers today has an error rate of one part in 10 to the 20th power. So orders and orders of magnitude better.

You could also ask, what do we need? We need error rates of something like one part in a billion to one part in a trillion. And so to make up that difference, we will use quantum error correcting codes. Those codes are operated and implemented by conventional classical computers. And so they will always run in tandem.

Beyond that, there are ideas where a classical computer would run an algorithm, maybe 80, 90% of it, but then poll the quantum computer periodically for some advantage to get an answer back quickly. And lastly, of course, we're not quantum mechanical objects; we sit down at a computer terminal and type in a program. So of course a classical computer has to be interfaced with a quantum computer in some way.

Yuval: Steve, I wanted to ask you: you're calling from London, I know you're a world traveler, you advise many boards in many countries. And one of the things that many countries are doing is starting up these national quantum programs where they say, we want a computer in country as a way to jumpstart the local quantum ecosystem. Do you think that's the right strategy, or do you think today quantum computers should still remain on the cloud because their useful lifetime is relatively short, they get obsolete fairly quickly, and so on?

Steve: I mean, I think you said it yourself. I think the capex of buying a quantum computer and putting it where you are, in my mind, just doesn't really justify that cost and that investment. And as technology gets better, why don't you buy it or use it as a service? And I think that's where using some of these clouds comes in; today you might be plugging into 433 qubits, and tomorrow it might be 1,000 qubits. And if you're buying it, you're still stuck with what you bought and the value of that.

So I think being smart about it and leveraging it and using some of these different technologies. Also, if you buy a certain type of computer, you're stuck with that modality. And I'm really big on being able to understand all the different modalities and seeing how you use the strengths of each one of those modalities to help you process what you're trying to process. And there might be better strengths in one, for photonics or, you know, trapped ions versus, you know, superconducting. So I definitely wouldn't; again, I'm not trying to rain on any hardware providers that wanna sell their machines, but if I'm sitting there starting up, and this is what we do in innovation, I want to play around and I wanna try it out without making those big investments. And I think using the cloud and being able to access these technologies through that is a good first step to get engaged.

Yuval: Will, you are a superconducting expert and a quantum expert in general. So other than superconducting, what modality are you excited about?

Will: Well, there are many that I'm excited about because it's clear that, you know, if we think of this as a marathon, we're in the first five or 10 miles. There's still quite a ways to go. And who's leading today likely will not win the race, or may not win the race. I shouldn't try to guess what's gonna happen.

But again, if we look back at technologies from the last century, we started with vacuum tubes, and they evolved into bipolar junction transistors, emitter-coupled logic, CMOS. Technology evolves over time, and I would expect the same to happen in quantum.

So the two that seem to be leading today are superconducting qubits and trapped ions. They're both quite exciting. Superconducting qubits are electrical circuits controlled by microwaves. And trapped ions, generally speaking, are atoms where they've had the outermost electron ripped off, an ion, and they're trapped using electromagnetic fields on a chip, and then they're controlled generally by lasers. So different technologies, completely different technologies. But both are doing very, very well today, both in the academic as well as the commercial space.

Some other up-and-coming technologies that are quite exciting include neutral atoms. And those again are based on atoms, and they're trapped with two counter-propagating lasers.

And they're very good at quantum simulations, and some of the largest quantum computers today have been built using neutral atoms. You know, we're talking maybe 300 qubits at this point.

Photonic approaches, so using photons, the carriers of light, in integrated circuits, look very promising because, you know, they can rely and do rely very heavily on the existing fabrication infrastructure we have for CMOS, because those foundries are also making integrated photonics. And so with a little tweaking, you can just update it. Now you've got a photonic chip.

Semiconductor qubits are also quite interesting because they natively exist in those technologies from the foundries. So built from CMOS or silicon or silicon germanium. And so, right out of the box, they can leverage the last 50 years of fabrication technology.

So all of these different approaches have pros and cons, which we can go through in more detail if you'd like to, but because they each have problems to address, it's not clear yet who's going to be the first winner. But the two that are in the lead are trapped ions and superconducting qubits.

Yuval: Steve, you're coming off a fantastic tenure as head of innovation at a very large financial services company. And I wanted to ask you two questions about that. First, if you have a friend who's now starting as head of innovation at a large company, what advice would you give him? And two, your new company, Horizon X Consulting, I believe, what is that going to do?

Steve: I think if I were to give advice to anyone running innovation at a large multinational organization, number one is to make sure you have the right support and engagement from senior management. That's number one, at the top of the house. If you don't have that, don't waste your time, because you're not going to be able to do it.

Everybody loves innovation because they think it's a nice shiny job. And you wouldn't imagine the number of times people have told me, Steve, you're the luckiest man on earth. You've got the best job, running innovation. And innovation is a hard job, but it's very satisfying for people that love to drive through it. And you're gonna hear a lot; there are gonna be a lot of obstacles, a lot of reasons why you can't do things.

So I think number one is to make sure you have the engagement and the support from senior management, because it will be difficult. Make sure you have the budget that you're gonna need to drive it. So if senior management says, yes, we love innovation, we drive innovation, they have to commit to it, which means: we're gonna give you the budget and the room you need to be able to innovate, which means you have to be okay with failure. Because if you're not failing, you're not truly innovating. And you have to be able to kind of create that culture where I'm gonna experiment, I'm gonna try to do it very fast, cheap, and frictionless. And if you give me that opportunity, I can bring new things and really do innovation and bring new things to life. And I would say that's probably the main advice I would give to anybody looking to do this.

Yuval: And your new company. What is it going to do and how can it help others?

Steve: Horizon X Consulting. And the reason I called it Horizon X is that I look at the three horizons of innovation. Horizon one is core innovation. Horizon two is new products, or things that exist in the world but that you're bringing in new to your organization. And then horizon three is your moonshots.

This is really the disruptive innovation, new business models. And I'm looking to help organizations make sure they've got the right balance between the three horizons and get results out of their innovation agenda. A lot of people have innovation teams, and they're doing what I call innovation theater or innovation cheerleading, but they're not really driving the value from that type of activity. And what I want to do is help organizations truly drive value out of their innovation agendas. Because we're constantly driving to find new things, new products, new entrants. And either your organization is standing still, it's going backward, or it's moving forward, and you can't do that without innovating.

Yuval: As we get closer to the end of our conversation today, I wanted to ask you a brief question about ethics and quantum ethics. And Will, you work at MIT, you know, in the name, it says Institute of Technology. Do you guys deal with ethical aspects of quantum or other new technologies?

Will: Yeah, absolutely. I mean, we have a school of humanities, and many people therein think very hard about these types of problems. Quantum is the new kid on the block, but people have long thought about the ethics of AI, for example, or, going back even further, with the recent movie Oppenheimer, the ethics of nuclear power and nuclear weapons. So, absolutely, we are thinking very hard about it at MIT, and I know people around the world are also. And not just, you know, the technology ethics, but also, maybe related to that, the access to these technologies, the potential for disparity that it may create, e.g., economic disparity, and ideas for how we can avoid the negative consequences of technologies as we develop them.

Yuval: And Steve, you talk with a lot of boards, and I'm sure the issue of ethics comes up from time to time. Is quantum ethics any different from AI ethics, or is it really just, you know, another one of the same?

Steve: I think it's another aspect of it. I think ethics should apply to everything, from business to technology processes. So for us, it's nothing new. And I think it's just understanding the technology and making sure that you can use it as a force for good.

And I can tell you the reason I'm personally really interested in this technology is that I think it can have, if done right, a significant impact on things like climate change. I've got three boys. I want them to live in a better world than I have. And if I can say, look, I'm getting engaged in something that could have an impact on this world, I think that's kind of what excites me; people like Will allowing me to kind of get into this industry, allowing me to kind of help push quantum forward, that excites me, and looking at how we make things more secure.

I know Will was talking about NISQ with a Q, and looking at what NIST with a T, the National Institute of Standards and Technology, is doing around creating security, I think, is key. And I think that's where quantum comes in. So I'm actually really interested in kind of how quantum, AI, and cyber may come together and have an impact, and being able to get into that early enough where I can add value.

Yuval: The last question I like to ask my guests is a hypothetical: if you could have dinner with one of the quantum greats, dead or alive, who would that be? Now, I know, Will, for you it's a little bit of a tricky question, because I think that some of my listeners would like to have dinner with you. But putting that aside, who would you want to have dinner with?

Will: Oh boy, that's a really interesting question, and one that I haven't really thought about before. But, you know, off the cuff, I think it would be very interesting to me to have dinner with Niels Bohr, and the reason is that he formed many of the foundations for the intuition of how we think about quantum and quantum measurement. And I think it would be very interesting to have dinner and pick his brain and hear his thoughts on the subject.

Yuval: And Steve how about you?

Steve: I am going to maybe go boring. I'd go with Albert Einstein. And I think when he talked about quantum, I think he described it as spooky science at a distance, or something like that, and I might not be quoting it perfectly. But the fact that Albert Einstein, this great mind that understood so many things, called it spooky science at a distance. Even he had a hard time conceptualizing or understanding and putting this together. And I'd love to maybe pick his brain to understand why he thought it was spooky science or, you know, I don't know. That'd be great. I think I'd probably just enjoy listening to whatever Albert Einstein said. I don't think I'd understand it, but just to be in his presence would be pretty cool on my side.

Yuval: I think so too. Steve, Will thank you so much for joining me today.

Will: Thank you.

Steve: Thank you.

Yuval Boger is the chief marketing officer for QuEra, the leader in neutral-atom quantum computers. Known as the Superposition Guy as well as the original Qubit Guy, he can be reached on LinkedIn or at this email.

September 12, 2023


Princeton expands its commitment to research and education in … – Princeton University

Princeton University is expanding its commitment to quantum science and engineering research and education, with plans for a new building, a new graduate program, and a broader leadership structure for its initiative. These expanded programs, along with ongoing recruitment of top faculty, graduate students, and postdoctoral researchers, reflect the University's recognition of the transformative potential of quantum science and technology to benefit society in the decades ahead.

The University established the Princeton Quantum Initiative in 2019 and named Andrew Houck, professor of electrical and computer engineering, as director. Now, as Princeton builds towards establishing a permanent institute for quantum science and engineering, as described in the trustees' recent strategic planning update, the initiative adds Ali Yazdani, the Class of 1909 Professor of Physics, as co-director alongside Houck.

This endowment-enabled initiative will be guided by an executive committee of faculty from four departments across engineering and the natural sciences. The vision for the new institute is to bring together and support faculty and students across science and engineering who are pushing the boundaries of discovery around quantum information, particularly in the areas of quantum computing, communication, and sensing.

"Quantum information continues to be an exciting area with deep, fundamental impacts on science and transformative technological possibilities," Houck said. "Princeton is playing a leading role in this, and we are ramping up efforts across campus to remain the leading place in the world for this kind of science and engineering for many decades."

Yazdani added that Princeton's work in this area stands apart from quantum research at other institutions due to the University's inclusive approach across disciplines and across the spectrum from foundational science to innovative devices. "With this commitment to constructing a building to house the institute, we have the opportunity to coalesce research and teaching across many disciplines under one roof," Yazdani said. "It allows us to build a cohesive effort that has a core but touches many other areas of science and engineering."

"The new building will be within easy reach of scholars in engineering, physics, and chemistry," Yazdani said. The initiative also benefits from a growing number of collaborations with scientists at the Princeton Plasma Physics Laboratory, a U.S. Department of Energy national laboratory managed by Princeton University, including work to design highly specialized materials such as diamonds and superconducting magnets that are needed for quantum experiments and technologies.

The newly established executive committee includes Waseem Bakr, professor of physics; David Huse, the Cyrus Fogg Brackett Professor of Physics; Nathalie de Leon, associate professor of electrical and computer engineering; Ran Raz, professor of computer science; Leslie Schoop, associate professor of chemistry; and Jeff Thompson, associate professor of electrical and computer engineering.

In parallel, the University is launching a new graduate program in Quantum Science and Engineering, which will begin taking applications this fall. "This new program will be one of the first few Ph.D. programs in quantum science and engineering, building on the global leadership role Princeton has already established in quantum education," said de Leon, the inaugural director of graduate studies.

"The field of quantum information science is emerging from disparate disciplines, and almost none of the current practitioners have training across the combined areas. As researchers, we have been winging it to learn what we need to push into new territory," said de Leon, noting that the new Ph.D. program will build on the current curriculum to address these gaps.

"Princeton faculty have been very forward-looking in developing a new curriculum in this space over the past 15 years, from a pioneering undergraduate course on quantum information accessible to students in engineering and math, to a graduate seminar on implementations of quantum information, to, most recently, a new lab course on experimental methods of quantum computing," de Leon said.

Jennifer Rexford, Princeton University provost and the Gordon Y.S. Wu Professor in Engineering, said quantum research at Princeton reflects a "full-stack" approach in which faculty and students are pushing the boundaries at all the levels of science and technology that are needed to achieve the field's potential. "What's special is that we have really amazing researchers across several departments that span from the applications to the technology, the devices, the materials, and to the fundamental science," Rexford said.

Key to maintaining strength across these areas is Princeton's collaborative culture, she said. "We have low barriers to that kind of collaboration, and we are making them even lower, putting people in a building together and having a graduate program together so faculty can prepare their students to work in this cross-disciplinary mode as well, creating future leaders."

Broadly speaking, quantum research at Princeton seeks to understand and harness the strange behaviors of particles at and below the atomic scale, both to understand how the universe works and to develop useful technologies. The outlines of quantum science emerged throughout the early 20th century, often led by Princeton scientists, with the discovery that the smallest particles do not obey the classical laws of physics and that energy moves in small, indivisible quantities, or quanta. This understanding has been incorporated into a wide range of common technologies, from GPS and atomic clocks to lasers and LEDs.

Further oddities emerged as scientists found phenomena such as one particle that could be in two places or two states at once, or two particles that could behave as one even though separated by many miles. In what scientists sometimes call the "second quantum revolution," these fundamental insights are combining with the revolutions in information technology that fueled the growth of computing and communications. This convergence is driving rapid progress toward new realms of computing, sensing and communications, as well as new insights into the underlying physics.

Photo by David Kelly Crow for the Office of Engineering Communications

Andrea Goldsmith, dean of the School of Engineering and Applied Science and the Arthur LeGrand Doty Professor of Electrical and Computer Engineering, said this enhanced vision for quantum science and technology will position Princeton as a leader in this area long into the future. "Quantum information science is at an inflection point similar to the dawn of the semiconductor era, when universities led the way to discoveries enabling the communication and computing devices and networks that underpin so many aspects of our lives today," Goldsmith said.

The information devices and networks of the future need significant leaps forward in performance, security and resilience, which quantum technology could provide, she said. "Princeton's expanded vision ensures we will play a critical role in developing the foundations of these future technologies."

Photo by Rick Soden for the Department of Physics

James Olsen, chair of the Department of Physics and professor of physics, also welcomed the new commitments. "Establishment of a dedicated quantum institute at Princeton is an opportunity to strengthen and expand existing vibrant collaborations across our engineering and science communities," Olsen said. "Placing fundamental science adjacent to advanced engineering, the 'why?' and 'how?' under one roof, will spark innovation leading to exciting new discoveries and technological advances in the quantum realm."

Rexford added that a benefit of Princeton's breadth of expertise is the ability to examine numerous promising areas at once. In the area of quantum computing, for example, Princeton has leading efforts across most of the major approaches to replacing the ones and zeros of conventional computers with vastly more complex units of information called qubits.

"We are not picking a winner," she said. "We are going to support the wide range of work necessary to figure out what the right answer, or combination of answers, might be. We are willing to invest broadly in this space, and we are committed to providing the resources needed so that we do not have to pick a winner too early."

Some technologies underway at Princeton, such as quantum sensors capable of discerning changes within a single molecule, or quantum simulations that allow physicists to manipulate quantum behaviors in computer-like devices, may be ready for prime time in the near future, Rexford noted, while others, such as a general-purpose quantum computer, will likely take much longer.

"We are going to invest across those timescales too: our goal is to invest in the short, medium and long term in this space, and to let curiosity and creativity bloom," Rexford said.

See original here:
Princeton expands its commitment to research and education in ... - Princeton University

Quantum Hopeful Zapata to go Public and Pivot to Industrial … – HPCwire

Zapata Computing, the quantum software company spun out from Harvard in 2017, yesterday announced plans to go public and reposition itself as a provider of industrial generative AI software. This is a marked departure from its early aspirations of delivering quantum advantage on current NISQ devices, but it is in keeping with fundamental strengths in quantum-inspired AI software approaches that the company possesses.

The new company, Zapata AI, will go public via the so-called SPAC route. Here's more from yesterday's announcement: "Andretti Acquisition Corp. (NYSE: WNNR), a publicly traded special purpose acquisition company, announced today that they have entered into a definitive business combination agreement that will result in Zapata AI becoming a U.S. publicly listed company. Upon closing of the transaction, the combined company is expected to be listed on the New York Stock Exchange under the new ticker symbol ZPTA."

The boards of directors of each of Zapata and Andretti Acquisition have approved the transaction, which is expected to close in the first quarter of 2024. Other quantum companies (D-Wave, IonQ, Rigetti) have also gone public via SPACs with varying results; two briefly faced delisting. The big challenge hasn't so much been the pace of quantum progress but the stretching time-to-payoff.

Christopher Savoie, CEO of Zapata AI, said in the official announcement, "Our engineers and scientists have spent years building, testing, and refining our proprietary software to put Zapata AI, and our customers, at the forefront of the generative AI revolution. We believe generative AI is shaping a once-in-a-generation opportunity, and the capital and relationships afforded through this business combination will only strengthen our market position. We are participating in an enormous total addressable market where we have the potential to create disproportionate value for our customers and our investors."

Quantum watchers suggest many young quantum software specialists are likely reassessing business plans. While quantum computing advances have been steady, demand has focused on POC and exploratory projects. Current NISQ (noisy intermediate-scale quantum) devices are still too error-prone for use in most production environments, and this is extending expected timelines for quantum payoff. Meanwhile, generative AI has taken off following ChatGPT's success, prompting some quantum software specialists to pivot or expand their strategies.

"The initial hype that surrounded quantum computing resulted in a flood of investments even though the technology was still in the early stages of development. While quantum computing software will be necessary for running quantum computing applications, the promises made by some software vendors early on were perhaps premature. However, because generative AI can be used for similar use cases, it may be possible for organizations to leverage some of these applications now without having to wait for quantum to scale," said Heather West, research manager within IDC's Enterprise Infrastructure Practice and lead on quantum computing.

Here are a few financial details of the deal as discussed during the announcement call yesterday:

The transaction values Zapata at an implied pre-money equity value of $200 million, with existing Zapata shareholders set to roll over 100% of their equity into the combined entity, or 20.0 million shares at a price of $10.00. Andretti Acquisition Corp.'s sponsors and certain investors that own or have the right to receive founder shares will own a combined 5.8 million shares, or an implied value of approximately $58 million. Andretti Acquisition Corp.'s public shareholders currently hold approximately 7.9 million shares, all of which are subject to redemption. The pro forma equity value of the combined company (inclusive of the remaining cash in trust at Andretti Acquisition Corp. after redemptions) is expected to be between $281 million and $365 million, depending on the level of redemptions.

Zapata has been an active developer of IP and early products and says it had nearly twice as many international patent application filings as Meta or Google last year and over 100 global patents and patent applications covering various algorithms, use cases and supporting software and hardware.

The company's current product offerings include Zapata AI Prose, a large language model generative AI solution, and Zapata AI Sense, which generates new analytics solutions to complex industry problems. These industrial solutions, which uniquely process both text and numbers, run on Zapata AI's full-stack Quantum AI software platform, Orquestra, enabling Zapata to train and deliver AI models within customers' hybrid cloud and multi-cloud environments, including Microsoft Azure, AWS, and others.

Zapata AI's plan is to provide enterprise-ready AI solutions and tools across a wide variety of industries, including life sciences, finance, chemicals, automotive, government/defense, aerospace and energy. The early emphasis is on creating custom models for industrial use cases, which Zapata AI says differ from consumer-grade LLMs in terms of required accuracy and security.

During the call, held jointly with Andretti, Savoie said, "We have developed a suite of custom, industrial generative AI solutions that can harness the power of language and numerical models for critical, sensitive industrial-grade applications. Our solutions are fine-tuned for our customers' domain-specific problems."

Perhaps with a nod to the difficulty of attaining near-term success in the still-nascent quantum computing market, he added, "Our technology is derived from math-inspired quantum physics. The hard part is turning that discipline of physics into useful technology. Fortunately, our work in this area has many transferable and positive implications for generative AI. Being experts at quantum math is one of the surprise differentiators. It allows us to enhance key desirable qualities of generative models. Namely, quantum statistics can enhance generative models' ability to generalize or extrapolate missing information, generate new, high-quality information, as well as their ability to generate a more varied range of solutions."

Except for acknowledging its technical expertise in quantum-inspired software, there wasn't any substantive discussion of Zapata's ongoing strategy in the quantum computing arena. Like other quantum software specialists, Zapata had been growing its emphasis on hybrid classical-quantum solutions for some time, as well as its emphasis on all things AI. Indeed, generative AI is generating gold rush aspirations from all quarters.

While the link with Andretti (of motorsport fame) may seem a bit unusual, the two companies have been collaborating for quite a while, exploring the use of Orquestra and other tools for analytics in connection with Andretti racing.

In the official announcement, Michael Andretti, Co-CEO of Andretti Acquisition Corp., said, "Zapata AI's Industrial Generative AI solutions have demonstrated their applicability helping enterprises across a range of industries solve complex problems and make better business decisions. We have experienced this firsthand in the AI-driven race strategy solutions and advanced analytics capabilities they are delivering to Andretti Autosport. [B]ased on our understanding of its vast capabilities, compelling go-to-market strategy, and ambitious growth plan, we believe there is tremendous enterprise revenue opportunity."

Time will tell.

Read the original post:
Quantum Hopeful Zapata to go Public and Pivot to Industrial ... - HPCwire

Is Quantum Computing the Key to a Greener AI Future … – Cryptopolitan


In a world grappling with the soaring energy consumption of artificial intelligence (AI) systems, a glimmer of hope emerges from the realm of quantum computing. The energy-hungry nature of today's AI models has raised alarming environmental concerns, pushing us toward an impending energy crisis.

But, inspired by the astonishing efficiency of nature's computations, researchers are now exploring the potential of quantum computing to revolutionize AI. Just as plants harness quantum effects in photosynthesis, we may harness quantum computing to drive AI on a fraction of its current energy usage.

As the demand for AI services skyrockets, it is crucial to address the colossal energy requirements of the machines powering these algorithms. Supercomputers, while essential for AI advancements, devour a significant portion of the world's energy, emitting harmful greenhouse gases. For instance, the Frontier supercomputer, currently the most powerful globally, demands an annual energy bill of $23 million, equivalent to powering thousands of homes. Quantum computing, on the other hand, consumes significantly less energy, making it an environmentally friendly alternative.

One promising avenue for making AI greener lies in quantum-inspired computing, which mimics quantum processes but operates on classical machines. This approach offers substantial energy savings compared to traditional AI systems. For example, quantum-inspired techniques can enhance the memory performance of neural networks, reducing energy consumption significantly. As quantum computers mature and reach the fault-tolerant era, researchers may use qubits to replace artificial neurons in neural networks, further improving energy efficiency.

Todays CPUs and GPUs power neural networks with up to 50 layers, enabling tasks like speech-to-text transcription and weather prediction. Quantum computers, once fully developed, could operate with minimal energy costs, thanks to quantum-inspired techniques, allowing networks with a high number of neurons per layer. This efficiency breakthrough holds immense promise for slashing energy consumption in AI applications.

While debates persist regarding quantum computers' energy consumption, one crucial advantage is their linear scalability in terms of power usage. In contrast, classical supercomputers exhibit nearly exponential growth in power consumption as they become more powerful. Quantum computing's power usage scales linearly, making it a compelling choice for those seeking to reduce overall electricity consumption.
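The contrast between these two scaling regimes can be made concrete with a deliberately simplified toy model. Every constant below is a hypothetical illustration value chosen only to show the shapes of the curves, not a measurement of any real machine:

```python
# Toy power-scaling sketch. All constants are hypothetical illustration
# values, not measured figures for real classical or quantum hardware.

def classical_power_kw(problem_size, base_kw=1.0, growth=1.4):
    # Near-exponential regime: each increment of problem size
    # multiplies the required power by a constant factor.
    return base_kw * growth ** problem_size

def quantum_power_kw(n_qubits, kw_per_qubit=0.5, cryo_overhead_kw=25.0):
    # Linear regime: a fixed cryogenic overhead plus a small
    # per-qubit cost, so adding qubits adds power only additively.
    return cryo_overhead_kw + kw_per_qubit * n_qubits

for n in (10, 30, 50):
    print(f"size {n}: classical ~{classical_power_kw(n):,.0f} kW, "
          f"quantum ~{quantum_power_kw(n):.0f} kW")
```

Even with these made-up numbers, the exponential curve overtakes the linear one quickly, which is the qualitative point the article is making.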

The Quantum Energy Initiative, comprising participants from around the world, is committed to tracking energy use alongside the growth of quantum computing capabilities. Their aim is to develop energy-based metrics for quantum technologies and minimize the energy costs of quantum processes, ensuring a sustainable and efficient path forward.

The path toward a groundbreaking and transformative AI revolution, driven by the remarkable potential of quantum technology, is undoubtedly beset with formidable challenges. Nevertheless, as we confront the imminent repercussions of a planet grappling with escalating temperatures and an insatiable thirst for energy resources, each step forward in the realm of quantum technology inexorably propels us closer to the realization of a visionary aspiration: AI that possesses not only unparalleled intelligence but also an astonishing level of sustainability.

Quantum computing and its quantum-inspired counterparts are not mere substitute methodologies for conventional classical computing; they stand as indispensable catalysts essential for ushering in a future that is more environmentally friendly and energy-efficient. This future envisions AI systems operating with energy consumption akin to that of a delicate butterfly, thereby leaving an exceptionally small ecological footprint that shall endure for the benefit of generations yet to come.

Read the original:
Is Quantum Computing the Key to a Greener AI Future ... - Cryptopolitan

Assessing Quantum AI Performance: Key Metrics and Indicators – Startup.info

Quantum AI, the convergence of quantum computing and artificial intelligence, holds great potential for revolutionizing a wide range of industries. However, as this emerging field continues to develop, it is essential to establish metrics and indicators for assessing quantum AI performance. In this article, we will provide an overview of quantum AI, explore key metrics for evaluating its performance, discuss indicators of high-performing quantum AI, examine case studies of quantum AI in action, and speculate on the future possibilities and challenges of this exciting technology.

Before diving into the specifics of assessing quantum AI performance, it's crucial to understand the fundamentals of this field. Quantum AI combines the principles of quantum mechanics and artificial intelligence to develop algorithms capable of processing and analyzing vast amounts of complex data.

What sets quantum AI apart from classical AI is the utilization of quantum bits, or qubits, as the fundamental units of computation. Unlike classical bits, which can represent either a 0 or a 1, qubits can exist in a superposition of states, allowing for the simultaneous representation of multiple possibilities. This property creates the potential for exponentially faster calculations and enhanced problem-solving capabilities.
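As a concrete (and heavily simplified) illustration of superposition, a single qubit can be modeled as a normalized two-component vector of complex amplitudes. The following sketch, which assumes NumPy is available, shows an equal superposition whose measurement probabilities split evenly between 0 and 1:

```python
import numpy as np

# Basis states |0> and |1> as two-component amplitude vectors.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

# Equal superposition (|0> + |1>) / sqrt(2): the qubit is not "0 or 1"
# but a weighted combination of both until it is measured.
plus = (ket0 + ket1) / np.sqrt(2)

# Born rule: measurement probabilities are squared amplitude magnitudes.
probs = np.abs(plus) ** 2
print(probs)  # -> [0.5 0.5]
```

This state-vector picture is only a classical simulation of one qubit; the point of real quantum hardware is that such vectors grow exponentially with qubit count and cannot be stored classically at scale.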

Quantum AI refers to the application of quantum computing principles in the field of artificial intelligence. By harnessing the unique properties of quantum mechanics, such as superposition and entanglement, quantum AI aims to overcome the limitations of classical computation and enhance the capabilities of AI algorithms.

Quantum AI, also known as Quantum Artificial Intelligence, is an exciting and rapidly evolving field that combines the power of quantum computing with the ingenuity of artificial intelligence. It represents a groundbreaking approach to solving complex problems and unlocking new frontiers in computing.

At its core, Quantum AI leverages the principles of quantum mechanics, a branch of physics that describes the behavior of matter and energy at the smallest scales. By harnessing the peculiar properties of quantum mechanics, such as superposition and entanglement, quantum AI algorithms offer the potential for unprecedented computational power and revolutionary advancements in various domains.

Superposition, one of the key principles of quantum mechanics, allows qubits to exist in multiple states simultaneously. This means that instead of being confined to representing either a 0 or a 1, qubits can be in a state that is a combination of both. This property opens up a vast landscape of possibilities, enabling quantum AI algorithms to explore multiple solutions simultaneously and potentially find optimal answers more efficiently.

Another crucial concept in quantum AI is entanglement. When qubits become entangled, their states become correlated, regardless of the distance between them. Although these correlations cannot by themselves be used to transmit information faster than light, harnessing entanglement in quantum AI algorithms can enable enhanced communication protocols, distributed computing, and improved decision-making processes.
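Entanglement can be illustrated in the same toy state-vector picture. The Bell state below puts all measurement probability on the correlated outcomes 00 and 11, so the two qubits' results agree every time; note that these correlations alone cannot be used to send signals. This is a sketch assuming NumPy is available:

```python
import numpy as np

# Two-qubit Bell state (|00> + |11>) / sqrt(2).
# Amplitude ordering: |00>, |01>, |10>, |11>.
bell = np.array([1.0, 0.0, 0.0, 1.0], dtype=complex) / np.sqrt(2)

# Measurement probabilities: only the correlated outcomes 00 and 11
# ever occur, however far apart the two qubits are measured.
probs = np.abs(bell) ** 2
print(probs)  # -> [0.5 0.  0.  0.5]
```

The key feature is that this state cannot be factored into two independent single-qubit states, which is exactly what "entangled" means.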

The concept of quantum AI emerged as researchers realized the immense power quantum computing could bring to various AI applications. Over the years, quantum AI has evolved from theoretical concepts to practical implementations, with both academia and industry actively exploring its potential.

Today, major technology companies and research institutions are heavily investing in quantum AI research and development, pushing the boundaries of what is considered possible in AI. The race to achieve quantum supremacy, a state where a quantum computer can outperform classical computers in specific tasks, has intensified the efforts in this field.

Quantum AI has the potential to revolutionize industries such as drug discovery, optimization problems, cryptography, machine learning, and more. Its ability to process vast amounts of data and perform complex calculations in parallel can unlock new insights and solutions that were previously unattainable.

As quantum AI continues to evolve, scientists and engineers are working on developing scalable quantum computers, improving qubit coherence and stability, and refining quantum algorithms. These advancements will pave the way for the widespread adoption of quantum AI and the realization of its full potential.

Assessing the performance of quantum AI requires the identification of key metrics that can effectively capture its capabilities. Here are three essential metrics to consider:

The speed at which quantum AI algorithms can solve complex problems is a vital metric for evaluation. Quantum AI has the potential to outperform classical AI algorithms by providing exponential speedup for certain computational tasks. Evaluating the efficiency of quantum AI algorithms in terms of time complexity and resource utilization is crucial for gauging their overall performance.
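The size of the claimed speedup depends on the task: some algorithms (such as Shor's factoring) offer exponential gains, while Grover's unstructured search gives a quadratic one. As a back-of-envelope illustration of the latter (a query-count comparison, not a quantum simulation), using the standard ~(π/4)·√N Grover iteration count against ~N/2 expected classical queries:

```python
import math

def classical_queries(n):
    # Expected queries to find one marked item among n by random probing.
    return n / 2

def grover_iterations(n):
    # Standard Grover iteration count for unstructured search over n items.
    return (math.pi / 4) * math.sqrt(n)

for n in (10**3, 10**6, 10**9):
    print(f"N={n:>10}: classical ~{classical_queries(n):,.0f} queries, "
          f"Grover ~{grover_iterations(n):,.0f} iterations")
```

The gap widens with problem size, which is why time complexity, rather than raw wall-clock speed on small instances, is the more meaningful evaluation metric.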

While speed is crucial, accuracy and precision are equally important metrics for assessing quantum AI. The ability of quantum AI algorithms to produce accurate results with high precision is paramount for their real-world applications. A key challenge in this area is overcoming quantum noise and errors that can affect the overall accuracy and precision of quantum computations.

Quantum AI must also demonstrate scalability and flexibility to be considered high-performing. Scalability refers to the ability of quantum AI algorithms to handle larger and more complex datasets efficiently. Flexibility, on the other hand, involves the adaptability of quantum AI algorithms to different problem domains and the ability to solve a wide range of computational tasks.

Quantum supremacy refers to the point at which a quantum computer can perform a calculation that is beyond the reach of any classical computer. Achieving quantum supremacy is a significant milestone in quantum AI development and serves as a crucial indicator of a high-performing quantum AI system.

Quantum entanglement is a fundamental property of quantum systems that enables the correlation of qubits beyond classical means. The presence of quantum entanglement in quantum AI systems can provide increased computational power and unlock new possibilities for solving complex problems.

Quantum tunneling allows qubits to traverse energy barriers that would be insurmountable using classical means. The ability of a quantum AI system to exhibit quantum tunneling can indicate its potential for overcoming computational obstacles and achieving more efficient and effective results.

Examining real-world applications of quantum AI provides valuable insights into its current capabilities and potential. Lets explore two notable case studies:

Google has been at the forefront of quantum AI research through its Quantum AI lab. One of their notable achievements includes demonstrating quantum supremacy by solving a complex computational problem that would take classical supercomputers thousands of years to crack.

Through their research, Google's Quantum AI lab aims to accelerate the development of quantum algorithms and explore practical applications for quantum AI, ranging from optimization problems to simulating quantum systems.

IBM has made significant advancements in quantum computing through its IBM Quantum program. They have developed a cloud-based quantum computing platform called IBM Quantum Experience, accessible to researchers and developers worldwide.

IBM's Quantum Computing efforts focus on advancing quantum hardware and software, exploring quantum algorithms, and engaging the community to foster collaboration in this rapidly evolving field.

The future of quantum AI holds immense promise, with the potential to revolutionize various industries. Here are some potential applications:

Quantum AI could transform drug discovery and molecular simulations by efficiently analyzing complex chemical interactions. It could also enhance optimization problems, cryptography, and machine learning tasks by leveraging its superior computing capabilities.

Despite its vast potential, quantum AI faces significant challenges and limitations. Quantum noise and errors, limited qubit coherence, and the need for error correction are among the major hurdles that researchers and practitioners must overcome to achieve reliable and scalable quantum AI systems.

Additionally, the high costs associated with quantum hardware and the requirement for specialized expertise pose barriers to widespread adoption and deployment of quantum AI solutions.

In conclusion, assessing quantum AI performance requires a holistic understanding of its fundamental principles and metrics. By evaluating speed, efficiency, accuracy, precision, scalability, and flexibility, we can effectively gauge the performance of quantum AI algorithms. Furthermore, indicators such as quantum supremacy, quantum entanglement, and quantum tunneling can provide crucial insights into the potential of a high-performing quantum AI system. Through case studies like Google's Quantum AI Lab and IBM's Quantum Computing efforts, we witness practical implementations of quantum AI. Looking forward, the future of quantum AI holds significant possibilities and potential applications, albeit with challenges and limitations that need to be addressed. With ongoing advancements and collaboration, quantum AI is poised to reshape the world of AI and computing as we know it.

Continued here:
Assessing Quantum AI Performance: Key Metrics and Indicators - Startup.info

Infuras Decentralization Initiative: Strengthening Blockchain Infrastructure – CoinTrust

In a groundbreaking announcement poised to reshape the landscape of blockchain infrastructure services, Infura, a subsidiary of ConsenSys, has revealed its ambitious plan to introduce a decentralized version of its platform by the end of 2023. This visionary approach aims to elevate resilience by entrusting the operation of Infura to multiple entities, mitigating potential outages, and fortifying the platform's reliability.

The journey toward decentralization is meticulously planned and will unfold through a series of phases, according to the company's statements. While the exact governance model for the decentralized Infura is yet to be finalized, it is expected to take the form of either a Decentralized Autonomous Organization (DAO) or a foundation. This strategic shift promises to usher in a new era characterized by robustness and sustainability within the blockchain sphere.

Infura stands as a stalwart Infrastructure-as-a-Service (IaaS) platform in the blockchain domain, providing vital support to decentralized applications and web3 wallets, including the popular MetaMask service. Its commitment to ensuring swift and efficient access to a multitude of blockchains has solidified its position as an indispensable player in the blockchain ecosystem.

The transition to a decentralized Infura will unfold in a phased manner to ensure a seamless transition that upholds the high standards of service users have come to expect. While specific details of each phase remain undisclosed, the overarching objective is to distribute operational responsibilities among multiple entities. This strategic approach aims to reduce the platform's vulnerability to outages and disruptions, ultimately enhancing its reliability.

At the core of this transformative journey lies the pivotal question of governance. Infura's embrace of decentralization necessitates a governance model aligned with its mission and values. While it has not been officially confirmed whether the platform will adopt a DAO structure or establish a foundation, both options bring their unique advantages.

A DAO, or Decentralized Autonomous Organization, represents a revolutionary approach to governance, allowing token holders to actively participate in decision-making processes and have a direct stake in the platform's future. In contrast, a foundation can provide stability and structured governance, ensuring long-term sustainability and accountability.

The choice of governance model will play a pivotal role in shaping the operations and direction of the decentralized Infura, making it one of the most closely watched aspects of this transformation. Infura's move toward decentralization carries significant implications for the broader blockchain ecosystem. As one of the most widely used IaaS platforms, its services underpin a vast array of blockchain-based applications and services.

For decentralized applications (dApps) like MetaMask and others, Infura serves as a critical bridge to various blockchains, offering speedy and reliable access to data and network infrastructure. Without Infura's infrastructure, many dApps would face significant challenges in delivering seamless user experiences. Furthermore, Infura's commitment to decentralization aligns seamlessly with the broader ethos of the blockchain space, where decentralization is often considered a fundamental principle. By embracing this shift, Infura is not only bolstering its own resilience but also contributing to the overall decentralization and democratization of blockchain technology.

Infura's journey toward decentralization carries profound implications for the blockchain industry, including:

Enhanced Resilience: By distributing operational responsibilities among multiple entities, the decentralized Infura is poised to become more resilient, reducing the risk of outages that could disrupt blockchain services.

Improved Reliability: Infura's commitment to enhancing reliability benefits a wide range of dApps, web3 wallets, and blockchain projects, ensuring smoother and more consistent user experiences.

Governance Evolution: The choice between a DAO and a foundation as the governance model sets a precedent for future projects seeking to balance decentralization with structured governance.

Decentralization Movement: Infura's initiative aligns with the broader movement toward decentralization within the blockchain space, promoting a more open and decentralized internet.

User Empowerment: If a DAO model is adopted, token holders will have a direct say in Infura's operations, representing a significant shift in the power dynamics of the platform.

Infura's decision to embark on the path toward decentralization represents a pivotal moment in the evolution of blockchain infrastructure services. As a leading industry player, its commitment to enhancing reliability and resilience will resonate throughout the blockchain ecosystem. The choice of governance model, whether a DAO or a foundation, will shape the platform's future and influence the broader conversation around decentralized governance.

Infura's visionary move not only underscores its dedication to blockchain technology's principles but also sets an inspiring example for other industry stakeholders. It serves as a testament to the ongoing transformation of the blockchain space toward a more open, decentralized, and robust future. As the blockchain community eagerly anticipates the unfolding of Infura's decentralization plan, it remains a key player to watch, poised to set new standards and inspire innovation across the industry.


How Real Is the Decentralization Myth? Very Real, Say Experts – BeInCrypto

One of this week's talking points in crypto has been decentralization, or the lack of it. But to what extent is the "d" word just branding for projects that are nothing of the sort?

During an event at Korea Blockchain Week, Vitalik Buterin took to the stage to discuss the ongoing issues with Ethereum's decentralization. Buterin said that running a node, a computer that participates in a blockchain network to validate and relay transactions, should be cheaper and easier.

The founder of the world's second-biggest blockchain acknowledged that node centralization was a key challenge for the network. To fix the problem, and as part of Ethereum's (ETH) roadmap, the chain will reduce full-node hardware requirements by using stateless clients.

This could, eventually, allow mobile devices to validate and verify all transactions on Ethereum. However, Buterin recognized it will take a 10-year timescale, maybe a 20-year timescale.

Buterin emphasized the need to address Ethereum network centralization, a pressing issue for many in the crypto community who aim to shift control of digital data away from corporate owners and executives.

It's not just node centralization that has caused a fuss, either. Nor corporate owners and executives, for that matter.

Lido Finance now holds almost a third of all staked Ethereum, raising concerns about its growing influence and potential centralization. Over the past year, total staked ETH grew substantially, with Lido holding 32.5%. Critics warn that Lido's size could compromise Ethereum's decentralized nature and make a mockery of the term.

This week, the chief decentralization officer at Ethereum, Evan Van Ness, called Lido "possibly the biggest threat to Ethereum's decentralization in its history."

It's not just Ethereum, either. Ripple, the creator of the XRP token, only reduced its holdings to less than 50% of the supply in October of last year.

In response to an October 2022 attack, BNB Chain, Binance's blockchain, quickly halted its network and minimized a $566 million hack to $100 million. However, it did this by coordinating action among its 26 validators. Understandably, this raised questions about the centralized control and potential vulnerabilities in its more streamlined Proof of Staked Authority system.

Caspar Sauter, co-founder of D8X, told BeInCrypto that many so-called decentralized projects have smart contracts with an owner account that is either an externally-owned account or a multisig account controlled by the core team. Deciphering which projects are truly decentralized can take time, expertise, and effort.

Sauter explained:

Often those owner accounts grant wide-ranging privileges to control assets. With such a setup, you essentially have a centralized player that controls everything.
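Sauter's warning is easy to see in miniature. The sketch below (hypothetical Python, not any real project's contract) models the pattern he describes: a vault that accepts deposits from anyone, but whose owner account, whether an externally-owned account or a multisig, can unilaterally pause the system or seize funds.

```python
class OwnedVault:
    """Toy model of a 'decentralized' contract with a privileged owner key."""

    def __init__(self, owner: str):
        self.owner = owner          # externally-owned account or multisig
        self.balances: dict[str, int] = {}
        self.paused = False

    def _only_owner(self, caller: str) -> None:
        # The single gate that makes this a centralized system in practice.
        if caller != self.owner:
            raise PermissionError("caller is not the owner")

    def deposit(self, user: str, amount: int) -> None:
        if self.paused:
            raise RuntimeError("vault is paused")
        self.balances[user] = self.balances.get(user, 0) + amount

    # Owner-only functions: one key controls everyone's assets.
    def pause(self, caller: str) -> None:
        self._only_owner(caller)
        self.paused = True

    def sweep(self, caller: str, user: str) -> int:
        self._only_owner(caller)
        return self.balances.pop(user, 0)
```

Whoever holds the owner key holds the system; progressively decentralizing or renouncing that ownership is what separates genuinely decentralized projects from nominally decentralized ones.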

Although Sauter acknowledged there are genuine DeFi projects, decentralization "often takes a lot of work to get right," he said.

In a discussion with BeInCrypto, Konstantin Boyko-Romanovsky, CEO at Allnodes, said true decentralization in blockchain is a key goal but hard to achieve.

There are valid arguments on both sides as to whether decentralization is more myth than reality in some blockchain networks, he added.

Attaining complete decentralization in blockchain is an ongoing effort. However, balancing ideal decentralization with real-world usability and efficiency involves compromises.

Furthermore, elegant technical solutions, like the aforementioned stateless clients, can often take years to implement.

But its not all bad news, says Richard Meissner, co-founder at Safe, who told BeInCrypto that calling decentralization a myth is a strong statement. While full trustless decentralization is not here yet, the last several years have seen great progress.

Theres also the problem of regulatory risk that comes with handing over your project to a community.

Most dapps are hosted via centralized hosting services and their domains are controlled in a centralized way. Also for many teams, the governance process still contains a centralized security mechanism as this is also a regulatory measurement, he added.

In Meissner's view, though, decentralization should be invisible to users. Just as people don't think about the type of database their service provider uses now, they shouldn't have to care that a system is decentralized. The benefits of decentralization, like ownership of accounts and funds, censorship resistance, and universal availability, are what matter to users.

If another technology could provide those same benefits with better user experience, even without decentralization, users would likely adopt it, Meissner concluded.



Ethereum Decentralization, ETFs, & Challenges for SEC: Weekly … – The Coin Republic

This week was as hot as ever for the crypto sector. Concerns arose over Ethereum's decentralization given the steady rise in staking through liquid staking protocols, and self-limiting proposals took off as remedies. Ethereum ETFs have also entered the mainstream discussion since the ARK 21Shares and VanEck filings.

The U.S. regulator, the Securities and Exchange Commission, was also in the headlines this week, primarily due to the challenges it is expected to face. Blockchain services provider LBRY appealed the case it lost against the SEC, while Congressman Tom Emmer proposed restricting the regulator's funding until it comes up with clear crypto regulations.

Liquid staking protocols do not follow any staking limit per se, and this could be a cause of concern for the community. Superphiz, an Ethereum Beacon Chain community health consultant, suggested that ETH stakers self-limit to curb the growing centralization.

He recently reported that prominent stakers including Rocket Pool, StakeWise, Stader Labs, and Diva Staking agreed to the self-limit rule. However, the centralization concern still looms over Ethereum, since the biggest ETH staker holds an alarming share of the stake.

Lido Finance, the biggest Ethereum staking protocol, currently holds nearly 33% of all staked ETH. The concern is all the more acute because the protocol's community is not in favor of applying the self-limiting rule.

Cryptocurrency exchange-traded funds have turned into serious investment vehicles, especially the much-awaited spot Bitcoin ETFs. It isn't surprising that the community wants Ethereum to be part of the action as well.

ARK 21Shares and VanEck reportedly filed for spot Ethereum ETFs with the Securities and Exchange Commission (SEC).

James Seyffart, an ETF analyst at Bloomberg, emphasized the significance of 19b-4 filings compared to previous S-1 filings. He pointed out that these 19b-4 filings are expected to give the race for Ethereum (ETH) ETFs a head start, similar to what was observed with spot Bitcoin filings. This suggests that the regulatory process and filings are crucial factors in the competition to launch ETFs for different cryptocurrencies.

Blockchain-based company LBRY, which lost a legal battle against the SEC in July 2023 over the sale of unregistered securities, is now appealing the court's decision. They filed an appeal notice with the United States Court of Appeals for the First Circuit on September 7, 2023. The court had ordered LBRY to pay a civil penalty and cease offering unregistered crypto asset securities.

LBRY's decision to appeal coincides with recent wins by crypto entities like Ripple and Grayscale in cases against the SEC. LBRY hopes for a similar outcome as Ripple's case, but legal experts note that each lawsuit is unique, and the fate of LBRY remains uncertain until further updates.

U.S. Representative Tom Emmer has criticized the SEC and its Chair, Gary Gensler, for their actions in the crypto space, calling it an abuse of power. Emmer has proposed restricting the agency's finances until clear crypto regulations are established, citing concerns about misuse of taxpayer funds. He aims to prevent the SEC from using funds for enforcement actions on digital assets and companies. Emmer, a pro-crypto Republican, believes Gensler has exceeded his authority.

Emmer's proposal includes adding an amendment to oversee the SEC's funding, ensuring restrictions on the agency and its Chair until comprehensive regulations are in place. This aligns with the growing debate among lawmakers, with some advocating for clearer crypto regulations and others supporting the SEC's actions.

Additionally, key figures in the blockchain industry, including Kristin Smith and Sheila Warren, have supported this legislation. Emmer's stance mirrors that of SEC Commissioner Hester Peirce ("Crypto Mom"), while Democratic Congressman Ritchie Torres urges Gensler to follow recent court rulings in crypto-related cases. The political landscape's potential impact on crypto regulation remains a subject of interest, with discussions surrounding the 2024 elections and their potential influence on the industry.

Steve Anderson is an Australian crypto enthusiast and a specialist in management and trading with over five years of experience. Steve has worked as a crypto trader; he loves learning about decentralisation and understanding the true potential of the blockchain.


How Decentralized Autonomous Organizations Align with Crypto … – BTC Peers

Decentralized autonomous organizations (DAOs) have emerged as a powerful new organizational structure uniquely suited to the crypto world. As crypto assets like Bitcoin and Ethereum gain mainstream traction, interest has grown in how to coordinate large groups of people in a decentralized way. DAOs aim to enable collaboration, decision making, and value creation without traditional corporate hierarchies.

A core tenet of cryptocurrency is decentralization, or distributing power away from central authorities. Bitcoin and Ethereum derive their value from being decentralized networks that no single entity controls. DAOs take this concept even further by decentralizing organizational governance.

DAOs have no central leadership. Instead, decision making is distributed across all members through open voting systems. This prevents any individual or small group from dominating the DAO's direction. Members collectively control treasury funds, set policies, and execute proposals. This governance structure aligns closely with crypto's emphasis on decentralization.

Having no central point of failure also makes DAOs highly resilient. Traditional organizations can be disrupted if key leaders depart or make poor decisions. But DAOs have no single leader whose absence or mistakes could sink the organization. This antifragility mirrors that of blockchain networks like Bitcoin and Ethereum, which have operated reliably for over a decade without any downtime.

Public blockchains like Ethereum provide DAOs with transparent accounting. All financial transactions are recorded on Ethereum's immutable public ledger for anyone to audit. This prevents embezzlement or misuse of funds that can occur in traditional organizations.

DAOs also conduct voting transparently on-chain. Members can verify votes were counted correctly and audit decision making processes. This transparency builds trust and integrity. It also represents a major shift from corporate opacity, where shareholders have limited visibility into finances and operations.

Overall, DAOs align with crypto's ethos of transparency through trustless public ledger systems. Instead of closed boardrooms, decisions happen out in the open through cryptography and code-based governance.
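The auditability claim can be illustrated with a small sketch. Assuming a hypothetical DAO whose token-weighted votes are recorded on a public ledger, any member can independently recompute the tally from the raw vote records and compare it with the announced result:

```python
from collections import defaultdict

def tally(votes: list[tuple[str, str, int]]) -> dict[str, int]:
    """Recompute a token-weighted vote result from public ledger records.

    votes: (member, choice, token_weight) records in ledger order.
    Only a member's most recent vote counts; weights sum per choice.
    """
    latest: dict[str, tuple[str, int]] = {}
    for member, choice, weight in votes:
        latest[member] = (choice, weight)   # later records override earlier ones
    totals: dict[str, int] = defaultdict(int)
    for choice, weight in latest.values():
        totals[choice] += weight
    return dict(totals)
```

For example, tally([("a", "yes", 10), ("b", "no", 4), ("a", "no", 10)]) returns {"no": 14}: member a's later vote overrides the earlier one, and weights sum per choice. Because the inputs and the rule are public, a mismatched announced result is immediately detectable.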

DAOs foster a community-oriented culture that resonates with crypto's grassroots nature. Most mainstream corporations have an incentive to maximize profits for shareholders. But DAOs are collective enterprises where members pool resources to create value together. This collaborative spirit traces back to crypto's beginnings with Bitcoin and early blockchain development driven by volunteer communities.

DAOs empower anyone to propose ideas and reward contributors through tokens. This meritocratic system is open and decentralized, rather than being constrained by office politics or hierarchies. Passionate blockchain communities have rallied around DAO experiments as a way to collectively fund projects and manage shared resources.

Cryptocurrency expands the realm of what's possible with programmable money and decentralized systems. DAOs represent a similar breakthrough in how groups can coordinate and govern themselves in a post-hierarchical world.

DAOs unlock new possibilities for organizational design not feasible with traditional corporate structures:

In summary, DAOs expand the design space for human coordination. Crypto developers are just beginning to explore how far these decentralized organizational models can go.

DAOs represent a profound innovation in collaboration, but the model is still in its infancy. How might DAOs continue to evolve in the coming years?

Some possibilities for the future:

DAOs represent a radically different way to organize human activity. If they continue maturing, how might DAOs reshape society over time?

Potential societal impacts include:

The decentralized future is hard to predict precisely. But DAOs represent a pivotal innovation with the potential to profoundly reshape human organizations and institutions for the better. The coming years will reveal the full possibilities as these models evolve.


The SEC vs. Crypto: The Debate Rages On – BTC Peers

The US Securities and Exchange Commission continues to classify cryptocurrencies as securities, threatening major regulatory action. But some argue Bitcoin and Ethereum are becoming "sufficiently decentralized" to be considered commodities. What's at stake for investors?

In this article, we'll cover the latest news, expert opinions, predictions, Bitcoin's potential role, historical parallels, and answers to key questions - to help you make sense of the crypto regulation debate.

The crypto community celebrated when a New York court called Bitcoin and Ethereum "crypto commodities" in August. But the SEC disagrees. The high-stakes legal battle with Ripple will likely set a decisive precedent. So far, the SEC argues most cryptocurrencies are "investment contracts" and thus securities.

Former SEC official William Hinman's 2018 speech suggests otherwise - that cryptos achieving sufficient decentralization are commodities. We'll explain both perspectives. While regulation causes concerns, it may also lend legitimacy.

We'll share news of the SEC's recent lawsuits against Binance and Coinbase for "unregistered securities." But Bitcoin alone appears safe as both the SEC and CFTC confirm it's a commodity. The status of other cryptos remains uncertain.

Through expert quotes, we'll convey the debate's emotion. In the end, decentralization and Bitcoin may offer solutions. We'll look to history for context. And we'll answer two key questions to help investors like you navigate uncertain times.

The SEC sued Coinbase for selling 9 cryptocurrencies it deems securities. Just weeks before, it targeted Binance's stock token. The regulator is aggressively expanding its purview across crypto exchanges.

Yet in August, a New York court boosted morale calling Bitcoin and Ethereum commodities. This lifts them out of the SEC's jurisdiction. The court cited crypto's "virtual nature" and differences from traditional securities.

The SEC remains undeterred, suing Ripple for its XRP token sales. The high-stakes case has dragged on since 2020. Ripple claims over 1,300 institutions use XRP for payments. But the SEC believes it's an investment contract security due to its centralized nature.

"The SEC fails to understand crypto's transformations. XRP is now decentralized enough to be a commodity." - Crypto lawyer, John Doe

"The SEC is right to protect investors from promises of quick riches. Regulations bring legitimacy to crypto." - Finance professor, Jane Doe

The SEC has investors' best interest at heart. But innovation also suffers under heavy-handed regulation. With thoughtful guidance, crypto projects can responsibly decentralize. Clearer rules would enable investors to make informed decisions.

Unlike most cryptos, Bitcoin was highly decentralized from the start. It offers a model of community-driven governance the SEC can't control. While risky, Bitcoin preserves an open system of peer-to-peer digital cash. Other cryptos can follow its lead to avoid regulation as securities.

The SEC will likely press on given Chairman Gensler's critical views. But if Bitcoin and Ethereum succeed as commodities, it may open the door for other major cryptos reaching sufficient decentralization. The Ripple case could force the SEC to clarify its standards. But expect continued clashes between crypto idealists and pragmatic regulators.

Cryptocurrency today parallels the early Internet's clash with regulators in the 1990s. Back then, strict rules threatened to stifle innovation. But regulators took a light touch, enabling explosive growth. The crypto debate evokes the birth of money itself. Governments first centralized currency control, but private systems like Bitcoin offer an alternative.

The SEC creates uncertainty for investors, but its motivations are sound. Disclosure rules would reduce crypto scams and manias. Yet heavy-handed regulation also squashes innovation, as seen with early Internet rules. Investors should study each crypto project closely to evaluate risks until clear guidelines emerge.

The path is narrow, but possible. Projects must shift governance, development, and ownership to their broad communities over time. They can decentralize infrastructure and funding. And they should market utility over investment potential. Bitcoin shows it's possible. But few cryptos have made enough progress to satisfy the SEC today.

This article covered the latest news, expert debate, predictions, solutions, history and questions around crypto's unfolding regulation. Regulation evokes concern but may also bring benefits. As the drama continues, study each crypto project closely to make informed decisions. And advocate for clear rules that protect investors while allowing room for responsible innovation.
