Israeli scientist Avi Wigderson wins prestigious A.M. Turing Award – The Times of Israel

Israeli computer scientist Avi Wigderson is awarded the prestigious A.M. Turing Award.

Wigderson is hailed by the Association for Computing Machinery, the organization that oversees the prize, "for reshaping our understanding of the role of randomness in computation, and for decades of intellectual leadership in theoretical computer science."

Wigderson has been a leading figure in areas including computational complexity theory, algorithms and optimization, randomness and cryptography, parallel and distributed computation, combinatorics, and graph theory, as well as connections between theoretical computer science and mathematics and science, the ACM noted.

Turing Award recipients receive a $1 million prize, funded by Google.

Haifa's Technion also lauds Wigderson, an alumnus of the university.

After doing his undergrad at the Technion, Wigderson went on to do his MA and PhD at Princeton and is now a researcher at the Institute for Advanced Study in Princeton, New Jersey.

He has long maintained ties with the Technion and in June of 2023 was awarded an honorary doctorate there "for his significant contribution and leadership in the fields of computer science theory and discrete mathematics and in gratitude for his long-standing relationship with the Technion, beginning with his undergraduate studies," the Technion says.

Wigderson, born in Haifa in 1956, has had a prolific and varied career in the field of computer science, with hundreds of peer-reviewed articles to his credit and numerous other publications.

The Turing Award, also called The Nobel Prize of Computing, is named after Alan Turing, the British cryptographer and mathematician who famously cracked the Nazi Enigma cipher during World War II.

Original post:

Israeli scientist Avi Wigderson wins prestigious A.M. Turing Award - The Times of Israel

Read More..

Avi Wigderson wins Turing Award for his influential work in computational randomness – TechSpot

Kudos: Let's hear it for theoretical computer scientists. Without their work, we would not be carrying around a mini computer in our pockets. Perhaps even more importantly, we couldn't accurately predict the weather or be on the edge of reliable quantum computing. Avi Wigderson rightly deserves recognition in this field for work that has widely influenced so many others.

On Wednesday, the Association for Computing Machinery, a prestigious organization in the field of computer science, awarded mathematician Avi Wigderson the Turing Award. This award, often called the Nobel Prize of Computing, is named after Alan Turing, a pioneer in the field. It is considered one of the highest honors in computer science and comes with a million-dollar cash prize, reflecting the significant impact of the recipient's work.

Wigderson is a theoretical computer scientist specializing in randomness, cryptography, computational complexity, and other related pursuits. He is a professor in the School of Mathematics at the Institute for Advanced Study in Princeton, New Jersey.

Theoretical computer scientists tackle questions like, "Is this problem solvable through computation?" or "If this problem is solvable through computation, how much time and other resources will be required?"

The field also delves into the realm of algorithm optimization. Just because a line of code is the most obvious way to execute a task does not mean it is the most efficient way. So Wigderson and others in the field are indirectly responsible for breakthroughs in everything from cryptography to machine learning.

Wigderson has led theoretical research that has laid the foundations for computational randomness and pseudorandomness in systems for forty years. It was long thought that chaotic systems like the weather or quantum mechanics are impossible to model using deterministic instructions because of those systems' inherent random behavior.

In a series of studies, Wigderson and his colleagues showed that, under widely believed computational assumptions, every probabilistic polynomial time algorithm can be efficiently "derandomized" and made fully deterministic. Derandomization is the process of converting a probabilistic algorithm into a deterministic one, which has significant implications for the efficiency and reliability of computational processes.

"In other words, randomness is not necessary for efficient computation," the ACM notes. "This sequence of works revolutionized our understanding of the role of randomness in computation, and the way we think about randomness."

Wigderson's seminal works include the papers "Hardness vs Randomness," "BPP Has Subexponential Time Simulations Unless EXPTIME has Publishable Proofs," and "P = BPP if E Requires Exponential Circuits: Derandomizing the XOR Lemma." These papers, co-authored by contemporaries like Noam Nisan and Lance Fortnow, became foundational material for other studies in theoretical computer science.

As a professor, Wigderson has mentored students and colleagues alike. He has a passion for his field of study and an enthusiasm for sharing his knowledge with everyone he meets.

"Notably, none of these papers are solely authored or even have much overlap in their author lists," Fortnow wrote in his blog regarding his colleague's achievement. "Avi shared his wisdom with many, nearly 200 distinct co-authors according to DBLP. Congrats to Avi and this capstone to an incredible career and individual."

Image credit: Association for Computing Machinery

See the original post:

Avi Wigderson wins Turing Award for his influential work in computational randomness - TechSpot

Read More..

Avi Wigderson receives 2023 ACM A.M. Turing Award – SDTimes.com

ACM has announced that it is awarding the 2023 ACM A.M. Turing Award to Avi Wigderson for his contributions in the area of theoretical computer science, and notably, for changing our understanding of how randomness works in computation.

"Wigderson is a towering intellectual force in theoretical computer science, an exciting discipline that attracts some of the most promising young researchers to work on the most difficult challenges," said Yannis Ioannidis, president of ACM. "This year's Turing Award recognizes Wigderson's specific work on randomness, as well as the indirect but substantial impact he has had on the entire field of theoretical computer science."

At their core, computers are deterministic systems, meaning their algorithms follow a predictable pattern where output is determined by the input. But the world we live in is full of random events, so computer scientists have enabled algorithms to make random choices too, which makes them more efficient. There are also many use cases where there isn't a possible deterministic algorithm, so these probabilistic algorithms have been used instead.

Many computer scientists have devoted their research to uncovering the relationship between randomness and pseudorandomness in computation, according to ACM.

"Is randomness essential, or can it be removed? And what is the quality of randomness needed for the success of probabilistic algorithms? These, and many other fundamental questions lie at the heart of understanding randomness and pseudorandomness in computation. An improved understanding of the dynamics of randomness in computation can lead us to develop better algorithms as well as deepen our understanding of the nature of computation itself," ACM wrote in the post announcing this year's award winner.

Wigderson's research proved that, under widely believed computational assumptions, every probabilistic polynomial time algorithm can be efficiently derandomized and that randomness isn't essential for efficient computing.

Three of the papers he authored on this topic were then used by other computer scientists and led to several other new ideas.

Besides his work studying randomness in computation, his other areas of interest have included multi-prover interactive proofs, cryptography, and circuit complexity.

ACM also highlighted the fact that Wigderson has mentored many young researchers in the field. He is currently a professor in the School of Mathematics at the Institute for Advanced Study in Princeton, New Jersey.

"Avi Wigderson's work on randomness and other topics has set the agenda in theoretical computer science for the past three decades," said Jeff Dean, senior vice president of Google. "From the earliest days of computer science, researchers have recognized that incorporating randomness was a way to design faster algorithms for a wide range of applications. Efforts to better understand randomness continue to yield important benefits to our field, and Wigderson has opened new horizons in this area. Google also salutes Wigderson's role as a mentor. His colleagues credit him with generating great ideas and research directions, and then motivating a new generation of smart young researchers to work on them. We congratulate Avi Wigderson on receiving the ACM A.M. Turing Award, computing's highest honor."

Originally posted here:

Avi Wigderson receives 2023 ACM A.M. Turing Award - SDTimes.com

Read More..

Join us for the 2024 Computer Science Alumni and Industry Symposium – UM Today

April 11, 2024

The computer science and tech community in Winnipeg is always growing, with many opportunities for collaboration, innovation and creativity. UM Computer Science students often seek opportunities to connect with their community, network with their peers and industry representatives, and map out their future careers. Alumni are key to helping this community flourish and to guiding the next generation as it enters the workforce. To achieve that, the Department of Computer Science is holding the 2024 Computer Science Alumni and Industry Symposium for the second year and invites everyone curious about anything computer science and tech-related to join the event on May 3, 2024, at the RBC Convention Centre.

The symposium will include a series of short talks from computer science faculty and researchers, as well as a job fair to connect students nearing graduation with the broader tech community in Winnipeg. There will be food and drinks while attendees network with each other and build connections.

This is a paid event, and registration is required to attend. You can register now through the Faculty of Science Eventbrite.

Kimia Shadkami

Excerpt from:

Join us for the 2024 Computer Science Alumni and Industry Symposium - UM Today

Read More..

Bitcoin world faces ‘halving’: what’s happening? – New Vision

Bitcoin miners, whose computer processors run the world's most popular virtual currency, will soon face the process of "halving" -- a quadrennial phenomenon which alters the profitability of the industry.

The looming event, due later this month, has helped send Bitcoin racing to a string of record highs this year.

What is Bitcoin?

Bitcoin was created in 2008 by a person or group writing under the pseudonym Satoshi Nakamoto as a peer-to-peer decentralised electronic cash system.

The virtual unit was once the preserve of internet geeks and hobbyists but it has since exploded in popularity, with mining performed by huge banks of computers.

Bitcoins are traded via a decentralised registry system known as a blockchain.

How does mining work?

Bitcoin is created, or "mined", as a reward when computers solve complex puzzles; the first miner to solve the puzzle wins the privilege of validating a block of transactions and thus receives the reward.

The system requires massive computer processing power in order to manage and implement transactions.

That power is provided by miners, who do so in the hope they will receive new bitcoins for validating transaction data on the blockchain.
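To make the puzzle concrete, here is a minimal, illustrative proof-of-work sketch in Python. It is not Bitcoin's actual implementation, which double-SHA-256 hashes an 80-byte block header against a far finer-grained difficulty target, but it shows the brute-force nonce search that miners race to win.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> tuple[int, str]:
    """Find a nonce whose SHA-256 hash starts with `difficulty` zero hex digits.

    Toy stand-in for Bitcoin's proof of work, which double-hashes a block
    header and compares it against a much finer-grained difficulty target.
    """
    target_prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("block with pending transactions", difficulty=5)
print(f"nonce {nonce} -> {digest}")
```

Raising the difficulty by one hex digit multiplies the expected work by sixteen, which is why industrial miners need warehouses of specialized hardware.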

Commercial mining operations often occupy huge hangars or warehouses, and consume large amounts of electricity to power and cool the computers, which is a considerable cost on top of the equipment.

What is halving?

So-called halving is when the reward that cryptocurrency-mining companies and individuals receive in return for their contribution to the system's smooth operation is cut in half.

The first "halving" occurred in November 2012, the second in July 2016 and the third in May 2020. The fourth is due in mid-April.

The reward was originally set at 50 bitcoins and was subsequently reduced to 25, then 12.5 and then 6.25. It is now expected to drop to 3.125 bitcoins.
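The arithmetic behind that schedule is simple: the block subsidy starts at 50 BTC and halves every 210,000 blocks. A minimal Python sketch of the rule (the real client works in integer satoshis and stops issuing new coins after 64 halvings, but the idea is the same):

```python
INITIAL_SUBSIDY_BTC = 50.0      # reward for the first 210,000 blocks
HALVING_INTERVAL = 210_000      # blocks between halvings, roughly four years

def block_subsidy(height: int) -> float:
    """Mining reward in BTC at a given block height."""
    halvings = height // HALVING_INTERVAL
    return INITIAL_SUBSIDY_BTC / (2 ** halvings)

for era in range(5):
    start = era * HALVING_INTERVAL
    print(f"from block {start:>9,}: {block_subsidy(start)} BTC per block")
# 50.0 -> 25.0 -> 12.5 -> 6.25 -> 3.125
```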

Why reduce the reward?

The halving process slows the rate at which new bitcoins are created, and therefore restricts supply.

The reward amount has been trimmed over time in order to implement Nakamoto's overall global limit of 21 million bitcoins.

Bitcoin was designed to go against the norms of traditional currencies, which, in contrast, can lose value over time when central banks increase the money supply to boost economic growth.

Why are prices soaring?

Bitcoin, which enjoys increasing interest from institutional investors, has blazed a record-breaking trail this year on the prospect of halving, peaking at $73,797.98 last month.

Halving tends to send the virtual currency shooting higher on the prospect of reduced supply.

The unit has also been bolstered this year by big moves toward greater trading accessibility. US authorities in January gave the green light to exchange-traded funds (ETFs) pegged to bitcoin's spot price, making it easier for mainstream investors to add the unit to their portfolio.

Here is the original post:

Bitcoin world faces 'halving': what's happening? - New Vision

Read More..

This blockchain platform offers decentralization, security and scalability in tandem – Cointelegraph

Partisia Blockchain is an L1 with novel zero-knowledge oracle and sharding solutions, sorting out problems plaguing the broader blockchain industry.

Diverse blockchains are plentiful nowadays, and solving the biggest challenges in the blockchain space is more critical than ever. Transitioning blockchains from their isolated nature toward a more interoperable framework presents significant hurdles, necessitating solutions that concurrently address privacy, security and scalability.

Existing blockchains are either fully transparent or fully anonymous, and neither solves the trust problem. The standard transaction throughput of layer-1 protocols is limited, with the original blockchain processing only seven transactions per second. Moreover, L1s are not designed to be interoperable by themselves.

Developed by a group of well-known scientists with 35 years of research and 15 years of practical implementation experience, Partisia Blockchain is designed to tackle these issues and become a sustainable ecosystem that enhances building opportunities for developers.

Partisia Blockchain is an interoperable messaging protocol that aims to solve the three biggest challenges in the blockchain space via the unique features it offers. The protocol employs a scalable sharding architecture, customizable multiparty computation (MPC) and tokenomics that allow gas to be paid with tokens other than its native token.

Created by the project's co-founder Ivan Damgård, a co-inventor of the Merkle–Damgård hash construction used in Bitcoin (BTC), customizable MPC offers an auditable privacy layer on top of which private data are computed. By allowing data to be processed without being revealed, it ensures the reliability and authenticity of the computation while preserving privacy. The zero-knowledge (ZK) variant technology enables a trust system that helps tokens, both transparent and anonymous, to thrive.
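Partisia's own MPC protocols are far more sophisticated and are not reproduced here. Purely as an illustration of the basic idea behind multiparty computation, the following Python sketch uses additive secret sharing so that three parties can compute the sum of their inputs without any single party ever seeing another party's value.

```python
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a public prime

def share(value: int, n_parties: int) -> list[int]:
    """Split `value` into n additive shares; any n-1 of them reveal nothing."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares: list[int]) -> int:
    return sum(shares) % PRIME

# Three parties each secret-share a private input...
inputs = [12, 30, 7]
all_shares = [share(x, 3) for x in inputs]

# ...each party locally adds up the one share it holds of every input...
per_party_sums = [sum(column) % PRIME for column in zip(*all_shares)]

# ...and only the combined result is ever reconstructed.
assert reconstruct(per_party_sums) == sum(inputs)  # 49, with no input revealed
```

Because additive sharing is linear, sums can be computed directly on shares; richer computations require the kind of dedicated protocols Partisia builds on top of this idea.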

Its sharding architecture provides an opportunity for multiple blockchains to work independently but simultaneously. The messaging layer facilitates state synchronization by carrying transaction information from one blockchain to another. Whenever there is a need for additional throughput, a new shard may easily be created, thus providing true horizontal scalability.

Various blockchains within the Partisia ecosystem are rendered interoperable owing to its gas mechanism. Liquid tokens of different blockchains can be used to pay for gas on Partisia, enabling wider interaction opportunities.

The integration of liquid tokens allows for the development of a unique staking model. MPC, Partisia Blockchain's token that has recently been listed on multiple exchanges, functions as a governance and collateralization token within the blockchain. Users who are eager to participate in blockchain governance stake MPC and earn staking rewards in return. Stakers may opt to be rewarded both in the blockchain's native currency and in other liquid tokens that are used for gas payment, such as ETH, MATIC, BNB and prominent stablecoins, including USDC and USDT.

Furthermore, MPC is also used to provide protection for the ecosystem. On Partisia Blockchain, users may facilitate various transactions ranging from consensus and ZK computation to token bridging by using MPC as collateral.

Through these value propositions, Partisia Blockchain empowers the creation of unique applications in the Web3 space that are currently not possible on other chains. This is evident in Partisia Blockchain's current ecosystem map.

Partisia Blockchain's ecosystem map. Source: Partisia Blockchain

From unique Web3 applications like privatized DAOs, secret auctions that make DEXs front-run resistant, auditable e-cash, privatized verifiable supply chains, RWAs with privacy features and improved DID, to traditional institutional applications like privatized healthcare data analysis, whistleblower solutions and unbiased RNG, Partisia Blockchain has the potential to allow developers to create solutions that could transform or disrupt existing marketplaces.

Partisia Blockchain announced a 100 million MPC token ecosystem grant program with the goal of fostering innovation. Source: Partisia Blockchain

With a 100 million MPC token ecosystem grant program, developers are further incentivized to push the boundaries of what current blockchain solutions provide. For detailed information on the requirements and application procedure, read more here.

Disclaimer. Cointelegraph does not endorse any content or product on this page. While we aim at providing you with all important information that we could obtain in this sponsored article, readers should do their own research before taking any actions related to the company and carry full responsibility for their decisions, nor can this article be considered as investment advice.

Read more:

This blockchain platform offers decentralization, security and scalability in tandem - Cointelegraph

Read More..

Decentralizing the Electric Grid – Andreessen Horowitz

The electric grid, a vast and complex system of wires and power plants, is essential to our economy and underpins our industrial strength. Currently, we face a critical challenge: our electricity demands, expected to nearly double by 2040 due to factors like AI compute, reshoring, and electrification, are soaring, but our grid infrastructure and operations struggle to keep pace.

To seize a future of energy abundance, we must simplify the generation, transmission, and consumption of electricity; this entails decentralizing the grid. Big power plants and long power lines are burdensome to build, but technologies like solar, batteries, and advanced nuclear reactors present new possibilities. It will be these, and other more local technologies, that can circumvent costly long-haul wiring and be placed directly on-site that will help support significant load growth over the coming decades.

While historical industrial expansion relied on large, centralized power plants, the 21st century marks a shift towards decentralized and intermittent energy sources, transitioning from a hub and spoke model to more of a distributed network. Of course, such evolutions breed new challenges, and we need innovation to bridge the gap.

The United States' electric grid comprises three major interconnections: East, West, and Texas, managed by 17 NERC coordinators, with ISOs (independent system operators) and RTOs (regional transmission organizations) overseeing regional economics and infrastructure. However, actual power generation and delivery are handled by local utilities or cooperatives. This structure functioned in an era of low load growth, but expanding the grid's infrastructure to meet today's demand is becoming increasingly challenging and expensive.

Grid operators use an interconnection queue to manage new asset connections, evaluating if the grid can support the added power at that location without imbalance, and determining the cost of necessary upgrades. Today, more than 2,000 gigawatts (GW) are waiting to connect, with over 700 GW of projects entering queues in 2022 alone. This is a lot: the entire United States electric grid only has 1,200 GW of installed generation capacity.

In reality, however, many projects withdraw after confronting the costs of grid connection. Historically, only 10-20% of queued projects have materialized, often taking over 5 years post-application to finally connect, and those timelines are only lengthening. Developers frequently submit multiple speculative proposals to identify the cheapest interconnection point, then withdraw unfavorable ones after costs are known, complicating feasibility studies. Because of this surge in applications, CAISO, California's grid operator, was forced to stop accepting any new requests in 2022, and plans to do so again in 2024.

This is a critical rate limiter and cost driver in our energy transition. A recent report by the Department of Energy found that to meet high-load growth by 2035, within-region transmission to integrate new assets must increase by 128% and inter-region transmission by 412%. Even far more optimistic projections estimate 64% and 34% growth, respectively.

There are proposed reforms to help alleviate this development backlog. The Federal Energy Regulatory Commission (FERC) is pushing a "first-ready, first-served" policy, adding fees to filter proposals and speed up reviews. The Electric Reliability Council of Texas (ERCOT) utilizes a "connect and manage" method that allows for quicker connections but disconnects projects if they threaten grid reliability; this has been remarkably successful in quickly adding new grid assets. While these policies mark progress, streamlining other regulations, such as NEPA, is also crucial to expedite buildout.

But even if approved, grid construction still faces supply chain hurdles, including lead times of more than 12 months and a 400% price surge for large power transformers, compounded by a shortage of specialized steel. Achieving a federal goal to grow transformer manufacturing hinges on also supporting the electric steel industry, especially with upcoming 2027 efficiency standards. All of this comes at a time when grid outages (largely weather-related) are at a 20-year high, necessitating replacement hardware.

Ultimately, cost overruns in building grid infrastructure manifest themselves in higher prices for consumers. The retail price that consumers pay is a combination of wholesale prices (generation costs) and delivery fees (the cost of the infrastructure needed to move that electricity to you). Critically, while the price to generate electricity has declined with cheap renewables and natural gas, the price to deliver it has increased by a far greater amount.

There are many reasons for this. Utilities use distribution charges to offset losses from customer-generated power, aiming to secure revenue from fixed-return infrastructure investments (similar to cost-plus defense contracting). Renewable energy development requires extending power lines to remote areas, and these lines are used less due to intermittency. Additionally, infrastructure designed for peak demand becomes inefficient and costly as load becomes more volatile with greater electrification and self-generation.

Policy and market adjustments are responding to these rising delivery costs, with California's high adoption of distributed power systems, such as rooftop solar, serving as a notable example.

California's Net Energy Metering (NEM) program initially let homeowners sell surplus solar power back to the grid at retail prices, ignoring the costs to utilities for power distribution. Recent changes now essentially buy back electricity at variable wholesale rates, reducing earnings for solar panel owners during peak generation times, which often coincide with the lowest electricity prices. This adjustment lengthens the payback period for solar installations, pushing homeowners and businesses to invest in storage to sell energy at more profitable times.

California utilities also proposed a billing model where fixed charges depend on income level and usage charges depend on consumption. This aimed to make wealthier customers cover more of the grid infrastructure costs, protecting lower-income individuals from rising retail power prices. And although this specific policy was recently shelved for a similar, but less extreme, version, ideas like this might lead affluent users to disconnect from the grid entirely. Defection could lead to higher costs for remaining users and trigger a "death spiral." Some argue this is already happening in Hawaii's electricity market and in areas rapidly switching to electric heat pumps.

Electricity is not magic; grid operations are complicated. At all times, electricity generated must match electricity demand, or load; this is what people mean when they say "balance the grid." At a high level, grid stability relies on maintaining a constant frequency, 60 Hz in the United States.

Congestion from exceeding power line capacities leads to curtailments (dumping electricity) and local price disparities. Frequency deviations can also damage equipment such as generators and motors. Wind, solar, and batteries, inverter-based resources lacking inertia, also complicate frequency stabilization as they proliferate. In extreme cases, deviations may provoke blackouts or even destroy grid-connected equipment.
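To see why a supply-demand gap and inertia both matter, here is a toy single-area swing-equation simulation in Python. It is my simplification, not from the article: it ignores governor response, damping, and load sensitivity, and is meant only to show how an imbalance pulls frequency away from 60 Hz, and how a lower inertia constant H makes it happen faster.

```python
def simulate_frequency(p_gen_mw, p_load_mw, s_base_mw, inertia_h_s,
                       f0=60.0, dt=0.1, steps=50):
    """Toy single-area swing equation: d(df)/dt = f0 * dP_pu / (2H)."""
    freq = f0
    for _ in range(steps):
        dp_pu = (p_gen_mw - p_load_mw) / s_base_mw   # per-unit power imbalance
        freq += f0 * dp_pu / (2 * inertia_h_s) * dt
    return freq

# A 1% generation shortfall on a 50 GW system, sustained for five seconds:
print(simulate_frequency(49_500, 50_000, 50_000, inertia_h_s=5))    # ~59.7 Hz
print(simulate_frequency(49_500, 50_000, 50_000, inertia_h_s=2.5))  # drops twice as fast
```

Halving the system inertia doubles the rate at which frequency falls, which is the core worry as spinning generators are replaced by inverter-based resources.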

Because of the grid's inherent fragility, careful consideration must be given to the assets connected to it, aligning reliable supply with forecasted demand. The growth of intermittent power sources (unreliable supply) coinciding with the rise of electrification (spiky demand) is causing serious challenges.

Around two-thirds of load is balanced by wholesale markets through (mostly) day-ahead auctions, where prices are determined by the cost of the last unit of power needed. Renewables, with no marginal cost, typically outbid others when active, leading to price volatility: extreme lows when renewables meet demand and spikes when costlier sources are needed (note: bid price is different from levelized cost of energy, or LCOE).
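A toy merit-order clearing sketch (the numbers are illustrative, not real market data) makes the mechanism concrete: generators are stacked by bid price, and the last unit dispatched sets the price every dispatched unit receives. That is why abundant zero-bid renewables can crash prices while a gas peaker on the margin sends them soaring.

```python
def clear_market(bids, load_mw):
    """Dispatch cheapest bids first; the marginal (last dispatched) unit sets the price."""
    clearing_price, dispatched, remaining = 0.0, [], load_mw
    for name, capacity_mw, price in sorted(bids, key=lambda b: b[2]):
        take = min(capacity_mw, remaining)
        if take > 0:
            dispatched.append((name, take))
            clearing_price = price
            remaining -= take
        if remaining == 0:
            break
    return clearing_price, dispatched

# (name, capacity in MW, bid in $/MWh) -- illustrative numbers only
bids = [("solar", 8_000, 0.0), ("wind", 5_000, 0.0),
        ("nuclear", 6_000, 25.0), ("gas_peaker", 10_000, 180.0)]

print(clear_market(bids, load_mw=12_000))  # renewables cover the load: price clears at $0
print(clear_market(bids, load_mw=22_000))  # the gas peaker is marginal: price clears at $180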

The unpredictability of solar and wind, alongside the shutdown of aging fossil fuel plants, strains grid stability. This leads to both blackouts (underproduction) and curtailment (overproduction), like California's 2,400 GWh of wasted electricity in 2022. Addressing this requires investment in energy storage and transmission improvements (discussed below).

Moreover, as power supply becomes more unpredictable, the role of natural gas grows due to its cost-effectiveness and flexibility. Natural gas often backs up renewables with peaker plants that activate only when needed. In general, the intermittency of solar and wind means that natural gas plants, and other types of plants, profit only intermittently, sometimes even running continuously at a loss for technical reasons. Consequently, when peaker plants set wholesale prices while renewables are offline, costs, and thus volatility, rise for consumers.

The demand for electricity is also changing shape. Technologies like heat pumps, while energy-efficient, can cause winter load spikes when renewable output may be low. This requires grid operators to keep a buffer of power assets, and often to ignore renewable sources in their resource adequacy planning. Grid operators typically adhere to a "1 in 10" rule, accepting a power shortfall once every decade, though the actual calculation is more complex. In ERCOT, which relies on price-spike incentives in lieu of a traditional capacity market, we've already seen emergency reserves grow as renewables enter the grid.

High-solar-penetration areas, like California, also face the duck curve, requiring grid operators to quickly ramp up over 20 GW of power as daylight fades and demand rises. This is technologically and economically challenging for plants intended for constant output.

Renewable intermittency incurs hidden costs, forcing grid operators to embrace risk or invest in new assets. While the levelized cost of energy assesses a project's economic feasibility, it oversimplifies the asset's true value to the broader grid. LCOE does, however, underscore the economic challenges of constructing new assets, like nuclear plants. Despite being costlier than natural gas today, nuclear offers a compelling path to decarbonizing reliable power. We just need to scale reactor buildout.

But we can't just rely on nuclear power. Relying solely on a single energy source is risky, as shown by France's nuclear challenges during Russian energy sanctions and the southern United States' issues with natural gas in cold weather, not to mention commodity price swings. Regions with lots of renewables, like California, also face uncertainty due to routine reliance on imports. Even places operating at nearly 100% clean energy, like Iceland or Scandinavia, maintain reliable backups or import options during crises.

As electricity demand grows, the grid struggles to manage growing complexity from both decentralization and intermittent renewables. We cannot brute force this shift; if we're going to do it, we really need to get smart.

The current grid, aging and dumb, depends on power plants to align production based on predicted demand, while making small, real-time adjustments to ensure stability. Originally designed for one-way flow from large power plants, the grid is challenged by the concept of multiple small sources contributing power in various directions, like your rooftop solar charging a neighbor's electric vehicle. Moreover, the lack of real visibility into live power flow presents looming issues, specifically at the distribution level.

Residential solar, batteries, advanced nuclear, and (possibly) geothermal provide decentralized power that reduces the need for infrastructure buildout. Yet integration into an evolving, volatile grid still demands innovative solutions. Additionally, efficient use of even utility-scale power systems can also be significantly improved by local storage and demand-side responses, like turning down your thermostat when the grid is strained, that reduce the need to build underutilized assets that are online for only brief peak periods.

The smart grid aims to accomplish all of this and more, and can be organized into three main technology groups: ahead-of-the-meter, behind-the-meter, and grid software technologies.

Specifically, there are two broad trends that are crucial for the smart grid future.

First, we need to build a lot of energy storage to smooth out peak load locally and stabilize intermittent power supply across the grid. Batteries are already critical for small bursts of power, and, as prices continue to decline, even longer periods of time could be covered. But scaling hundreds of gigawatt-hours of batteries will also require expanded supply chains. Fortunately, strong economics will likely continue to accelerate deployment; entrepreneurs should seek to connect batteries anywhere they can.

Second, we need to accelerate the deployment and integration of a network of distributed energy assets. Anything that can be electrified will be electrified. Allowing these systems to interact with home and grid-scale energy systems will require a variety of new solutions. Aggregations of smart devices, like electric vehicles or thermostats, could even form virtual power plants that mimic the behavior of much larger energy assets.
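To make the storage and aggregation point concrete, here is a toy hourly dispatch sketch in Python (all numbers are illustrative, not from the article): a battery, or an aggregation of flexible devices acting as a virtual power plant, discharges only above a threshold, so the system never has to build generation for its single worst hour.

```python
def shave_peak(load_mw, battery_mwh, battery_mw, threshold_mw):
    """Discharge stored energy whenever load exceeds a threshold.

    Hourly toy model: ignores charging losses, prices, and forecasting.
    Returns the load the grid actually has to serve each hour.
    """
    energy = battery_mwh
    served_by_grid = []
    for hour_load in load_mw:
        discharge = min(max(hour_load - threshold_mw, 0), battery_mw, energy)
        energy -= discharge
        served_by_grid.append(hour_load - discharge)
    return served_by_grid

evening_peak = [620, 650, 720, 810, 790, 700, 640]   # MW, one value per hour
print(max(evening_peak))                              # 810 MW peak without storage
print(max(shave_peak(evening_peak, battery_mwh=200, battery_mw=120, threshold_mw=700)))
# 720 MW peak with a modest battery: 90 MW of capacity that never has to be built
```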

A core challenge in grid expansion is carefully balancing the shift between centralized and decentralized systems, considering economic and reliability concerns. Centralized grids, while straightforward and (generally) reliable, face issues with complex demand fluctuations and high fixed costs; for example, most large nuclear plants globally are government-financed, and China can build a ton of big power assets, but it does so very inefficiently. Decentralized grids, while still in early stages of deployment, are cheap but don't automatically ensure reliable power, as preferences in some rural Indian communities indicate.

To be clear, the centralized grid we have today will certainly not disappear (in fact, it also needs to grow in size), but it will be consumed by networks of decentralized assets growing around it. Ratepayers will increasingly adopt self-generation and storage, challenging traditional electricity monopolies and prompting regulatory and market reform. This self-generation trend will reach its extreme in energy-intensive industries that especially prioritize reliability; Amazon and Microsoft are already pursuing nuclear-powered data centers, and we should do everything we can to accelerate the development and deployment of new reactors.

More broadly, ratepayers want reliable, affordable, and clean power, typically in that order. ERCOT, with its blessed geography, innovation-receptive energy-only markets, and relaxed interconnection policy, will be key to watch in order to see if, when, and how this is achieved with a decentralized grid. And successfully navigating this shift will, no doubt, result in significant economic growth.

Critically, building this decentralized grid demands our most talented entrepreneurs and engineers: we need a smart grid with serious innovation across ahead-of-the-meter, behind-the-meter, and grid software technology. Policy and economic tailwinds will accelerate this electricity evolution, but it will fall to the private sector to ensure this decentralized grid works better than the old one.

The future of the United States electric grid lies in harnessing new technology and embracing free markets to overcome our nation's challenges, paving the way for a more efficient and dynamic energy landscape. This is one of the great undertakings of the 21st century, but we must meet the challenges.

The world is changing fast, and the electric grid must change with it. If you're building the solutions here, get in touch.

Read the original post:

Decentralizing the Electric Grid - Andreessen Horowitz

Read More..

Decentralized dilemma: Could Ethereum survive if SEC ruled ETH a security? – Cointelegraph

On Feb. 26, 2024, a discreet modification appeared on the Ethereum Foundation website.

The footer of the website and the website's warrant canary were deleted, according to a GitHub commit, which stated: "This commit removes a section of the footer as we have received a voluntary enquiry from a state authority that included a requirement for confidentiality."

A warrant canary is usually some form of text or visual element (in the case of the Ethereum Foundation, a yellow bird) that some companies include on their websites to indicate they've never been served with a government subpoena or document request.

If a government agency does request information, the company may remove the text, implicitly suggesting to visitors that it has received a subpoena.

The Ethereum Foundation removed this key section, indicating that the foundation is indeed under a confidential investigation. Due to the confidentiality clause, the Ethereum Foundation cannot offer further details.

Citing anonymous tips, Fortune reported that the United States Securities and Exchange Commission (SEC) was opening a probe into the foundation as part of a campaign to classify Ethereum's underlying asset, Ether (ETH), as a security.

The reported inquiry couldn't have come at a worse time, as the May deadlines for SEC approval of Ether exchange-traded funds (ETFs) approach.

While the debate about whether Ether is a security has been going on for several years, doubts are now beginning to emerge.

Why would the SEC investigate the Ethereum Foundation almost 10 years after its launch? Does the SEC have jurisdiction over an organization based in Switzerland? Will the upcoming spot Ether ETF be delayed due to this action? Most importantly, what would happen to Ethereum and the crypto market if it is classified as a security?

The Ethereum Foundation cannot provide further details due to the confidentiality clause.

While the Ethereum Foundation has received an inquiry from a state authority, this doesn't mean the organization is the subject of investigation.

Carol Goforth, professor at the University of Arkansas School of Law specializing in business associations and securities regulation, explained the relevance of this detail to Cointelegraph:

The SEC may believe that the Ethereum Foundation has information that could help the commission in a different investigation, for example.

Either way, Goforth explained that the foundation would be open to cooperation. She said the desire to see Ether continue to be actively traded in U.S. markets would be a clear incentive to cooperate with the authorities.

An additional reason to collaborate with the SEC would be to help explain why Ether does not meet the Howey investment contract test and, therefore, to help prove that it's not a security.

On the other hand, if the Ethereum Foundation were the subject of investigation, the SEC could take a couple of years to move from an investigation to a lawsuit, Goforth said.

An ongoing investigation would be detrimental for Ethereum until the case is resolved and could affect decisions such as the approval of an Ether ETF and the further adoption of the asset, as exemplified by the cost of the Ripple/SEC lawsuit.

Basel Ismail, CEO of investment analytics company Blockcircle, told Cointelegraph:

In his opinion, if a market-leading blockchain that is relatively decentralized and counts thousands of active developers globally turns out to be considered a security, then almost all crypto projects could fall into that category.

For the trader, it would be safe to assume that any other ERC-20 token that raised capital using a similar funding mechanism will need to comply with the same registration processes and rules.

In his opinion, the contagion effect could eventually destroy many companies in the sector, as their treasury funds arent deep enough to sustain such a shock.

Crypto exchanges that list Ether and operate in the U.S. markets would automatically support an asset legally considered a security. The exchanges would, therefore, be forced to choose between delisting Ethereum from their platform or registering as securities broker-dealers with the SEC.

Goforth explained that any trading platform that matches buyers and sellers has to register as a securities exchange or comply with an exemption, such as becoming an Alternative Trading System (ATS). Goforth said that both options require extensive disclosures and approval from the SEC.

She further highlighted a crucial point that would probably push crypto exchanges to select the path of delisting ETH rather than registering as a securities exchange.

However, many exchanges may simply opt to de-list rather than undergo the significant process of registering with the SEC.

As Goforth noted, once the crypto exchange wishes to register as a securities exchange, it cannot provide exchange services for any security asset unless that security is registered.

Since the SEC has only officially declared Bitcoin as not being a security, the crypto exchange would be at risk if it helped its customers buy or sell any other cryptocurrency, as virtually no crypto assets are registered. In her opinion, this would be tantamount to saying "don't do business in the United States."

While Ether's significant market cap might be able to take the hit if Ether moved off of U.S. exchanges, the same cannot be said for most ERC-20 tokens.

Ismail explained, "Liquidity would be drained, and order books would be considerably more shallow, creating extreme price slippage and hindering the chances for market makers to stabilize the price of the asset for some time."

What use is it worrying about possible effects if the Ethereum Foundation is based in Switzerland and the SEC is a U.S. regulator?

The SEC may technically only have jurisdiction over companies in the United States, but Goforth explained that it could claim extraterritorial jurisdiction if the challenged activity has a material impact in the United States.

A past example within the crypto industry occurred in 2020 when the SEC imposed a worldwide injunction against Telegram's planned issuance of its GRAM token. Telegram eventually had to return $1.2 billion and pay $18.5 million as a penalty despite not being based in U.S. territory.

Goforth confirmed that no specific legal mandates require the Ethereum Foundation to cooperate. However, she highlighted how the SEC values cooperation and can consider that when deciding on which action to take.

If the Ethereum Foundation did not collaborate, the SEC could issue a subpoena, legally forcing the organization to share any data the regulators request.

If centralized exchanges did shut down ETH trading in the U.S. market, decentralized exchanges (DEXs) could become an alternative.

The decentralized nature of such platforms, which makes them difficult for regulators to target, combined with their global reach, may allow Ethereum-based projects to engage in regulatory arbitrage.

Sergey Gorbunov, CEO of Interop Labs and co-founder of the Axelar protocol, told Cointelegraph, "If ETH becomes a security, DEXs may largely maintain operations due to their decentralized global nature."

However, he admitted that some challenges would emerge, such as new compliance requirements in certain jurisdictions.

For example, Gorbunov illustrated how this new regulation update could threaten DEXs that connect with centralized crypto exchanges for liquidity purposes.

But, decentralization does not guarantee safety from regulators. U.S. authorities have demonstrated their ability to close down platforms, such as crypto mixer Tornado Cash, by targeting developers.

Goforth noted that the SEC could target some founder group, promoters or other active participants, while Gorbunov said regulators could target individuals or organizations that support the open network, such as code validators or contributors.

The implications of the SEC classifying Ethereum as a security appear grim. However, Ismail said that while it would be harmful to markets and adoption in the short term, there could be positive effects in the long run.

The Ethereum Foundation and the SEC could find a resolution to the problem in which the foundation would be forced to pay a monstrous fine and register Ether as a security in the U.S. markets.

Ether trading would then need to adhere to the same rules that apply to stocks or bonds. Ismail implied that this outcome would at least offer regulatory clarity for market participants. In his opinion, in the long term this could be beneficial for crypto market valuation.

If the SEC brings the Ethereum Foundation to court, the conclusion would be binary: Ether is either a commodity or a security. Either outcome would create ripple effects on the token's market valuation, and the impact on the ecosystem built on the blockchain would be massive. It could become the most crucial lawsuit in the crypto ecosystem.

Link:

Decentralized dilemma: Could Ethereum survive if SEC ruled ETH a security? - Cointelegraph

Read More..

The Rise of io.net: Decentralizing AI Compute – Grit Daily

In the 19th century, the California Gold Rush completely reshaped the United States as prospectors flocked to the region in search of riches. Today, a new kind of gold rush is underway: the rush for computing power to fuel the AI revolution. At the heart of this whirlwind stands NVIDIA, whose stock price surged 239% in 2023 as demand for its GPUs (graphics processing units) skyrocketed.

On the other hand, centralized cloud giants Amazon, Microsoft, and Google, whose combined $6 trillion market cap has benefited greatly from the high demand for cloud services, are struggling to keep up with the insatiable demand for AI compute (computational power), leading to high prices, long wait times, and limited options for AI startups and innovators.

Enter io.net, a decentralized cloud compute provider which has achieved a $1 billion private market valuation and raised $40 million from various venture capital firms, showcasing significant market confidence in both its vision and the scalability of its network since its launch in November 2023.

Here, we discuss the decentralization and accessibility of the AI compute market by io.net, its technological and economic innovations, and plans for market expansion.

io.net is on a mission to disrupt the AI compute market through decentralization. The centerpiece is the IO Cloud, a permissionless, distributed network of GPUs and CPUs (central processing units) that anyone can tap into on-demand without long-term contracts or KYC.

Developers can quickly deploy ML clusters, while hardware providers earn rewards for contributing compute to the network. The result is a more accessible, flexible, and cost-effective alternative to big-box centralized cloud behemoths.

io.net was born out of necessity. In 2020, founder Ahmad Shadid was building quantitative trading systems and ran into the common roadblock of high GPU costs from existing providers. So, he built his own distributed GPU network to cut expenses.

After the ChatGPT launch, Shadid realized the opportunity to bring this network architecture to the broader market. A few months and a Solana hackathon victory later, io.net officially launched.

The IO Cloud already boasts an impressive 300,000 GPUs and 40,000 CPUs distributed globally. Crypto partners, like Filecoin, Render, and Gaimin, have surfaced their own GPU networks to the IO Cloud to take advantage of io.nets self-service cluster virtualization and AI-ready orchestration layer.

Accessing compute is simple through the intuitive dashboard at cloud.io.net. Here, developers can view stats, manage clusters, and track network performance. Hardware providers can also monitor their machines and earnings.

Under the hood, the io.net stack is powered by the IO-SDK and a specialized fork of Ray, the same open-source distributed computing framework used by OpenAI and Uber. The modular architecture is designed for reliability and flexibility while scaling up parallel processing capabilities to handle AI/ML workloads. APIs enable connectivity while the infrastructure layer handles core functions like GPU provisioning, monitoring, and ML ops. This layered approach prevents single points of failure and allows the swapping of components as needed.
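io.net's IO-SDK and its Ray fork are not public code shown here. Purely as a generic sketch of the kind of parallel workload Ray orchestrates across a cluster (assuming only a standard `pip install ray`, not io.net's endpoints), the snippet below fans a batch job out to whatever workers are available:

```python
import ray

ray.init()  # on a multi-node cluster this would attach to the head node's address

@ray.remote(num_gpus=0)          # request num_gpus=1 per task on GPU workers
def score_batch(batch):
    # stand-in for an ML inference or training step
    return sum(x * x for x in batch)

batches = [list(range(i, i + 1000)) for i in range(0, 10_000, 1000)]
futures = [score_batch.remote(b) for b in batches]   # scheduled across the cluster
print(sum(ray.get(futures)))
```

The same pattern scales from a laptop to thousands of nodes, which is why Ray-style orchestration suits a distributed GPU marketplace.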

"Compute power is the new oil for the AI era," said Ahmad Shadid, CEO and co-founder of io.net. "By leveraging GPU power as a decentralized resource, io.net will continue to play a crucial role in the future of technology and economy, with compute power backing the value of $IO."

However, io.net's biggest advantage may be its crypto-economic model. By building on Solana, the network gains a trustless settlement layer without legacy inefficiencies. Plus, the $IO token, set to launch in April following io.net's Ignition Program, has the ambition to become the currency of compute, attracting users and incentivizing infrastructure buildout in a continuous growth cycle.

The Ignition program aims to incentivize and reward IO Network suppliers and community members for their contributions to the network. Participants can earn rewards by supplying GPUs to the network, with factors such as job hours completed, node bandwidth, GPU model, and uptime taken into consideration by the rewards algorithm.
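The actual Ignition rewards algorithm has not been published; the sketch below is a purely hypothetical weighting of the factors the program names (all weights, normalizations, and field names are invented for illustration and are not io.net's formula):

```python
def reward_score(job_hours: float, bandwidth_mbps: float,
                 gpu_tier: float, uptime_pct: float) -> float:
    """Hypothetical weighted score; the real Ignition algorithm is not public.

    gpu_tier is a normalized 0-1 rating of the GPU model; uptime_pct is 0-100.
    """
    weights = {"job_hours": 0.4, "bandwidth": 0.2, "gpu_tier": 0.2, "uptime": 0.2}
    return (weights["job_hours"] * min(job_hours / 720, 1.0)      # hours in a month
            + weights["bandwidth"] * min(bandwidth_mbps / 1000, 1.0)
            + weights["gpu_tier"] * gpu_tier
            + weights["uptime"] * uptime_pct / 100)

print(reward_score(job_hours=500, bandwidth_mbps=800, gpu_tier=0.9, uptime_pct=99))
```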

"We are thrilled to launch the Ignition program as the first step in our mission to decentralize the IO Network and transition governance to the community," said Garrison Yang, Chief Strategy & Marketing Officer at io.net. "By incentivizing and rewarding our suppliers and community members, we aim to expand access to underutilized GPU resources and make the power of AI accessible to anyone in the world."

With its strong unit economics and token incentives, io.net is poised to grow exponentially, grabbing market share from the cloud giants. The timing could not be better, as the world is waking up to the transformative potential of AI. By making AI/ML compute more accessible, io.net is democratizing access to this critical resource.

While centralized providers struggle with capacity, io.net offers an elegant solution through decentralization: It provides the tools and infrastructure for anyone to participate in the AI revolution. As more developers and suppliers plug in, network effects should drive rapid expansion and adoption.

The California Gold Rush completely reshaped the Western United States in the 1800s. Now, io.net is in pursuit of reshaping computing for the AI age. If successful, it could become an essential backbone for the next generation of AI innovation. The IO Cloud may very well enable the next OpenAIs and Anthropics but in a more open and permissionless way. Keep an eye on this project as the decentralized AI compute wars heat up.

See the original post:

The Rise of io.net: Decentralizing AI Compute - Grit Daily

Read More..

DFINITY Foundation Launches Olympus, a Decentralized Global Acceleration Platform on the Internet Computer – PR Newswire

- Olympus is the first on-chain acceleration platform

- Olympus will transition into a DAO

ZURICH, April 9, 2024 /PRNewswire/ -- The DFINITY Foundation (DFINITY), a Swiss not-for-profit research and development organization and major contributor to the Internet Computer Blockchain (ICP), today announced the launch of the Olympus Acceleration Platform, web3's first decentralized, on-chain global acceleration platform. Olympus supports the development and adoption of web3 technology across multiple ecosystems. The acceleration platform is the first of its kind and will be used by teams around the world to organize and launch their own accelerator programs.

Initial operations of Olympus will be funded by a $15M grant from DFINITY and the ICP Asia Alliance, which aims to cultivate a dynamic Web3 and AI ecosystem in Asia. There will be new allocations of funds in the future with the launch of proposed EU, MENA, Africa and America alliances, cementing ICP's global reach.

By the end of 2024 the platform will transition into a decentralized autonomous organization (DAO). To ensure platform sustainability and independence from grants, future cohorts will be funded by a native token generation event (TGE), followed by fundraising from the community. This will also ensure all stakeholders benefit from the success of the platform as token holders.

Designed for a new cycle of web3 product launches, Olympus provides a consolidated platform for open and sustainable project acceleration by providing access to grants, crowdfunding, VC investments, and referrals, all in one place and on-chain. The platform operates as an ecosystem pipeline, channeling and curating the best projects from 40+ countries around the world. Projects can apply to join a distributed network of startup accelerators, raise funds, access talent, and interact across communities and ecosystems, while investors on the platform can increase their visibility and gain early access to fully transparent, globally vetted deals.

Dominic Williams, Founder and Chief Scientist of the DFINITY Foundation, commented "The Olympus Acceleration platform promotes decentralization, innovation and entrepreneurship, we're all looking forward to witnessing the growth of the next generation of projects on the Internet Computer and other ecosystems through Olympus. By creating a web3-based global platform for everyone, we're able to bring together top talent, projects, investors, and mentors to create a credible and trustless marketplace offering equal opportunity and access to all qualified projects. Traditional accelerator programs are permissioned and operate as silos, many are also not sustainable and rely on grants. Olympus is a new model, that's why I am especially excited to get involved as a mentor and share my experience with the next generation of founders".

Unlike existing accelerator programs which are siloed and require permission to interact, Olympus uses an Open Stake model where projects, mentors, and investors can interact freely, enabling permissionless ecosystem inclusivity and unlimited integrations. Utilizing multi-chain infrastructure, projects can also unlock capital and users at scale through early crowdfunding. Olympus will also enable on-chain verification of key project growth metrics, with further verification provided by the platform's trustless perpetual rating loop enabling community members, investors, mentors, and users to rate projects and provide testimonials. Such multi-chain infrastructure and on-chain verification are uniquely powered by the technologies of Internet Computer Protocol.

The launch of the platform is anchored by a number of partners and supporters, including Web3Labs, a blockchain incubation accelerator and investment firm based in Hong Kong committed to discovering, investing in, and nurturing the best projects and innovative teams in web3. DFINITY and Web3Labs recently announced a strategic partnership to foster and promote blockchain innovation across Asia with Web3Labs joining the ICP Asia Alliance founded last year. The first batch of multiple web3 startup accelerator programs is expected to be operated by ICP Hubs as well as partners like Web3Labs through the platform.

Also joining the platform are venture capital investors who will become Mentors in the accelerator cohorts and gain access to deal flows. These investors have also led investment at VC funds including Fenbushi Capital, Fundamental Labs, Softbank Vision Fund, NewTribe Capital, Cypher Capital, Bitcoin Frontier Fund, Summer Ventures, L2IV, Dext Force Ventures, Leadblock Partners, viaBTC Capital, Cipholio Ventures, Chiron Group, 3X Capital, Plutus.VC, and others.

Founders and developers wishing to participate in Olympus can submit their projects to the platform here.

About DFINITY Foundation: The DFINITY Foundation is a not-for-profit organization of leading cryptographers, computer scientists and experts in distributed computing. The DFINITY Foundation boasts the largest R&D operations in the blockchain industry, with many employees coming from IBM Research and Google. DFINITY Foundation employees have published 1,600+ papers and hold 250+ patents. The Foundation is headquartered in Zurich, with a research center also in San Francisco. With a mission to shift cloud computing into a fully decentralized state, the Foundation leveraged its experience to create the Internet Computer and currently operates as a major contributor to the network.

Media Contact [emailprotected]

SOURCE DFINITY Foundation

Excerpt from:

DFINITY Foundation Launches Olympus, a Decentralized Global Acceleration Platform on the Internet Computer - PR Newswire

Read More..