
Comparing Tether on Ethereum vs Tron Blockchains – BTC Peers

Tether (USDT) has become one of the most widely used stablecoins in crypto. As a staple trading pair and hedge against volatility, USDT provides utility for millions of users across various blockchains. But not all Tether implementations are created equal. In this article, we'll compare USDT on the two most popular networks - Ethereum and Tron.

Tether originated in 2014 (as Realcoin) on Bitcoin's Omni Layer; Ethereum came later, but its adoption, developer community, and decentralized architecture made it an ideal blockchain for Tether in the early days.

For many, Ethereum is still synonymous with USDT. Tether took advantage of the ERC-20 token standard to launch on Ethereum in 2017. Since then, Ethereum has facilitated the majority of USDT transactions and use cases. From trading on centralized and decentralized exchanges to lending protocols and yield generation, Ethereum powers critical Tether functions.

However, high fees and network congestion on Ethereum have pushed Tether to explore alternatives. The average Ethereum gas fee reached nearly $60 in 2021. These volatile fees create significant friction and uncertainty when moving USDT around.
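To see how a volatile gas price turns into dollar friction, here is a minimal back-of-the-envelope sketch in Python. The gas usage of a USDT transfer and the prices used are illustrative assumptions, not figures from this article.

```python
# Illustrative gas-cost arithmetic only; the gas usage of an ERC-20 USDT
# transfer (~65,000 gas) and the prices below are assumptions.

GAS_PER_USDT_TRANSFER = 65_000   # typical ERC-20 transfer, assumed
GAS_PRICE_GWEI = 100             # assumed gas price during congestion
ETH_PRICE_USD = 3_000            # assumed ETH spot price

fee_eth = GAS_PER_USDT_TRANSFER * GAS_PRICE_GWEI * 1e-9   # gwei -> ETH
fee_usd = fee_eth * ETH_PRICE_USD

print(f"Approximate cost of one ERC-20 USDT transfer: {fee_usd:.2f} USD")
# At these assumed values a single transfer costs roughly $19.50; a TRC-20
# transfer on Tron is typically a few cents, or free when bandwidth/energy
# resources are available.
```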

In response to Ethereum's limitations, Tether launched on the Tron blockchain under the USDT-TRON ticker in early 2019. TRON offered much lower and consistent transaction fees compared to Ethereum.

By Q1 2022, over $55 billion worth of USDT had been issued on Tron. This growth demonstrates the demand for a more efficient USDT environment. Tether also benefits from TRON's high throughput for scalability.

Here are some key advantages USDT on Tron provides:

- Much lower and more predictable transaction fees than on Ethereum
- Faster, near-instant transfers, even for small payments
- High network throughput, giving USDT more room to scale

However, Tron is still considered more centralized than Ethereum. The lower decentralization introduces trust issues around Tether reserves and token minting.

While both chains have advantages, Ethereum remains the most trusted and decentralized Tether environment overall. But Tron provides faster and cheaper transactions for large USDT transfers and active trading.

This diversification brings more flexibility in how users can leverage Tether based on their priorities. And interoperability between the two networks grants broader access to USDT liquidity across the crypto ecosystem.

For smaller payments and transfers, Tron offers faster and almost free transactions. But for larger holdings and long-term storage, Ethereum may provide more decentralized security.

Ethereum is the most trusted and time-tested network for Tether. But the high gas fees create usability issues at times. Tron provides a compelling alternative for active trading and transactions under $10,000.

However, for long-term savings or holdings over $10,000, we recommend the extra security of Ethereum. The decentralization makes it much harder for reserves to be manipulated or compromised. And large institutional investors tend to prefer the battle-tested status of ETH.

So consider using Tron for active usage and transfers below $10k. But for significant savings or inactive holdings, the Ethereum blockchain remains the recommended home for your Tether.

While Tron has seen impressive growth, Ethereum 2.0 upgrades will likely ensure Ethereum remains the dominant Tether environment long-term.

It's tempting to assume Tron will eclipse Ethereum as the top Tether platform. After all, its growth has been astronomical since launching USDT in 2019. Tron already accounts for over one-third of all Tether activity. And the cost savings over Ethereum are undeniable.

However, Ethereum has major upgrades coming that should shore up its advantages over Tron:

- The move to proof-of-stake consensus under the Ethereum 2.0 roadmap
- Sharding, aimed at raising transaction throughput
- Continued growth of layer-2 rollups, which push fees down

These changes should allow Ethereum to match Tron's speed, fees, and scalability, while exceeding its decentralization and security. Ethereum will likely retain most DeFi protocols, developers, and institutional adoption.

Of course, Tron will remain an appealing alternative. But it's unlikely to seriously challenge Ethereum's dominance in the long run. Expect the majority of new innovation and infrastructure to continue materializing on Ethereum.

In summary, Tron makes USDT more accessible for smaller use cases today. But upcoming Ethereum upgrades should secure its status as the pre-eminent Tether environment for the foreseeable future.

See more here:

Comparing Tether on Ethereum vs Tron Blockchains - BTC Peers


Governance system for local governments – Part 1 – Profit by Pakistan Today

Why local government? The World Bank in its 1997 report asserts that governments are more effective when they listen to businesses and citizens, and work in partnership with them in deciding and implementing policy. Where governments lack mechanisms to listen, they are not responsive to people's interests.

The devolution of authority to local tiers of government and decentralization can bring in representation of local business and citizens' interests. The visibility of the results achieved by the resources deployed in a specific geographic area maintains pressure on government functionaries. Public-private partnerships, including NGO-public partnerships, have proved to be effective tools in fostering good governance.

The World Development Report (WDR 2004) has argued that the accountability of governments to local communities and marginalized social groups will increase by assigning service delivery functions to politicians who are closer to the people and make them electorally accountable.

The 1973 constitution specified only two tiers of government: federal and provincial. It was only after the 18th Amendment in 2010 that a new clause, Article 140A, was introduced, which states that "Each province shall, by law, establish a local government system and devolve political, administrative and financial responsibility and authority to the elected representatives of the local governments."

The Supreme Court has asserted and directed the holding of elections of local governments on several occasions. Unfortunately, unlike the detailed distribution of powers between the federal and the provincial governments clearly defined in the constitution, there is no such provision for local governments. This vagueness and ambiguity has been exploited by the provincial governments, which have been struggling since 2010 to settle on a reasonable piece of legislation on the functions and powers of this tier.

Logically, once the provincial governments were devolved adequate powers, accompanied by sufficient financial allocations out of the divisible tax pool and grants from the federal government, there should have been similar decentralization and delegation to the local governments. How is it possible for Punjab, with a population of 110 million people and 36 districts covering an area of 205,000 sq km, to respond to the disparate needs of citizens in the delivery of essential services? DG Khan and Faisalabad, for example, have very different requirements, and a uniform one-shoe-fits-all approach, characteristic of an overcentralized system, simply won't work.

The present culture of concentrating authority in power centres at Islamabad, Lahore, Karachi, Peshawar and Quetta has not only alienated the population living in the peripheries, but also reduced its productive potential, and to no small extent. It is little surprise then that our research found 80 districts whose ordinary citizens are living in miserable conditions, according to the Deprivation Index, and remain almost criminally starved of their most basic needs.

The political parties that introduced this article in the constitution do not realize that meaningful empowerment of communities through decentralization and delegation of authority, in which the local government system plays a crucial role, would in the long run promote greater trust, cohesion and harmony in our society and ensure access to basic public services in an efficient and equitable manner. These outcomes will not only help mobilize additional resources at the local level but also improve the efficiency of resource utilization.

The present state of disaffection and discontentment with the government would also be mitigated if public goods and services of everyday use to the citizens became available to them at the grassroots level. Local communities know their problems and their solutions much better than anybody else. It has also been found that, compared with direct delivery by the government, the transfer of responsibility for these services to lower tiers of governance improves access for the poor.

Local government management of schools and hospitals, involving communities and demand-side subsidies to the poor, monitored under the oversight of government, results in favourable outcomes in education and health. As these political parties will also contest elections, they will be represented at that tier of government too. Thus, the credit for citizen satisfaction, efficient allocation of resources and better access to essential services would go to the political parties themselves.

However, the myopic and self-centred approach adopted by all major political parties, and their resistance to empowering and strengthening local governments, is hard to comprehend: in actual fact it simply entails the transfer of power from provincial and national legislators and ministers to the locally elected nazims or mayors of the districts. Those seeking to preserve their status, clout and influence should opt for local nazim positions rather than becoming MPAs or MNAs.

The 2001-2009 period: It would be useful to make an objective assessment of the local government system that existed in Pakistan between 2001 and 2009. There were many flaws in the 2001 system, including the fact that the functions of law and order, revenue records, land administration and disaster management should have remained with neutral civil servants and not been transferred to the nazims. In that event, the offices of the deputy/assistant commissioner should not have been abolished, thereby diluting the writ of the state.

The executive authority of the newly created post of district coordination officer (DCO) was diluted as magisterial powers were taken away from him or her, although s/he was expected to perform duties relating to maintenance of law and order, removal of encroachments, price controls, and the like. As the powers of recruitment, transfers, postings, and disciplinary actions continued to remain vested in the provincial departments, the diarchy proved to be fatal for the effective functioning of the devolved departments.

The gap between law and actual practice remained wide to the detriment of the public at large. Corruption at the district government level could not be contained given the inadequate supervisory arrangements evolved by the provincial governments. The provincial secretaries retained considerable administrative authority over the district bureaucrats and used these powers to undercut the efficacy of the elected nazims. A tripartite confrontational mode in which the provincial ministers and secretaries aligned themselves against the district nazims was responsible for most of the practical difficulties faced by the citizens in access to services.

The degree of fiscal decentralization remained limited because the districts continued to depend upon the province for resources, didn't have the powers to collect new taxes, and didn't have the capacity to levy service or user charges. On the expenditure side, the fixed and growing expense of salaries, wages, and allowances paid to the staff devolved to the district governments (although they continued to be provincial servants) did not leave much surplus for maintenance, operational, or development expenditure.

Over 90 per cent of expenditure of local governments was financed by transfers from the provincial governments. Lack of enhancement in local fiscal powers was a major weakness in the process of fiscal decentralization.

The share of local governments in the provincial allocable pool was about 40 per cent but their share in total public expenditure was only 13 per cent. Resource mobilization at the provincial and local levels remained substantially under-exploited. Land revenue accounted for less than one per cent of the agricultural income while the effective rate of property taxation of rental incomes was about five per cent as opposed to the statutory rate of 20 per cent or more.

The fragmentation of development projects into small schemes catered to the narrow interests of the local communities without any sense of priority, linkages, or widespread coverage.

Ideally, the transfer of resources from urban to rural areas should be a welcome move, but such a transfer, made in the absence of a district-wide plan, without specifying the goals to be achieved or assessing the cost-benefit of the approved schemes, can be counterproductive. Urban-rural integration did not recognize or cater to the needs of growing urbanization.

Hasnain concludes on the basis of his study that, in order to keep his voters happy, the district nazim would have very little choice but to acquiesce to the pressures exerted by the union and tehsil nazims to allocate resources equally. The difference between equal and equitable distribution of resources should be understood, as it is at the crux of the problem.

Under an equal distribution scheme there is no clear relationship between the needs of the community and the intended interventions. Rich and poor communities will receive the same amount irrespective of the intensity of their need. Equitable distribution takes into account the differences in the initial endowments and conditions of the intended beneficiaries. Those who are poor, marginalized, live in remote or geographically disadvantaged areas and cannot earn decent incomes on their own should receive higher allocations than those who are better off. Public resources thus supplement the private incomes of the poor to help lift them out of poverty.
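To make the contrast concrete, here is a small, purely illustrative Python sketch; the district names, populations, deprivation scores and budget are invented for the example and do not come from the article.

```python
# Hypothetical example contrasting equal and equitable distribution.
# Districts, populations and deprivation scores are invented for illustration.
districts = {
    "District A": {"population": 1_000_000, "deprivation": 0.2},  # relatively well off
    "District B": {"population": 1_000_000, "deprivation": 0.8},  # highly deprived
}
budget = 100_000_000  # total development funds to distribute (assumed)

# Equal distribution: every district gets the same share regardless of need.
equal_share = {name: budget / len(districts) for name in districts}

# Equitable distribution: shares weighted by population x deprivation score,
# so poorer, more deprived districts receive proportionally more.
weights = {n: d["population"] * d["deprivation"] for n, d in districts.items()}
total_weight = sum(weights.values())
equitable_share = {n: budget * w / total_weight for n, w in weights.items()}

for name in districts:
    print(f"{name}: equal = {equal_share[name]:,.0f}, "
          f"equitable = {equitable_share[name]:,.0f}")
```

Under the equal scheme both districts receive the same amount; under the equitable scheme the more deprived district receives the larger share.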

Two innovative features of the 2001 system are worth mentioning. The reservation of one-third of seats for women, and others for peasants, workers, minorities and the marginalized classes of our society, was an extremely commendable step. Similarly, the integration of the rural and urban administrative units at the tehsil level would have allowed the rural areas to benefit equally from the larger envelope of pooled resources available to the Tehsil Council. Even if the underlying patron-client relationship persists, the scope for inclusion of clients who were traditionally denied access under an MNA/MPA-centred system will be much wider under a decentralized and devolved system.

However, despite these flaws, empirical studies and surveys point to the net positive achievements of the local government system. The Social Audit Survey 2009-10 of 12,000 households drawn from 21 districts in all four provinces found that 56 per cent favoured the continuation of the local government system, with high proportions in Punjab and Sindh. The level of satisfaction with the union councils was 33.8 per cent, but the situation regarding support and social acceptability of women's participation seemed to have improved. Sixty per cent of female union councilors were of the view that people in their constituencies were happy with them.

The satisfaction levels of households with various public services varied, but by 2009-10 satisfaction with roads, sewerage and sanitation, garbage disposal, water supply, health and education had improved, although fewer than half of households expressed satisfaction with most services. Public education, at 58 per cent, showed the highest level of satisfaction.

The Social Policy Development Centre (SPDC) carried out a survey of 12 districts in the four provinces and found that the rate of enhancement in literacy of the population and access to water supply and sanitation had perceptibly increased during the post-devolution period. However, there were no indications of any impact of devolution on health indicators. The process of devolution was beginning to contribute to a quicker improvement in enrolment at the primary level and literacy in Pakistan.

At a micro level, Cheema and Mohmand analyzed a dataset of 364 households in the rural tehsil of Jaranwala in Faisalabad District to gain some insights regarding the types of households which gain and lose through electoral decentralization and whether the change in the post-reform provision between different household types is equitable. The empirical results of their study showed that increased access to development funds and heightened mandates for union nazims have resulted in a significant increase in union level provisions within a short span of time. They further found that the increase in the post-reform provision in nazim villages is less elite-based as it encompasses small peasants, minority peasant biradaris, and non-agricultural castes.

Hasnain reports on the basis of a survey carried out in 2005 that over 60 per cent of the households stated that they would approach a union councilor or nazim to resolve their problems, in comparison to only 10 per cent who said they would approach members of the provincial or national assembly. This reflects the increase in accessibility of policymakers after devolution. A system in which bureaucrats control the development departments provides neither access nor accountability. Having a system of elected nazims and councilors who remain responsive to the needs of their citizens is better because these officials are liable to lose their offices if they do not fulfil their responsibilities and duties. The best one can do with a recalcitrant bureaucrat is to transfer him out of a particular district, but that does not resolve the inherent problem of access for the poor.

Cheema, Khawaja and Qadir in their study found that three types of changes were brought about by the 2001 devolution. One, changes in the decision-making level of the service, from provincial bureaucrats to the district-level bureaucracy. Two, changes in the decision-makers' accountability, from bureaucrats to elected representatives at the district level. And three, changes in the fiscal resources available to the service.

The education department, primary healthcare and the management of district and tehsil hospitals experienced a change of the first type, where the decisions previously made by the provincial secretariat and the provincial cabinet were transferred to the district nazim and executive district officers.

The municipal services provided by the local government, the rural development department, and the public health engineering departments of the provincial government became the sole functional responsibility of the tehsil municipal administration. This was a fundamental change because the power to allocate resources, prioritize projects, and deliver results moved away from 48 provincial departments to 6,000 units of local government, whereas prior to devolution, the deconcentrated provincial bureaucracy at the district level was accountable to the non-elected provincial secretariat. The 2001 devolution made them accountable to the elected heads of district and tehsil governments. Under the previous system, the de facto head of the district administration was the deputy commissioner, who would report to the non-elected commissioner, while after devolution he reported to the elected district nazim (mayor).

Their study also found that a rule-based fiscal transfer system between the provinces and the local governments was established under the 2001 Devolution Plan. Approximately 40 per cent of the provincial consolidated fund was distributed among local governments, with due weightage given to backwardness in order to ensure some form of equity across districts in the allocation of development funds. The other innovation was that these budgetary transfers did not lapse at the end of the year but continued to be retained by the relevant local governments, providing for flexibility and presumably some improvement in the efficiency of resource allocation.

Proposed governance system for local governments: In light of the experience of the 2001 LGO, let us now examine what needs to be done to avoid the weaknesses of the previous system and implement the spirit of Article 140A using political, administrative and financial dimensions of devolution.

The remainder of this piece will be published in next week's issue.

Continued here:

Governance system for local governments - Part 1 - Profit by Pakistan Today


Bitcoin Spark’s ICO promises rewarding returns for Early Investors – Captain Altcoin


Bitcoin Spark (BTCS), with its Initial Coin Offering (ICO) in full throttle, offers early investors the chance to be part of a promising venture. Bitcoin Spark's ICO is in the last stages of phase three, with affordable pricing and an enticing bonus, before rolling out phase four at a higher price but a reduced bonus. BTCS promises enticing returns for those who recognize its potential early on, and phase three holders are projected to realize a 560% ROI.

BTCS, a ticker for Bitcoin Spark, is a decentralized cryptocurrency project that seeks to revolutionize crypto mining and rewards distribution. This Bitcoin fork, built on Ethereum, is driven by an innovative concept, the Proof-Of-Process (PoP) consensus mechanism, aimed at enhancing security and scalability while maintaining decentralization. The project emphasizes decentralization by encouraging more validators to participate, bolstering network security. By combining unique approaches like the non-linear rewards system, BTCS plans to create a self-sustaining mining ecosystem that offers consistent profitability to miners.

Bitcoin Spark uses a smart combination of proof-of-work and proof-of-stake mechanisms, and a special algorithm, to ensure stability and fairness in distributing rewards. Miners need to solve hexadecimal hashes to earn rewards, but solving hashes won't be the primary way to earn; users also need to stake on the network, similar to how proof-of-stake blockchains work. However, the relationship between stake and earnings isn't linear. Unlike traditional systems that solely favor high computing power or large stakes, Bitcoin Spark's approach prevents rewards from being skewed purely by monetary worth.

The network also requires users to contribute processing power to Bitcoin Spark's clients for remote computing tasks that demand high CPU or GPU usage. An exciting feature is that users can contribute processing power through an app in a secure virtual environment, setting Bitcoin Spark apart. BTCS provides miners with a user-friendly Bitcoin Spark application, enabling them to contribute their device processing power to the network and earn rewards.

Rewards are calculated based on a combination of the individual stake and the work performed as remote computing. The more work and stake provided, the higher the rewards, but this relationship isn't linear. Greater emphasis is placed on work done, to prioritize the revenue-generating product.

Bitcoin Spark caps contributions at around 5 teraflops of processing power per user before rewards begin to taper. This estimation may change over time based on total network output, client demand, remaining mining rewards, and total BTCS staked. The system will adapt with technological advancements to stay relevant.
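As a rough illustration of how such a scheme might behave (the weights, exponents and cap handling below are assumptions made for the sketch, not Bitcoin Spark's actual formula), a non-linear, work-weighted reward function could look like this:

```python
# Illustrative sketch of a non-linear, work-weighted reward function.
# The exponents, weights and cap behaviour are assumptions for illustration;
# they are not Bitcoin Spark's published formula.

WORK_CAP_TFLOPS = 5.0   # per the article, rewards taper beyond ~5 teraflops

def reward_score(stake: float, work_tflops: float) -> float:
    """Combine stake and contributed work sub-linearly, weighting work more."""
    capped_work = min(work_tflops, WORK_CAP_TFLOPS)
    overflow = max(work_tflops - WORK_CAP_TFLOPS, 0.0)
    # Sub-linear (square-root) terms keep very large stakes or compute farms
    # from dominating; work is weighted more heavily than stake, and work
    # beyond the cap earns at a reduced rate.
    return 0.7 * capped_work ** 0.5 + 0.1 * overflow ** 0.5 + 0.3 * stake ** 0.5

print(reward_score(stake=100.0, work_tflops=4.0))     # small participant
print(reward_score(stake=10_000.0, work_tflops=4.0))  # 100x the stake, far less than 100x the score
```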

Through its ICO and advanced technologies, Bitcoin Spark envisions a balanced and profitable environment for miners while contributing to the broader cryptocurrency landscape. BTCS rolled out phase one of its ICO at $1.50 and a 20% bonus, with phase one holders expecting an 800% ROI.

Phase two featured one BTCS at $1.75 and a 15% bonus, for an ROI of 657%. Imagine acquiring Bitcoin when its value was $1. The current phase three, which is running out fast, has one BTCS at $2.00 and a 12% bonus. Phase three holders expect 560% in ROI gains.
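For readers who want to see how phase price, bonus and launch price combine into an ROI figure, here is the generic arithmetic in Python. The launch price is not stated in this article, so the $10 value is purely an assumption, and the resulting percentages will not exactly match the figures quoted above.

```python
# Generic ROI arithmetic for the ICO phases described above. The launch price
# is NOT stated in the article; $10 is an assumed value used only to show the
# calculation.

ASSUMED_LAUNCH_PRICE = 10.00  # USD per BTCS, hypothetical

phases = [
    ("Phase 1", 1.50, 0.20),  # (name, price per token, bonus)
    ("Phase 2", 1.75, 0.15),
    ("Phase 3", 2.00, 0.12),
]

for name, price, bonus in phases:
    tokens_per_dollar = (1 + bonus) / price           # bonus tokens included
    value_at_launch = tokens_per_dollar * ASSUMED_LAUNCH_PRICE
    roi = (value_at_launch - 1) * 100                 # percent gain per dollar invested
    print(f"{name}: {roi:.0f}% ROI at an assumed ${ASSUMED_LAUNCH_PRICE:.2f} launch price")
```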

As evidenced above, Bitcoin Spark implements high rewards in the early ICO stages, rewarding early investors in a big way. Early adopters enjoy an affordable BTCS price, which increases after each phase, and bonuses which diminish as phases progress. After the project launch in November, the early adopters will enjoy higher margins and will have earned impressive amounts in bonuses.

Bitcoin Spark's profitability is driven by a thoughtful blend of mechanisms that value both stake and active contribution, fostering a fair and lucrative environment for its participants. The reward structure is designed to prioritize revenue generation. The more work a user performs and the higher their stake, the greater their rewards, but this relationship follows a non-linear pattern.

This approach encourages participation by both large and small stakeholders, enhancing network security and decentralization. Apart from processing power rental, the BTCS ecosystem also introduces a new advertising concept where users can vote out adverts if inappropriate.

Read more on BTCS here:

Website|Buy BTCS

Disclaimer: We advise readers to do their own research before interacting with any featured companies. The information provided is not financial or legal advice. Neither CaptainAltcoin nor any third party recommends buying or selling any financial products. Investing in cryptoassets is high-risk; consider the potential for loss. CaptainAltcoin is not liable for any damages or losses from using or relying on this content.

CaptainAltcoin's writers and guest post authors may or may not have a vested interest in any of the mentioned projects and businesses. None of the content on CaptainAltcoin is investment advice nor is it a replacement for advice from a certified financial planner. The views expressed in this article are those of the author and do not necessarily reflect the official policy or position of CaptainAltcoin.com

See original here:

Bitcoin Spark's ICO promises rewarding returns for Early Investors - Captain Altcoin


Is DeFi Poised to Eclipse CeFi? Binance CEO Seems To Think So – Crypto Mode

In a recent virtual conversation through X Spaces, Binance CEO Changpeng "CZ" Zhao deliberated on the growing influence of decentralized finance (DeFi). He proposed a perspective: DeFi might soon outpace centralized finance (CeFi). But what exactly was his basis, and what are the latest trends supporting this theory?

Zhao fervently believes in the power of decentralization. "The greater the industry's decentralization, the more prosperous its future," he emphasized. DeFi currently contributes 5% to 10% of CeFi trading volumes. However, this isn't a meager contribution. Zhao says the upcoming bullish phase could position DeFi above CeFi.

Following the U.S. Securities and Exchange Commission's (SEC) legal motions against centralized exchanges like Coinbase and Binance, there's been a seismic shift. The average trading volume of the top three decentralized exchanges (DEXs) rocketed 444% within two days. As of the last report, DEXs clocked a 24-hour trading volume of $722,776,226.

Touching on the recent Uniswap lawsuit dismissal, Zhao expressed his relief and approval. "The resolution for Uniswap was unequivocally positive, logical, and transparent. It's a significant win," he articulated.

On August 30th, a U.S. federal court dismissed a collective suit against Uniswap and its leadership. The litigation was led by plaintiffs asserting losses from fraudulent tokens on this decentralized crypto exchange. The dismissal hinged on the inability of both parties to pinpoint the fraudsters and stressed that regulatory vagueness weakens investor protection.

A participant in the X Spaces session highlighted the court's stance: developers aren't accountable for DeFi platform misuse. Zhao concurred, recognizing this as a favorable move for DeFi architects. "Code composition by developers is tantamount to free speech. This progress is truly commendable," he noted.

A noteworthy trend is the reallocation of venture capital from CeFi ventures to DeFi initiatives. CoinGecko's report from March 1st provided some telling statistics. Digital asset investment agencies poured $2.7 billion into DeFi endeavors in 2022, marking a whopping 190% rise from the previous year. Conversely, CeFi project investments plummeted 73% to $4.3 billion.

The scales might be tipping in DeFi's favor. As regulators, investors, and industry leaders navigate these evolving landscapes, the discourse on DeFi and its potential dominance over CeFi is far from over. The upcoming chapters in this financial narrative will undoubtedly be ones to watch.

None of the information on this website is investment or financial advice. CryptoMode is not responsible for any financial losses sustained by acting on information provided on this website.

Continue reading here:

Is DeFi Poised to Eclipse CeFi? Binance CEO Seems To Think So - Crypto Mode


Quantum Machines next-gen quantum control solution that can scale beyond 1,000 qubits – TechCrunch


Tel Aviv-based Quantum Machines today announced the OPX1000, the latest iteration of its quantum controller. Built for large-scale quantum computers, the OPX1000 can control 1,000 qubits and more, well beyond what its predecessors, the OPX and OPX+ controllers, could handle.

Quantum Machines' controllers have allowed many of the leading quantum computer manufacturers to deliver on their existing roadmaps. But now, those companies are looking to build machines with 100,000 qubits or more within the next decade and, beyond getting the noise under control, figuring out how to control them is yet another major challenge.

Quantum Machines co-founder and CEO Itamar Sivan told me that he believes the original OPX in 2019 and then the OPX+ drove a major paradigm shift in how people looked at quantum control and orchestration. "It was a paradigm shift in the sense that people used to operate quantum processors with what you could call a memory-based control system," he explained. He likened those memory-based systems to memorizing massive multiplication and division tables. That allows you to program the quantum processors, but it's not a smart system. It can't quickly react to changes in the processing unit and it can't handle increasingly complex algorithms.

Quantum Machines' original breakthrough was that it moved from memory-based to processor-based machines. The OPX1000 pushes this a step further by not just featuring an updated version of the company's Pulse Processing Unit (PPU), but also an increased number of control channels (64 output and 16 input), which Quantum Machines says allows it to offer the highest in-class density of control and readout channels on the market right now. In part, it's the improved networking stack that allows Quantum Machines to now scale its solution to more than 1,000 qubits.
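To make the memory-based versus processor-based distinction concrete, here is a deliberately simplified Python sketch. It is a conceptual illustration only, not the QUA language or any actual Quantum Machines API; the playback and readout functions are hypothetical stand-ins.

```python
import random

# Hypothetical hardware stand-ins; in a real controller these would be
# waveform playback and qubit readout primitives.
def play(waveform: str) -> None:
    print(f"playing {waveform}")

def measure_qubit() -> int:
    return random.randint(0, 1)  # placeholder readout result

# Memory-based control: the entire pulse sequence is precompiled and played
# back verbatim; the controller cannot branch on measurement results.
def memory_based_sequence():
    for waveform in ["pi_pulse", "readout", "pi_pulse", "readout"]:
        play(waveform)

# Processor-based control: the controller evaluates readout results in real
# time and branches, e.g. applying a corrective pulse only when needed
# (the essence of active reset / feedback).
def processor_based_sequence(shots: int = 4):
    for _ in range(shots):
        play("readout")
        if measure_qubit() == 1:      # real-time decision on the controller
            play("reset_pulse")       # conditional correction
        play("pi_pulse")

memory_based_sequence()
processor_based_sequence()
```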

All of this fits into a data center-ready 3U package, and it's worth stressing that this is very much a solution for some of the most sophisticated players in the quantum space. That also means that not everyone currently needs the OPX1000, and Quantum Machines will continue to offer the OPX+, too.

Currently, hundreds of research labs, HPC centers and quantum computer manufacturers are using the existing OPX controllers. For the OPX1000, Quantum Machines recruited a small number of beta users who are already testing the new systems. The company wasn't quite ready to go on the record with their names, though. The OPX1000 will be generally available later this year, and I expect we'll hear more from Quantum Machines' customers then.

Looking ahead, Sivan explained that figuring out how to scale beyond 10,000 qubits is still an open question.

"If you have a wonderful Quantum Processing Unit with 20 qubits, let's say, and you're not using the most advanced control system in the industry, you're really not even getting a fraction of its capabilities," he told me. "If you look beyond 10,000, there are still question marks. Even for players like Quantum Machines that have sold what's needed for the 1,000 [qubits], I can tell you for sure that there are still many open questions and a lot of research left to do with what's happening beyond that."

In part, he noted, that's why Quantum Machines recently partnered with Nvidia to combine classical and quantum machines.

Continued here:
Quantum Machines next-gen quantum control solution that can scale beyond 1,000 qubits - TechCrunch


Ethics and quantum computing – Scientific Computing World

New technology and ethics are inseparably linked in today's rapidly evolving technological landscape. Quantum computing is no exception: as we stand on the precipice of a new era of computing, the ethical considerations that arise are complex and far-reaching. As a company that recognises the importance of these ethical considerations and is committed to responsible innovation, we believe that these concerns must be understood and addressed.

Ethical quantum concerns typically fall into several major categories:

1. Resource Allocation and Inequality: Quantum computing is a resource-intensive technology, both in terms of the physical resources required to build quantum computers and the human resources needed to program and operate them. Such resources are available only to a few nations. Given this, and given the rise in quantum nationalism - the development of country-specific quantum programs - will the benefits of quantum computing primarily accrue to the wealthy, developed nations that can afford to invest in it? This could further deepen global socio-economic divides. Within the legal frameworks of the countries QuEra operates in, we seek to provide equitable access to potential users, whether via the cloud or by owning a quantum computer.

2. Misuse of power: a sufficiently powerful quantum computer could one day break many current encryption schemes, leading to unparalleled breaches of privacy and security. That's why many experts warn against bad actors that implement "Store Now, Decrypt Later" attacks, capturing encrypted information today while hoping to decrypt it in a few years. This is especially relevant for information with a long shelf life, such as medical records or certain financial transactions.

3. Accountability and Transparency: The complexity of quantum algorithms could lead to a lack of transparency and accountability. If a quantum algorithm, for instance, makes a mistake or causes harm, it may be difficult to understand why or how it happened. Ensuring such explainability is a key requirement of many algorithms such as those deciding the outcome of a loan application. At QuEra, we seek to understand the reasons certain algorithms work and share this knowledge with our customers.

4. Job Displacement: The increased processing power and efficiency of quantum computers could automate many jobs currently performed by humans, leading to potential job displacement. We do our best to support education and re-training programs both to address the potential of job displacement as well as to train the next generation of scientists and technicians that will help build, program and maintain these advanced machines.

Some of these categories, such as job displacement, are not specific to quantum computing and present themselves when discussing other technologies such as AI or robotics. Others, such as breaking the encryption system, are specific to quantum, whereas AI presents its own unique challenges, such as bias and discrimination or the ability to generate artificial consciousness.

Striving to address these concerns, several organisations have started constructing ethical frameworks for quantum computing. The World Economic Forum has developed a set of Quantum Computing Governance Principles that aim to guide the responsible development and use of quantum computing, covering inclusiveness and equity, security and safety, environmental sustainability, and transparency and accountability. The National Academies of Sciences, Engineering, and Medicine has published a report on The Ethics of Quantum Computing that identifies a number of ethical issues, including the potential malicious use of quantum computing, the potential to disrupt existing industries, the negative environmental potential, and the need to ensure that quantum computing is developed and used in a way that is fair and equitable. Last, Deloitte has developed a Trustworthy & Ethical Tech Framework that can be used to guide the development and use of quantum computing.

Beyond ethical frameworks, one could imagine some solutions. Job displacement, for instance, is often associated with the introduction of transformative technologies. Factory workers who manually assembled cars might find themselves displaced by robots, but these robots need to be built and serviced by people. If quantum computers make certain jobs obsolete, they open other opportunities.

Other solutions might require multinational collaboration. For example, the World Health Organization serves an important function that ultimately helps both developed as well as developing nations. Promoting standards, monitoring global trends, and coordinating emergency responses have helped address inequality in healthcare, benefiting all. Similarly, a World Quantum Organization might provide shared quantum resources to benefit all, not just those that could develop an autonomous quantum ecosystem.

Concurrent with developing solutions and ethical frameworks, there is a need to educate and inform the public, policymakers, and stakeholders about the potential implications of quantum computing to foster informed discussions about its ethical, social, and economic impacts.

Quantum computing's potential to revolutionise industries is matched by the complexity of the ethical considerations it raises. At QuEra, we recognise these challenges and are committed to responsible innovation that prioritises inclusiveness, security, and sustainability. Collaborative efforts, such as the proposed 'World Quantum Organization,' resonate with our belief in shared quantum resources and global partnerships, and we invite interested parties to engage with us. As we navigate this exciting frontier, we must do so with both eyes open to the potential downsides, ready to tackle them head-on, and always guided by ethical principles.

Yuval Boger is the Chief Marketing Officer at QuEra Computing.

Excerpt from:
Ethics and quantum computing - Scientific Computing World


Quantum Computing: Technologies and Global Markets to 2028 – GlobeNewswire

New York, Sept. 04, 2023 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Quantum Computing: Technologies and Global Markets to 2028" - https://www.reportlinker.com/p05480379/?utm_source=GNW

Using 2022 as the base year, the report provides estimated market data for the forecast period 2023 through 2028.

Revenue forecasts for this period are segmented by:

- Offering: services and systems.
- Method of deployment: on-premises and cloud-based.
- Technology: trapped ions, quantum annealing, superconducting qubits, and others.
- Application: quantum-assisted optimization, quantum simulation and quantum-assisted machine learning.
- End-user industry: banking and finance, IT and telecom, healthcare and pharmaceuticals, space and defense, energy and power, transportation and logistics, academia, government, chemicals, and others.
- Region: North America is segmented into the U.S., Canada, and Mexico; Europe is segmented into the U.K., France, Germany, and Rest of Europe; the U.K. is further segmented into England, Wales, Scotland, and Northern Ireland; Asia-Pacific (APAC) is segmented into China, Japan, India, and Rest of Asia-Pacific; the Rest of World is segmented into the Middle East and Africa, and Latin America.

COVID-19 has had a massive impact on society since early 2020. This report considers the impact of COVID-19 and the economic slowdown it created.

With people relying more on technology, the demand for quantum computing will increase and boost the market growth. The report also focuses on the major trends and challenges that affect the market and the vendor landscape.

This report has been prepared in a simple, easy-to-understand format, with numerous tables and charts/figures. The scope of the report includes a detailed study of global and regional markets for quantum computing, with reasons given for variations in the growth of the industry in certain regions.

The report examines each component of quantum computing technology, determines its current market size, and estimates its future market. The report also analyzes the market from the manufacturer's viewpoint as well as that of the final consumer.

A number of technical issues arising out of the utilization of quantum computing technologies are discussed, and solutions are indicated.

Report Includes:
- 43 data tables and 45 additional tables
- An updated review of the global markets for quantum computing technologies
- Estimation of market size and analyses of global market trends, with data from 2022, estimates for 2023, and projections of compound annual growth rates (CAGRs) through 2028
- Evaluation and forecast of the global quantum computing market size in dollar value terms, and corresponding market share analysis by offering, application, end-user industry and region
- Identification of the quantum computing technologies and products with the greatest commercial potential
- Coverage of recent advances in the quantum computing industries with environmental, social, and corporate governance (ESG) developments, and information on Japan's first superconducting quantum computer launched by Nippon Telegraph and Telephone Corp. (NTT)
- Assessment of the key drivers and constraints that will shape the market for quantum computing over the next ten years, and discussion of the upcoming market opportunities and areas of focus to forecast the market into various segments and sub-segments
- Identification of the companies best positioned to meet this demand because of their proprietary technologies, strategic alliances, or other advantages
- Review of the key patent grants and new technologies in the quantum computing sector
- Insight into recent industry strategies, such as M&A deals, joint ventures, collaborations, and license agreements currently focused on quantum computing products and services
- Company profiles of major players within the industry, including Alphabet Inc. (Google LLC), Amazon.com Inc., International Business Machines (IBM) Corp., and Microsoft Corp.

Summary: Quantum computing is the gateway to the future. It can revolutionize computation by making certain types of classically stubborn problems solvable.

Currently, no quantum computer is mature enough to perform calculations that traditional computers cannot, but great progress has been made in the last few years. Several large and small start-ups are using non-error-corrected quantum computers made up of dozens of qubits, some of which are even publicly accessible via the cloud.

Quantum computing helps scientists accelerate their discoveries in related areas, such as machine learning and artificial intelligence.

This report has divided the global quantum computing market based on offering, technology, method of deployment, application, end-user industry, and region. Based on offering, the market is segmented into systems and services.

The services segment held the largest market share, and it is expected to register the highest CAGR, at REDACTED%, during the forecast period. The services segment includes quantum computing as a service (QCaaS) and consulting services.

The market for quantum computing by application is segmented into quantum-assisted optimization, quantum simulation, quantum-assisted machine learning, and quantum cryptography. The quantum-assisted optimization segment dominated the market, holding over REDACTED% of market share in 2022.

With regard to end-user industries, the market covers banking and finance, information technology, healthcare and pharmaceuticals, space and defense, energy and power, transportation and logistics, academia, government, chemicals, and others. The demand for quantum computers is expected to grow from multiple end-user industries, from finance to pharmaceuticals, automobiles to aerospace.

The academia, government, banking and finance, healthcare and pharmaceuticals, and chemicals industries are expected to be fastest growing end-user industries during the forecast period.

In terms of geographical regions, North America held the highest revenue share in the market in 2022 at $REDACTED million, and it is expected that it will continue to dominate the revenue share with a value of $REDACTED billion in 2028. The robust R&D environment and the increasing focus on public-private partnerships to boost adoption and innovation in the field are expected to drive the quantum computing market in North America.

The extensive growth of the Europe quantum market is mainly driven by key factors such as the rush towards quantum computing technologies across sectors in the region, such as healthcare, chemicals and pharmaceuticals, among others. Also, its growing application in fields such as drug development and discovery, cryptography, cybersecurity, and the defense sector is likely to bolster market growth during the forecast period.

The APAC region is expected to be the fastest-growing regional market for quantum computing during the forecast period. In 2022, China accounted for a majority of the demand for quantum computing in APAC due to growing applications from end-user industries and increasing R&D activities.

The other Asia-Pacific countries, including Japan, India and South Korea, are supplementing regional market growth.

Read the full report: https://www.reportlinker.com/p05480379/?utm_source=GNW

About Reportlinker: ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.

__________________________

Originally posted here:
Quantum Computing: Technologies and Global Markets to 2028 - GlobeNewswire


D-Wave Suggests Quantum Annealing Could Help AI – The New Stack

The effect of quantum computing on Artificial Intelligence could be as understated as it is profound.

Some say quantum computing is necessary to achieve General Artificial Intelligence. Certain expressions of this paradigm, such as quantum annealing, are inherently probabilistic and optimal for machine learning. The most pervasive quantum annealing use cases center on optimization and constraints, problems that have traditionally involved non-statistical AI approaches like rules, symbols, and reasoning.

When one considers the fact that there are now cloud options for accessing this form of quantum computing (replete with resources for making it enterprise-applicable for any number of deployments) sans expensive hardware, one fact becomes unmistakably clear.

"With quantum computing, a lot of times we're talking about what will it be able to do in the future," observed Mark Johnson, D-Wave SVP of Quantum Technologies and Systems Products. "But no, you can do things with it today."

Granted, not all those things involve data science intricacies. Supply chain management and logistics are just as easily handled by quantum annealing technologies. But, when these applications are considered in tandem with some of the more progressive approaches to AI enabled by quantum annealing, their value to organizations across verticals becomes apparent.

Quantum annealing involves the variety of quantum computing in which, when the quantum computer reaches its lowest energy state, it solves a specific problem, even NP-hard problems. Thus, whether users are trying to select features for a machine learning model or the optimum route to send a fleet of grocery store delivery drivers, quantum annealing approaches provide these solutions when the lowest energy state is achieved. "Annealing quantum computing is a heuristic probabilistic solver," Johnson remarked. "So, you might end up with the very best answer possible or, if you don't, you will end up with a very good answer."

Quantum annealing's merit lies in its ability to supply these answers at an enormous scale, such as that required for a defense agency's need to analyze all possible threats and responses for a specific location at a given time. "It excels in cases in which you need to consider many, many possibilities and it's hard to wade through them," Johnson mentioned. Classical computational models consider each possibility one at a time for such a combinatorial optimization problem.

Quantum annealing considers those possibilities simultaneously.

The data science implications for this computational approach are almost limitless. One developer resource D-Wave has made available via the cloud is a plug-in for the Ocean SDK, a suite of open source Python tools, that integrates with scikit-learn to improve feature selection. "It supports recognizing, in a large pattern of data, can I pick out features that correlate with certain things, and being able to navigate that," Johnson remarked. "I understand it ends up mapping into an optimization problem." The statistical aspects of quantum annealing are suitable for other facets of advanced machine learning, too.
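To give a flavour of how feature selection "ends up mapping into an optimization problem", here is a minimal QUBO sketch using dimod, one of the open-source Ocean packages. It is a conceptual illustration rather than the actual scikit-learn plug-in: the relevance and redundancy numbers are invented, and the exact solver stands in for a quantum annealer.

```python
import dimod

# Toy feature-selection QUBO: reward features that correlate with the target
# (negative linear terms) and penalize picking redundant, mutually correlated
# features (positive quadratic terms). All numbers below are invented.
relevance = {"f0": 0.9, "f1": 0.8, "f2": 0.3}          # correlation with target
redundancy = {("f0", "f1"): 0.7, ("f0", "f2"): 0.1,    # pairwise correlation
              ("f1", "f2"): 0.2}

linear = {f: -r for f, r in relevance.items()}          # minimizing => prefer relevant features
quadratic = dict(redundancy)                            # penalize redundant pairs

bqm = dimod.BinaryQuadraticModel(linear, quadratic, 0.0, dimod.BINARY)

# ExactSolver enumerates all assignments; on real hardware a quantum annealer
# (or a hybrid sampler) would return low-energy samples instead.
best = dimod.ExactSolver().sample(bqm).first
selected = [f for f, bit in best.sample.items() if bit == 1]
print("selected features:", selected, "energy:", best.energy)
```

With these toy numbers the lowest-energy solution keeps the highly relevant feature f0 and the uncorrelated f2 while dropping f1, which is largely redundant with f0.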

According to Johnson, because of its probabilistic nature, "one of the interesting things that quantum annealing does is not just picking the best answer or a good answer, but coming up with a distribution, a diversity of answers, and understanding the collection of answers and a little about how they relate to each other." This quality of quantum annealing is useful for numerous dimensions of machine learning, including backpropagation, which is used to adjust a neural network's parameters while going from the output to the input. It can also reinforce what Johnson termed "Boltzmann sampling," which involves randomly sampling combinatorial structures.

There are considerable advantages to making quantum annealing available through the cloud. The cloud architecture for accessing this form of computing is just as viable for accessing what Johnson called the "gate model" type of quantum computing, which is primed for factoring numbers and used in RSA encryption schemes, Johnson confirmed. Organizations can avail themselves of both quantum computing methods in D-Wave's cloud platform. Moreover, they can also utilize hybrid quantum and classical computing infrastructure, which is becoming ever more relevant in modern quantum computing conversations. "You would just basically be using both of them together for the part of the problem that's most efficient," Johnson explained.

In addition to the ready availability of each of these computational models, D-Wave's cloud platform furnishes documentation for a range of example use cases for common business problems across industries. "There's also an integrated developer environment you can pull up that already has in it Ocean, our open source suite of tools, which help the developer interface with the quantum computer," Johnson added. Examples include the ability to write code in Python. When organizations find documentation in the cloud about a previous use case that's similar to theirs, "you can pull up sample code that will use the quantum computer to solve that problem in your integrated developer environment," Johnson noted.

That sample code provides an excellent starting point for developers to build applications for applying quantum computing and hybrid quantum and classical computing methods to an array of business problems pertaining to financial services, manufacturing, life sciences, and more. It's just one of the many benefits of quantum computing through the cloud. The appeal of quantum annealing, of course, lies in its ability to reduce the time required to solve combinatorial optimization problems.

As the ready examples of quantum solutions (the vast majority of which entail quantum annealing) across the aforesaid verticals indicate, such issues are, the harder we look, ubiquitous throughout business, Johnson indicated. The data science utility of quantum annealing for feature selection, Boltzmann sampling, and backpropagation is equally horizontal and may prove influential to the adoption rates of this computational approach.

Link:
D-Wave Suggests Quantum Annealing Could Help AI - The New Stack


Cooling method will enable size reduction of quantum computers – Electronic Products & Technology

VTT Technical Research Centre of Finland is developing a cooling technology based on microelectronics and electric current, which can be utilised by low-temperature electronic and photonic components. The new technology reduces the size, power consumption and price of cooling systems. The method has a wide range of application fields: one topical example is quantum technology.

Figure 1: Silicon wafer with VTT's electronic refrigerators. The wafer is under visual investigation under a microscope. Source: VTT

Many electronic, photonic and quantum technology components require cryogenics, as they only operate at very low temperatures. For example, a quantum computer built from superconducting circuits has to be cooled near to absolute zero (-273.15 °C). Currently, such temperatures are achieved by complex and large dilution coolers. VTT's electronic method can replace and complement existing solutions and thus reduce the size of the refrigerators. Accordingly, this makes it possible to significantly reduce the size of quantum computers.

Current dilution refrigerators are based on multistage pumping of cryogenic liquids. Although these coolers are commercial technology today, they are still very expensive and large. What makes the cooler technology complicated is especially its coldest stage, where the refrigerant is a mixture of helium isotopes. New electric cooling technology could replace this part. This would make the system much simpler, smaller, more efficient and more cost effective. A cooler the size of a car, which cools a silicon chip of about a square centimetre in size, could be shrunk by orders of magnitude, down to the size of a suitcase, for example.

Figure 2: Schematic illustration of VTT's electronic refrigerator technology. Refrigerator chips are joined by tunnel junctions, through which the passing electrical current leads to cooling, and the lowest temperature is reached on the topmost chip. Source: VTT

"We believe that this purely electric cooling method can be utilised in numerous applications requiring cryogenics, from quantum computing to sensitive radiation detectors and space technology," says VTT Research Professor Mika Prunnila, who is leading the cooler development.

VTT researchers have already experimentally confirmed the functionality of the cooling method. The method is now being refined into a commercial demonstrator in the SoCool project, which was granted to VTT in the highly competitive EIC Transition programme of the European Commission. VTT will also continue the highly important fundamental research of electronic coolers in the CoRE-Cryo project, funded by the Technology Industries of Finland Centennial Foundation.

Electric cooling can be used to actively cool components directly on a silicon chip or in large-scale general purpose refrigerators. It is a platform technology that is suitable for numerous applications and creates opportunities for new business. The active part of the cooler is manufactured using microelectronics manufacturing methods on silicon wafers, which makes the manufacturing very cost-effective.

Figure 3: VTT's electronic refrigerator prototypes going to cryogenic testing. Source: VTT

"Making the refrigeration systems more user friendly, smaller and cheaper can significantly boost the application of cryo-enabled technologies to new areas. We see that our electronic cooling technology can play an important role in this development," Mika Prunnila says.

Cryogenics has become an area of increasing interest thanks to quantum technology. Systems developed for the extreme demands of the quantum technology can be also used in various sensors, space technology and possibly also in classical computing. Compact and easy-to-use cooling methods contribute to the large-scale adoption of these technologies. Quantum technology is expected to be only the tip of the iceberg for cryogenic, cryo-electronic and cryo-photonic applications.

Link:
Cooling method will enable size reduction of quantum computers - Electronic Products & Technology


Why These 3 Stocks Are the Worst Ways to Play Quantum … – InvestorPlace

Quantum computing stocks are having a moment. The industry's leading pure-play company, IonQ (NYSE:IONQ), up nearly 400% year to date (YTD), has plenty of positive chatter around its proprietary technology. However, traders should avoid these three other quantum stocks, which don't have the same apparent potential.

It's important to realize that there are various ways to generate qubits and enable quantum computing. A pitfall is to value all these methods similarly. Yet, trapped ion approaches, used by IonQ and Honeywell (NYSE:HON), show the most promise. Others, such as superconducting and quantum annealing, are less proven to this point.

Avoid making any painful quantum computing mistakes and steer clear of these three riskier quantum picks.


Rigetti Computing (NASDAQ:RGTI) is a small, pure play quantum computing firm focused on using superconducting technology to produce its qubits.

Researchers have noted that there are high error rates with superconducting transmon qubits. Efforts are under way to reduce the error rate in order to make superconducting a more competitive way of achieving quantum computing. For now, investors have gravitated to alternatives, such as trapped ion systems, used by rivals such as IonQ.

Rigetti enjoyed a brief moment in the sun this summer thanks to the superconducting media cycle. However, skeptics quickly debunked reports of a breakthrough in room temperature-based superconducting systems. This should put an end to the recent enthusiasm in RGTI stock.

At the end of the day, Rigetti is a tiny firm with minimal revenues attempting to popularize so-called second-tier quantum computing systems. All of that makes it a highly risky bet today.


Quantum computing and quantum technologies have started to make their impacts known in other industries. For example, Arqit Quantum (NASDAQ:ARQQ) is attempting to commercialize quantum encryption to deliver next-generation cybersecurity solutions.

On paper, this seems like a promising venture. The CEO, David Williams, has given investors an optimistic vision of the company, declaring in 2021 that Arqit could be Britain's biggest-ever tech scale-up.

However, another statement he made at the time has now backfired.

"We don't need to raise any more money, ever," Williams stated when Arqit's SPAC deal was closing.

You can probably guess what happened next. That's right, Arqit slammed investors with an unexpected capital raise earlier this year, causing the stock to careen 44% lower in a single day.

Perhaps quantum computing will be vital in developing future cybersecurity technologies. With little sign of it yet, Arqit is generating a disappointing $2.6 million of revenues through the first half of its latest fiscal year.


D-Wave Quantum (NYSE:QBTS) is a Canadian technology company attempting to commercialize its quantum computing systems. It has a fifth-generation quantum computer and offers quantum computing services on-demand, in addition to its Ocean open-source Python tools ecosystem.

D-Wave uses a quantum annealing approach which it sees as optimal for solving energy-minimization problems such as optimization and sampling.

It should be noted, however, that most quantum computing firms have chosen other approaches, rather than quantum annealing.

In fact, there is little evidence of widespread customer interest. D-Wave's Q2 earnings release showed just $1.71 million of revenues, which fell short of already modest expectations. Also, the revenue growth rate of 25% year over year (YOY) is quite unimpressive for a firm with such a small starting base of operations.

Perhaps quantum annealing will eventually take off. However, D-Wave's current $175 million market capitalization seems awfully steep given the minimal revenues and the fairly unproven nature of the technology.

On the date of publication, Ian Bezek did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Ian Bezek has written more than 1,000 articles for InvestorPlace.com and Seeking Alpha. He also worked as a Junior Analyst for Kerrisdale Capital, a $300 million New York City-based hedge fund. You can reach him on Twitter at @irbezek.

Read the original post:
Why These 3 Stocks Are the Worst Ways to Play Quantum ... - InvestorPlace
