
Deloitte and the World Economic Forum collaborate to launch the … – PR Newswire

With unprecedented challenges to cybersecurity forecast amid accelerating quantum computing developments, the Quantum Readiness Toolkit provides organizations with resources to help them stay ahead of threats

NEW YORK, June 29, 2023 /PRNewswire/ -- Today, in collaboration with Deloitte, the World Economic Forum (The Forum) released actionable guidance to help protect organizations during the rapid development of quantum computing technology. The Quantum Readiness Toolkit provides specific guidance in line with the overall framework presented in last year's flagship report, "Transitioning to a Quantum-Secure Economy."

"The Forum has developed specific messaging for executive leaders to help mitigate quantum risk. This is important as many discussions on this topic get bogged down about when a quantum computer will be available to mount these types of attacks. The dialog should be much more about preparing today while you still have time so that quantum risk can be methodically mitigated and businesses can continue to thrive without fear of disruption," says Colin Soutar, Managing Director and Deloitte co-lead for WEF Quantum Security support, Deloitte US.

Advancements in quantum computing have the potential to create systemic cybersecurity risk, whether through increased breaches of sensitive health and financial personal data, compromised private communications, or forged digital versions of information, identities and sensitive data. The new paper, "Quantum Readiness Toolkit: Building a Quantum Secure Economy," outlines five principles businesses and organizations should follow when building their quantum security readiness. These include:

"It is essential that organizations across industries and geographies work together to help mitigate emerging risks and collectively build a more cyber secure future. Deloitte is committed to collaborating on important issues such as managing quantum risk and we are grateful to the World Economic Forum for such a strong collaboration on this and other topics," says Isaac Kohn, Partner and Deloitte co-lead for WEF Quantum Security support, Deloitte Switzerland.

Protecting our digital economy and data from quantum computer attacks requires a cohesive, cross-border approach that integrates government support with individual organizations' support as they test strategies and identify what works. With the timeline to prepare for quantum computing threats shrinking, businesses should take a multi-pronged approach, including empowering the quantum cybersecurity team with executive support and mandates, upskilling talent, and requiring quantum security provisions in new or renewed product contracts.
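The preparation described above typically starts with a cryptographic inventory: cataloguing which systems rely on quantum-vulnerable algorithms so migration can be prioritized. A minimal sketch in Python, assuming an illustrative inventory and algorithm lists (neither comes from the toolkit itself):

```python
# Minimal sketch of a cryptographic inventory triage: classify the public-key
# algorithms an organization relies on by their vulnerability to a
# cryptographically relevant quantum computer. The algorithm sets and the
# sample inventory are illustrative assumptions, not a real audit tool.

# Shor's algorithm breaks RSA and elliptic-curve schemes; symmetric ciphers
# and hashes are only weakened (Grover) and survive with larger parameters.
QUANTUM_VULNERABLE = {"RSA-2048", "RSA-4096", "ECDSA-P256", "X25519", "DH-2048"}
QUANTUM_RESISTANT = {"AES-256", "SHA-384", "ML-KEM-768", "ML-DSA-65"}

def triage(inventory):
    """Split (system, algorithm) pairs into migration buckets."""
    must_migrate, ok, unknown = [], [], []
    for system, algo in inventory:
        if algo in QUANTUM_VULNERABLE:
            must_migrate.append(system)
        elif algo in QUANTUM_RESISTANT:
            ok.append(system)
        else:
            unknown.append(system)   # needs manual review
    return must_migrate, ok, unknown

inventory = [("vpn-gateway", "RSA-2048"), ("backup-archive", "AES-256"),
             ("api-tls", "ECDSA-P256"), ("legacy-app", "3DES")]
migrate, ok, unknown = triage(inventory)
print(migrate)  # systems to move to post-quantum schemes first
```

A real inventory would be harvested from certificate stores and TLS configurations rather than hand-written, but the triage logic is the same.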

"As we make rapid strides towards the quantum era, it is crucial for leaders to ensure that security risks don't come in the way of realizing the transformative potential of the technology. We are grateful for the collaboration with Deloitte and the efforts of a community comprising the foremost leaders in quantum security that has resulted in the Quantum Readiness Toolkit, a comprehensive roadmap to ensure a secure transition to the quantum economy," says Akshay Joshi, Head of Industry and Partnerships, Centre for Cybersecurity, World Economic Forum.

To view the full report, please visit https://www.weforum.org/whitepapers/quantum-readiness-toolkit-building-a-quantum-secure-economy

About Deloitte

"Deloitte," "us," "we" and "our" refer to one or more of Deloitte Touche Tohmatsu Limited ("DTTL"), its global network of member firms, and their related entities (collectively, the "Deloitte organization"). DTTL (also referred to as "Deloitte Global") and each of its member firms and related entities are legally separate and independent entities, which cannot obligate or bind each other in respect of third parties. DTTL and each DTTL member firm and related entity is liable only for its own acts and omissions, and not those of each other. DTTL does not provide services to clients. Please see www.deloitte.com/about to learn more.

Deloitte provides industry-leading audit and assurance, tax and legal, consulting, financial advisory, and risk advisory services to nearly 90% of the Fortune Global 500 and thousands of private companies. Our professionals deliver measurable and lasting results that help reinforce public trust in capital markets, enable clients to transform and thrive, and lead the way toward a stronger economy, a more equitable society and a sustainable world. Building on its 175-plus year history, Deloitte spans more than 150 countries and territories. Learn how Deloitte's approximately 415,000 people worldwide make an impact that matters at www.deloitte.com.

SOURCE Deloitte


Research begins with a newly developed 8-qubit quantum … – Scientific Computing World

Tohoku University and NEC Corporation have started joint research on computer systems using an 8-qubit quantum annealing machine developed by NEC and Japan's National Institute of Advanced Industrial Science and Technology (AIST).

The 8-qubit quantum annealing machine used in this research has been newly developed using superconducting technology paired with ParityQC Architecture(*1). Owing to this, the machine is resistant to noise and remains capable of scaling up to a fully-connected quantum annealing architecture while maintaining a prolonged quantum superposition state.

This is the first domestically manufactured quantum annealing machine in Japan that is accessible from the outside via the Internet. This joint research is also the first project to use this machine.

Joint research using the newly developed 8-qubit quantum annealing machine

Solving complex social issues entails deriving optimal combinations from a large number of options (solution of combinatorial optimization problems). To solve combinatorial optimization problems at high speed and with high accuracy, NEC and AIST are developing a quantum annealing machine(*2) using superconducting parametrons(*3).

Using superconducting parametrons makes this quantum annealing machine resistant to noise and enables a long coherence time (duration for maintaining the quantum state)(*4). Coherence time is generally shortened during multi-qubit implementation. However, in addition to the noise-resistant characteristic of superconducting parametrons, the machine is able to maintain a long coherence time even in multi-qubit implementation by adopting the ParityQC architecture, a coupling technology that is highly compatible with parametrons. These features enable the calculation of real-world combinatorial optimization problems at high speed and with high accuracy.
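The kind of combinatorial optimization such a machine targets can be illustrated classically. Below is a minimal simulated-annealing sketch on a three-spin frustrated Ising triangle; the couplings are invented for illustration, and a hardware annealer would solve the equivalent problem physically rather than by iterating spin flips:

```python
import math, random

# Antiferromagnetic triangle: each pair of spins wants to disagree, but
# three spins cannot all pairwise disagree (frustration). Couplings are
# illustrative; real problems are embedded in the machine's qubit graph.
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}

def energy(spins):
    """Ising energy: sum of J_ij * s_i * s_j over coupled pairs."""
    return sum(j * spins[i] * spins[k] for (i, k), j in J.items())

def anneal(n=3, steps=5000, t_hot=2.0, t_cold=0.01, seed=0):
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    e = energy(spins)
    for step in range(steps):
        t = t_hot * (t_cold / t_hot) ** (step / steps)  # geometric cooling
        i = rng.randrange(n)
        spins[i] *= -1                                  # propose a spin flip
        e_new = energy(spins)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new                                   # accept the flip
        else:
            spins[i] *= -1                              # reject: undo it
    return spins, e

spins, e = anneal()
print(e)  # -1.0: the frustrated triangle's ground-state energy
```

Quantum annealing explores the same energy landscape via quantum superposition and tunneling instead of thermal fluctuations, which is why coherence time matters so much for the hardware.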

With regard to these two technologies, NEC had already succeeded in demonstrating the operation of a unit cell consisting of four qubits in March 2022(*5). Recently, it succeeded in developing a quantum annealing machine consisting of eight qubits by aligning the unit cells.

The newly developed 8-qubit chip based on unit cells with long coherence time

Tohoku University and NEC began joint research on high-performance computing technologies in 1958. In 2014, the Joint Research Division of High-Performance Computing (NEC) was established within the Tohoku University Cyberscience Center(*6) to conduct research aimed at solving various scientific and social issues. For the current joint research, Tohoku University and NEC will study the application of the above quantum annealing machine to many combinatorial optimization problems that exist in the real world, such as deriving optimal evacuation routes to mitigate damage and injuries from tsunami inundation.

Prior to this joint research, Tohoku University and NEC had already been working on the development of quantum-annealing-assisted next-generation supercomputing platforms under Japan's Ministry of Education, Culture, Sports, Science and Technology's (MEXT) Next-Generation Research and Development Project(*7) since 2018. This initiative is aimed at further improving the performance and sophistication of vector supercomputers, which have shown high processing capacity in many practical applications, and at developing a new supercomputing platform through functional complementation of vector supercomputers with quantum and simulated quantum annealing specialized for combinatorial optimization problems. In recognition of these initiatives on quantum annealing, Tohoku University has been accredited by Japan's Cabinet Office as a "Quantum Solution Center"(*8).

In the current joint research, the 8-qubit quantum annealing machine based on superconducting parametrons developed by NEC and AIST will be made available to Tohoku University via the Internet. Tohoku University and NEC will use this machine alongside the simulated quantum annealing machine (NEC Vector Annealing), which runs on the vector supercomputer "SX-Aurora TSUBASA"(*9) installed at Tohoku University, so as to leverage the features of both machines(*10). Going forward, Tohoku University and NEC will jointly conduct research on computing system architectures to solve complex social issues. Further, they will also explore use cases unique to quantum annealing, which has the potential to perform high-speed computations.

In addition to the simulated quantum annealing machine installed at Tohoku University, NEC and Tohoku University will also use the quantum annealing machine installed at AIST via the Internet. Researchers will examine the overall configuration of both machines in consideration of the effects of communication delays and other factors and feed the results back to the future development of both quantum annealing and simulated quantum annealing machines. Furthermore, in order to solve problems in the real world with the speed and accuracy of quantum annealing, they will investigate how to optimally allocate computations to both machines and endeavor to improve their usefulness.

Tohoku University and NEC will leverage this joint research as an opportunity to further accelerate the social implementation of quantum computing technologies.


From 2023 to 2031, the Quantum Processing Units (QPU) Market to … – InvestorsObserver

From 2023 to 2031, the Quantum Processing Units (QPU) Market to witness Stellar Growth of 40.5% CAGR, reaching US$ 13.1 billion: TMR Report

Wilmington, Delaware, United States, June 27, 2023 (GLOBE NEWSWIRE) -- Transparency Market Research Inc. - The global quantum processing units (QPU) market is slated to expand at an astronomical CAGR of 41.7% from 2023 to 2031. By the end of the forecast period, the market is poised to reach a valuation of US$ 13.1 billion.
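The growth claim is easy to sanity-check: given a CAGR and the US$ 13.1 billion 2031 endpoint, the implied 2023 base value follows by discounting. The piece quotes two CAGR figures (40.5% in the headline, 41.7% in the body), so both are tried below; the implied base is derived here, not taken from the report:

```python
# Back out the 2023 base value implied by the quoted CAGR and the
# US$ 13.1 bn 2031 endpoint. Both CAGR figures from the piece are tried.
def implied_base(final_value_bn, cagr, years):
    """Base value that compounds at `cagr` for `years` to reach the endpoint."""
    return final_value_bn / (1 + cagr) ** years

YEARS = 2031 - 2023  # 8-year forecast window
for cagr in (0.405, 0.417):
    base = implied_base(13.1, cagr, YEARS)
    print(f"CAGR {cagr:.1%} -> implied 2023 base ~ US$ {base:.2f} bn")
```

Either CAGR implies a 2023 market of roughly US$ 0.8-0.9 billion, which is a useful cross-check on the report's internal consistency.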

A quantum processing unit (QPU) is a real-world or simulated processor with a collection of linked qubits that can be used to manipulate quantum algorithms. It serves as the brain of a quantum computer or simulator.
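The "collection of linked qubits" can be pictured with a toy statevector model. The sketch below simulates only the underlying linear algebra of a single qubit and a Hadamard gate; it is not how a physical QPU is programmed:

```python
import math

# Toy statevector model of a qubit: a pair of amplitudes (alpha, beta) for
# the |0> and |1> basis states. A Hadamard gate creates an equal
# superposition, the basic resource a QPU manipulates.
def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    # Born rule: measurement probabilities are squared amplitude magnitudes.
    return tuple(abs(amp) ** 2 for amp in state)

qubit = (1.0, 0.0)           # the |0> basis state
qubit = hadamard(qubit)      # now (|0> + |1>) / sqrt(2)
print(probabilities(qubit))  # ~ (0.5, 0.5): equal odds of measuring 0 or 1
```

Simulating n qubits this way needs 2^n amplitudes, which is precisely why large-scale hardware QPUs are needed to reap the advantages the paragraph describes.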

QPU offers significant advantages while executing certain tasks such as simulating quantum systems and tackling complex optimization problems. However, large-scale qubits and superior QPUs are necessary to reap these advantages. Therefore, study and development of high-performance QPU is one of the crucial objectives of quantum computing.

Quantum computing, AI, and hybrid cloud technologies are used to detect tumors at an early stage and enhance the development of targeted drugs, thus leading to notable advancements in the field of pharmacology. These systems facilitate the swift identification of alterations and abnormalities by enabling clinicians to conveniently review CT scans over time.

In March 2023, IBM installed an on-site IBM-managed quantum computer at Cleveland Clinic in the United States. It was installed for healthcare research to help Cleveland Clinic accelerate biomedical discoveries. Hence, increase in usage of quantum computing and other advanced computing technologies is expected to spur the Quantum Processing Units (QPU) market growth in the near future.

To Remain Ahead of Your Competitors, Request a Sample: https://www.transparencymarketresearch.com/sample/sample.php?flag=S&rep_id=85086

Market Snapshot:

Key Takeaways from the Market Study

Prominent Drivers of the Quantum Processing Units (QPU) Market

Buy this Premium Research Report | Immediate Delivery Available - https://www.transparencymarketresearch.com/checkout.php?rep_id=85086&ltype=S

Quantum Processing Units (QPU) Market- Regional Profile

Competitive Landscape

Prominent quantum processing unit providers as profiled by Transparency Market Research include:

Key Market Developments

Get Customization on this Report for Specific Research Solutions: https://www.transparencymarketresearch.com/sample/sample.php?flag=CR&rep_id=85086

Key Segments Covered

Offering

Technology

Application

Industry Vertical

Region

About Transparency Market Research

Transparency Market Research, a global market research company registered in Wilmington, Delaware, United States, provides custom research and consulting services. Our exclusive blend of quantitative forecasting and trends analysis provides forward-looking insights for thousands of decision makers. Our experienced team of Analysts, Researchers, and Consultants use proprietary data sources and various tools & techniques to gather and analyze information.

Our data repository is continuously updated and revised by a team of research experts, so that it always reflects the latest trends and information. With a broad research and analysis capability, Transparency Market Research employs rigorous primary and secondary research techniques in developing distinctive data sets and research material for business reports.

Contact:

Nikhil Sawlani
Transparency Market Research Inc.
CORPORATE HEADQUARTER DOWNTOWN, 1000 N. West Street, Suite 1200, Wilmington, Delaware 19801 USA
Tel: +1-518-618-1030
USA - Canada Toll Free: 866-552-3453
Website: https://www.transparencymarketresearch.com
Blog: https://tmrblog.com
Email: sales@transparencymarketresearch.com


How Golteum (GLTM) tokens promote decentralization in the Crypto … – Cyprus Mail



How will DogeMiyagi, Ethereum, and Polygon navigate regulation? – Analytics Insight

Navigating Crypto Regulation: How Can DogeMiyagi, Ethereum, and Polygon Learn From Ripple vs. SEC?

Regulatory uncertainty has been a persistent concern for crypto investors and cryptocurrencies, especially as the US Securities and Exchange Commission (SEC) increasingly targets crypto firms. Crypto enthusiasts and analysts have been eagerly awaiting the end of the longstanding SEC vs. Ripple case, which has just reached a major turning point as Ripple released the Hinman documents as part of its defense. Analysts are now waiting to see the implications of these documents and how they could affect altcoins like Ethereum (ETH), Polygon (MATIC), and DogeMiyagi ($MIYAGI).

The SEC brought charges against Ripple, the company behind XRP, in 2020, alleging that it sold unregistered securities. Ripple's CEO fought these charges, sparking an ongoing legal battle with the SEC. As part of their defense, the Hinman documents have been released, and according to analysts at JP Morgan Chase and Co., this could have a profound impact on the altcoin market. The documents reveal that in 2018, the SEC did not consider Ethereum to be a security because it did not have a controlling group, so its network was considered sufficiently decentralized. The documents also acknowledge that there is regulatory uncertainty in the current system if Ethereum isn't considered a security, so new rules and regulations would need to be decided. Analysts now believe that altcoins will need to emulate Ethereum's level of decentralization to navigate the regulatory landscape successfully.

The SEC recently brought charges against crypto exchanges Coinbase and Binance, citing many top altcoins in the charges, including Polygon. The release of the Hinman documents may play a pivotal role in determining Polygon's future, and if it could reach the same level of decentralization as Ethereum, it could protect itself from these charges. Polygon Labs has also proposed an upgrade known as zkEVM validium, which aims to enhance security while keeping transaction fees low. This could enhance Polygon's network and put it on track to becoming more decentralized.

DogeMiyagi is a new meme coin project built on Ethereum's infrastructure and committed to decentralization. The project boasts a decentralized autonomous organization (DAO), which empowers users to be a part of the decision-making process rather than a central authority. This focus on decentralization could give DogeMiyagi more acceptance as crypto regulation tightens.

DogeMiyagi also shows its commitment to the community through giveaways, exclusive NFTs, and a unique referral program. By inviting new investors to join the community, $MIYAGI holders can gain a 10% bonus. DogeMiyagi's emphasis on community participation and loyalty through the referral program could work in its favor, fostering trust and a sense of belonging among its supporters.

The release of the Hinman documents in the Ripple vs. SEC case could bring more clarity to the crypto regulatory landscape. As altcoins navigate the challenges of increased scrutiny, decentralization could prove the key to weathering it. DogeMiyagi's focus on community and decentralization, therefore, positions it to navigate the uncharted crypto waters.

Website: https://dogemiyagi.com

Twitter: https://twitter.com/_Dogemiyagi_

Telegram: https://t.me/dogemiyagi


Crypto Congestion | DogeMiyagi, Ethereum + Bitcoin – Analytics Insight

Cutting the Block: DogeMiyagi, Ethereum, and Bitcoin's Approach to Network Congestion

As with any new technology, one inevitable occurrence is hordes of users all gathering on the servers at the same time, putting through multiple transactions, and sometimes this can be very stressful for the network. In recent years, network congestion has become a significant challenge in the crypto space. As its popularity grows and more people jump into mining, high transaction volumes often result in delays and increased fees, hindering the user experience.

Luckily, there are a variety of ways to tackle this whilst still leaving the user experience intact. Over the years, Ethereum (ETH) and Bitcoin (BTC) have found solutions to address network congestion. Their approaches, trade-offs, and implications for transaction speed and user experience serve as a lesson for newcomers like DogeMiyagi (MIYAGI), which is gradually implementing its own strategy to combat congestion.

Ethereum, one of the most established cryptocurrencies, revolutionized the industry by introducing smart contracts and decentralized applications. Its blockchain has become a hub for innovation, enabling developers to create a wide range of decentralized solutions.

Ethereum has implemented and continues to explore various scalability solutions to improve transaction speed and alleviate network congestion. Ethereum 2.0, a multi-phase upgrade of the network, moved it from a proof-of-work to a proof-of-stake consensus mechanism with the Merge in September 2022, drastically reducing energy consumption and laying the groundwork for further scalability improvements.

Additionally, Ethereum has integrated layer-two solutions, such as the Polygon network and Optimism's Optimistic Rollups. These solutions allow for faster and cheaper transactions by processing them off-chain and settling them on the Ethereum mainnet later. By offloading a significant portion of transactions to layer-two solutions, Ethereum reduces congestion and improves the overall user experience.

Bitcoin, the first and most well-known cryptocurrency, laid the foundation for the entire industry. With its decentralized nature and limited supply, Bitcoin has become a digital store of value and a widely accepted means of exchange.

Bitcoin's strategy for addressing network congestion is primarily focused on preserving decentralization and security while maintaining its core functionality as digital gold. While Bitcoin's transaction speed may not match that of newer cryptocurrencies, its robustness lies in its ability to process secure and immutable transactions.

Bitcoin has implemented the Lightning Network as a layer-two scaling solution to mitigate network congestion. The Lightning Network allows users to open payment channels off-chain, enabling fast and low-cost transactions. By leveraging these channels, users can conduct numerous transactions without burdening the Bitcoin blockchain directly, thus reducing congestion.
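The channel mechanism described above can be sketched as a toy model: two parties fund a channel once on-chain, exchange many balance updates off-chain, and settle the net result in a single closing transaction. Names and amounts below are illustrative, and real Lightning channels additionally involve commitment signatures and penalty transactions:

```python
# Toy model of a Lightning-style payment channel. Only the balance
# bookkeeping is modeled; cryptographic commitments are omitted.
class PaymentChannel:
    def __init__(self, balance_a, balance_b):
        self.balances = {"a": balance_a, "b": balance_b}  # on-chain funding
        self.updates = 0

    def pay(self, sender, receiver, amount):
        """An off-chain balance update: no blockchain transaction occurs."""
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.updates += 1

    def close(self):
        """One on-chain settlement covers every off-chain update."""
        return dict(self.balances)

ch = PaymentChannel(balance_a=100_000, balance_b=0)  # sats, illustrative
for _ in range(3):
    ch.pay("a", "b", 10_000)
print(ch.close(), ch.updates)  # {'a': 70000, 'b': 30000} after 3 off-chain updates
```

The congestion relief comes from the ratio: three payments (or three thousand) still cost only two on-chain transactions, the funding and the close.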

DogeMiyagi, a rising meme token in the crypto sphere, combines the popularity of Doge and Miyagi from the Karate Kid movie, infusing humor and a sense of community into the project. By leveraging the familiarity of meme culture, DogeMiyagi aims to create a unique and engaging space for cryptocurrency enthusiasts.

To enhance transaction speed, DogeMiyagi employs a multi-layered approach. First, it utilizes off-chain solutions, such as state channels and sidechains, for faster and cheaper peer-to-peer transactions. By moving a significant portion of transactions off the main blockchain, DogeMiyagi reduces congestion and ensures quicker settlements.

In addition, DogeMiyagi actively explores layer-two solutions, including the integration of various Layer-2 protocols like Optimistic Rollups and Plasma. These solutions allow for faster transaction confirmations and significantly reduce congestion by processing multiple transactions off-chain before settling them on the main blockchain.

In the quest to address network congestion, DogeMiyagi, Ethereum, and Bitcoin employ different strategies with unique trade-offs and implications for transaction speed and user experience. DogeMiyagi emphasizes scalability and transaction speed; Ethereum focuses on protocol upgrades, including Ethereum 2.0 and layer-two solutions, to enhance scalability and reduce congestion; Bitcoin, while prioritizing decentralization and security, leverages the Lightning Network as a layer-two solution to alleviate network congestion.

Website: https://dogemiyagi.com

Twitter: https://twitter.com/_Dogemiyagi_

Telegram: https://t.me/dogemiyagi


Lido Governance Greenlights Revamped Revenue Sharing Program – Yahoo Finance

The Lido community passed a revised rewards program designed to accelerate its growth alongside two other governance proposals on June 29.

Lido now offers a tiered revenue-sharing program that allocates a portion of its 5% share of staking rewards to prospective partners. Lido will also form a committee with the authority to whitelist participants and distribute stETH rewards under the program.

Participants must apply for the program and meet eligibility criteria, and rewards will be paid out gradually over fixed terms.
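The mechanics of such a tiered split can be sketched as follows. The tier names and percentages below are invented for illustration; the article states only that a portion of Lido's 5% fee is shared, not how the tiers are sized:

```python
# Hedged sketch of a tiered revenue-sharing split: each whitelisted partner
# tier receives a fixed fraction of the 5% fee Lido takes on the staking
# rewards that partner's stake generates. All tier values are assumptions.
LIDO_FEE = 0.05                                              # Lido's share of staking rewards
TIER_SHARE = {"gold": 0.50, "silver": 0.30, "bronze": 0.10}  # assumed fractions of the fee

def partner_reward(staking_rewards_steth, tier):
    """stETH paid to a partner out of the fee on the rewards it generated."""
    fee = staking_rewards_steth * LIDO_FEE
    return fee * TIER_SHARE[tier]

print(partner_reward(1_000.0, "gold"))  # 25.0 stETH on 1,000 stETH of rewards
```

Whatever the real tiers are, paying rewards gradually over fixed terms, as the program specifies, would simply amortize this amount across the term.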

Lido's governance token LDO is up nearly 10% in the past 24 hours.

Lido Governance Greenlights Revamped Revenue Sharing Program

The proposals come at a time when Lido's increasing staking dominance is under intense scrutiny.

Lido currently controls nearly 32% of the roughly 23.5M staked Ether and also accounts for 32% of Ethereum's 733,950 validators, posing a centralization risk to the network.

Last month, Vitalik Buterin, Ethereum's chief scientist and co-founder, recommended that no single staking pool control more than 15% of the network's validators. As such, Lido's move to further bolster its growth through revenue-sharing incentives has drawn the ire of decentralization advocates.
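The centralization concern is simple arithmetic. Using only the figures quoted above, Lido's absolute stake and its distance from the suggested ceiling work out as:

```python
# Compute Lido's stake and compare its share against the 15% per-pool
# ceiling Buterin suggested. All inputs are the numbers quoted above.
STAKED_ETH_TOTAL = 23_500_000   # roughly 23.5M staked Ether
LIDO_SHARE = 0.32               # Lido's share of stake and of validators
SUGGESTED_CAP = 0.15            # Buterin's recommended per-pool maximum

lido_eth = STAKED_ETH_TOTAL * LIDO_SHARE
ratio = LIDO_SHARE / SUGGESTED_CAP
print(f"Lido stake ~ {lido_eth / 1e6:.2f}M ETH, {ratio:.1f}x the suggested cap")
```

At roughly 7.5M ETH, Lido sits at more than double the recommended share, which is the gap driving the criticism described below.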

On June 29, Danny Ryan of the Ethereum Foundation tweeted that liquid staking centralization threatens the network by driving otherwise disparate node operators to operate in a unified manner. Lido asserts that its validators are independent.

The revenue-sharing program will also likely target expanded DeFi integrations, despite prominent Ethereum community members sounding the alarm on Ethereum's consensus mechanism being potentially subverted.

"The goal of staking is not to promote DeFi; the goal of staking is to promote the security and the health of the Ethereum network," said Superphiz, the co-founder of the EthStaker community. "You've got to keep those two goals separate."

On June 22, Seraphim, a Lido contributor, proposed a strategic alliance between Lido and Mantle. The deal would allocate 40,000 ETH to Lido from BitDAOs treasury to bootstrap liquidity across Mantle, with Lido and Mantle sharing revenue generated from the stETH over 12 months. The proposal has been met with mixed reactions on Lidos governance forum.



Meanwhile, Lido's mission statement, which passed on the same day, espouses "keep[ing] Ethereum decentralized, accessible to all, and resistant to censorship."

Lido's community also approved a 300,000 DAI grant for Launchnodes, an impact staking project allowing users to distribute a share of their staking rewards to non-profit beneficiaries, including Unicef, GiveDirectly, and Treedom.

The grant allocates funding for the development of a user interface and smart contracts, the completion of a security audit, and consultation regarding the tax, KYC, and AML obligations of the project. Launchnodes will also deploy an Impact Staking Smart Contract for Lido, allowing users to participate in impact staking.

Lido will retain 60% of the allocated funding until Impact Staking has generated $3M in funding.


How blockchain is reshaping the entertainment industry Q&A with EarnTV – Cointelegraph

In the ever-evolving landscape of entertainment, the convergence of watch-to-earn and blockchain technology is reshaping the way users consume and engage with media content. This dynamic fusion offers a range of possibilities, from rewarding viewers for their time spent watching to creating transparent and decentralized ecosystems.

These opportunities inspired Pascal Vallat, a seasoned professional in marketing and digital media, to create EarnTV, a platform that aims to revolutionize the way users consume and engage with media content.

In this interview, Vallat explained his view on how blockchain can transform the entertainment industry and create new opportunities for viewers, content creators and stakeholders alike.

Cointelegraph: Can you start by sharing your journey in the digital media and linear television industry, and how it influenced your decision to start EarnTV?

Pascal Vallat: With 20 years of experience in marketing, data, television and digital media, I have seen the industry evolve firsthand. The rise of streaming and the growing demand for personalized content inspired me to create EarnTV. I saw an opportunity to improve the viewing experience, reward viewers and empower content creators through blockchain and Web3 technologies. EarnTV is my vision to bridge the gap between viewers, content owners and advertisers, creating a fair and rewarding ecosystem for all.

CT: How would you describe the shift from traditional models to Web3 and blockchain technologies in the media and entertainment industry?

PV: The shift from traditional models to Web3 and blockchain technologies is transformative. It brings transparency, decentralization and new opportunities for producers, rights holders, top studios and viewers. Blockchain ensures immutable records, while smart contracts enable fair distribution of rewards and tokenization unlocks new forms of value exchange. This shift empowers individuals, removes intermediaries, and enables direct engagement between creators and consumers. It opens the door to innovative monetization models, personalized experiences, and a more inclusive media and entertainment landscape.

CT: Could you explain how the EarnTV platform leverages blockchain technology to transform the viewing experience for users?

PV: EarnTV leverages blockchain to transform the viewing experience in several ways. First, it enables secure and transparent transactions through smart contracts, ensuring a fair distribution of rewards to viewers. Second, the decentralized nature of blockchain eliminates intermediaries, reducing costs and increasing revenue opportunities for content creators. Third, tokenization allows viewers to earn rewards simply by watching content, creating a seamless and engaging experience. Finally, blockchain enables the creation of the ETV token, which unlocks additional benefits for viewing, liking, sharing content, inviting friends and generating value within the EarnTV ecosystem.

Source: EarnTV

CT: You recently announced the ETV token presales. Can you tell us more about this and what it means for the future of EarnTV?

PV: The ETV token presales mark an important milestone for EarnTV and its future. The ETV token serves as a utility token within our ecosystem, allowing users to earn rewards, access exclusive content and engage with the platform. By participating in the presales, supporters can acquire ETV tokens at an early stage and benefit from potential future appreciation. These funds will fuel the development of EarnTV, allowing us to enhance the platform, expand partnerships and deliver an innovative entertainment experience.

CT: Can you explain the concept of Watch to Earn? How does this offering set EarnTV apart from other platforms?

PV: Watch to Earn is a core concept of EarnTV that sets us apart from other platforms. Simply put, viewers are rewarded with ETV tokens for the time they spend watching movies, TV shows and premium FAST channels. This innovative approach recognizes the value of viewers' attention and turns it into tangible rewards. Unlike traditional platforms, EarnTV directly benefits its users, creating an engaging and rewarding ecosystem that incentivizes viewing and fosters a strong community of content enthusiasts. We launched EarnTV on all devices with 2,000 hours of content and 200 premium FAST channels. Time spent watching content is rewarded with the ETV utility token.
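The watch-to-earn mechanic described above boils down to converting watch time into token rewards. A minimal sketch of that accrual logic might look like the following; the reward rate, daily cap, and function names are illustrative assumptions, not EarnTV's actual tokenomics:

```python
# Hypothetical watch-to-earn reward accrual (illustrative only).
# The rate and anti-abuse cap below are invented for this sketch;
# EarnTV's real reward parameters are not disclosed in the interview.

REWARD_RATE_PER_MINUTE = 0.1   # assumed ETV tokens earned per minute watched
DAILY_CAP_MINUTES = 240        # assumed daily cap to discourage idle farming

def etv_reward(minutes_watched: float) -> float:
    """Return ETV tokens earned for one day's watch time, capped at the daily limit."""
    eligible = min(minutes_watched, DAILY_CAP_MINUTES)
    return round(eligible * REWARD_RATE_PER_MINUTE, 2)

print(etv_reward(90))    # 9.0 ETV for 90 minutes
print(etv_reward(600))   # capped at 240 minutes, so 24.0 ETV
```

In a production system this calculation would typically run off-chain against verified playback events, with only the resulting token transfers settled on-chain via smart contracts.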

CT: What is the ETV NFT Movie Club?

PV: The ETV NFT Movie Club is built around nonfungible tokens (NFTs). Members of the ETV NFT Movie Club can get early access to highly anticipated films and engage in immersive movie-related experiences. By participating in the club, viewers gain privileges and become part of an exclusive community of film enthusiasts. It's a way for us to reward our users with memorable cinematic experiences and foster a deeper connection with the entertainment industry.

CT: What are some of the key challenges you've faced in deploying a blockchain-based video content delivery protocol, and how have you overcome them?

PV: Implementing a blockchain-based video content delivery protocol has its challenges. One of the key challenges is scalability, ensuring that the platform can handle a large number of concurrent users and deliver a seamless streaming experience. We've addressed this by leveraging scalable blockchain solutions and optimizing our infrastructure for high-performance delivery.

Another challenge is user adoption and education. To overcome this, we've focused on providing an easy-to-use interface and educating our community about the benefits and features of blockchain technology, building trust and fostering engagement. The benefits of decentralized storage are also part of our value proposition to all rights holders who want to join our content fandom.

CT: Given the dynamic nature of the industry, how do you see the future of EarnTV and the role of blockchain technology in media and entertainment?

PV: The future of EarnTV is bright, driven by our commitment to innovation and the transformative potential of blockchain technology. We envision EarnTV empowering viewers, content owners and advertisers alike, and becoming the leading cross-platform video content delivery protocol.

Blockchain technology will continue to play a central role in ensuring transparency, trust and fair rewards within the ecosystem. As the media and entertainment industry evolves, we will adapt by introducing new features, expanding partnerships and pioneering new ways to enhance the viewing experience while remaining at the forefront of blockchain adoption.

CT: Finally, could you share any upcoming features or plans that users should look forward to on EarnTV?

PV: Users can look forward to the launch of our decentralized hub, which will give content owners more control and monetization options. Were also expanding our partnership network to offer a wider range of content, ensuring a diverse and engaging selection for our viewers. In addition, were working to introduce innovative features such as interactive NFT experiences and enhanced social features to foster a vibrant community within EarnTV.

Disclaimer. Cointelegraph does not endorse any content or product on this page. While we aim to provide you with all the important information we could obtain in this sponsored article, readers should do their own research before taking any action related to the company and carry full responsibility for their decisions. Nor can this article be considered investment advice.

See the original post here:

How blockchain is reshaping the entertainment industry Q&A with EarnTV - Cointelegraph


Here’s Why Google DeepMind’s Gemini Algorithm Could Be Next-Level AI – Singularity Hub

Recent progress in AI has been startling. Barely a week's gone by without a new algorithm, application, or implication making headlines. But OpenAI, the source of much of the hype, only recently completed its flagship algorithm, GPT-4, and according to OpenAI CEO Sam Altman, its successor, GPT-5, hasn't begun training yet.

It's possible the tempo will slow down in the coming months, but don't bet on it. A new AI model as capable as GPT-4, or more so, may drop sooner rather than later.

This week, in an interview with Will Knight, Google DeepMind CEO Demis Hassabis said their next big model, Gemini, is currently in development, a process that will take a number of months. Hassabis said Gemini will be a mashup drawing on AI's greatest hits, most notably DeepMind's AlphaGo, which employed reinforcement learning to topple a champion at Go in 2016, years before experts expected the feat.

"At a high level you can think of Gemini as combining some of the strengths of AlphaGo-type systems with the amazing language capabilities of the large models," Hassabis told Wired. "We also have some new innovations that are going to be pretty interesting." All told, the new algorithm should be better at planning and problem-solving, he said.

Many recent gains in AI have been thanks to ever-bigger algorithms consuming more and more data. As engineers increased the number of internal connections, or parameters, and began to train them on internet-scale data sets, model quality and capability increased like clockwork. As long as a team had the cash to buy chips and access to data, progress was nearly automatic because the structure of the algorithms, called transformers, didn't have to change much.

Then in April, Altman said the age of big AI models was over. Training costs and computing power had skyrocketed, while gains from scaling had leveled off. "We'll make them better in other ways," he said, but didn't elaborate on what those other ways would be.

GPT-4, and now Gemini, offer clues.

Last month, at Google's I/O developer conference, CEO Sundar Pichai announced that work on Gemini was underway. He said the company was building it from the ground up to be multimodal (that is, trained on and able to fuse multiple types of data, like images and text) and designed for API integrations (think plugins). Now add in reinforcement learning and perhaps, as Knight speculates, other DeepMind specialties in robotics and neuroscience, and the next step in AI is beginning to look a bit like a high-tech quilt.

But Gemini won't be the first multimodal algorithm. Nor will it be the first to use reinforcement learning or support plugins. OpenAI has integrated all of these into GPT-4 with impressive effect.

If Gemini goes that far, and no further, it may match GPT-4. What's interesting is who's working on the algorithm. Earlier this year, DeepMind joined forces with Google Brain. The latter invented the first transformers in 2017; the former designed AlphaGo and its successors. Mixing DeepMind's reinforcement learning expertise into large language models may yield new abilities.

In addition, Gemini may set a high-water mark in AI without a leap in size.

GPT-4 is believed to have around a trillion parameters, and according to recent rumors, it might be a mixture-of-experts model made up of eight smaller models, each a fine-tuned specialist roughly the size of GPT-3. Neither the size nor the architecture has been confirmed by OpenAI, which, for the first time, did not release specs on its latest model.

Similarly, DeepMind has shown interest in making smaller models that punch above their weight class (Chinchilla), and Google has experimented with mixture-of-experts (GLaM).
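The mixture-of-experts idea mentioned above is easy to illustrate: a gating function scores the experts for each input and only the top-k of them actually run, so a model's total parameter count can grow without a proportional increase in per-token compute. The sketch below is a toy illustration of top-k routing, not GPT-4's or GLaM's actual architecture; all names and numbers are invented:

```python
# Minimal top-k mixture-of-experts routing sketch (illustrative only).
# Real systems use learned gating networks over neural experts; here the
# "experts" are trivial functions and the gate scores are hard-coded.

from typing import Callable, List

def top_k_moe(x: float,
              experts: List[Callable[[float], float]],
              gate_scores: List[float],
              k: int = 2) -> float:
    """Route input x to the k highest-scoring experts; return their weighted output."""
    ranked = sorted(range(len(experts)), key=lambda i: gate_scores[i], reverse=True)
    chosen = ranked[:k]
    total = sum(gate_scores[i] for i in chosen)
    # Only the selected experts execute; their outputs are combined by
    # renormalized gate weights.
    return sum(gate_scores[i] / total * experts[i](x) for i in chosen)

# Eight tiny "experts", echoing the rumored eight-submodel layout.
experts = [lambda x, m=m: m * x for m in range(1, 9)]
scores = [0.05, 0.1, 0.05, 0.4, 0.1, 0.2, 0.05, 0.05]

# For this input, only the two highest-scoring experts run.
print(top_k_moe(2.0, experts, scores, k=2))
```

The key property is that six of the eight experts never execute for a given input, which is how a sparse model can hold far more parameters than a dense model of equal inference cost.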

Gemini may be a bit bigger or smaller than GPT-4, but likely not by much.

Still, we may never learn exactly what makes Gemini tick, as increasingly competitive companies keep the details of their models under wraps. To that end, testing advanced models for ability and controllability as they're built will become more important, work that Hassabis suggested is also critical for safety. He also said Google might open models like Gemini to outside researchers for evaluation.

"I would love to see academia have early access to these frontier models," he said.

Whether Gemini matches or exceeds GPT-4 remains to be seen. As architectures become more complicated, gains may be less automatic. Still, it seems a fusion of data and approaches (text with images and other inputs, large language models with reinforcement learning models, the patching together of smaller models into a larger whole) may be what Altman had in mind when he said we'd make AI better in ways other than raw size.

Hassabis was vague on an exact timeline. If he meant training wouldnt be complete for a number of months, it could be a while before Gemini launches. A trained model is no longer the end point. OpenAI spent months rigorously testing and fine-tuning GPT-4 in the raw before its ultimate release. Google may be even more cautious.

But Google DeepMind is under pressure to deliver a product that sets the bar in AI, so it wouldn't be surprising to see Gemini later this year or early next. If that's the case, and if Gemini lives up to its billing (both big question marks), Google could, at least for the moment, reclaim the spotlight from OpenAI.

Image Credit: Hossein Nasr / Unsplash

See more here:
Here's Why Google DeepMind's Gemini Algorithm Could Be Next-Level AI - Singularity Hub


DeepMind scientists demonstrate the value of using the "veil of ignorance" to craft ethical principles for AI systems – PsyPost

New research by scientists at Google DeepMind provides evidence that the veil of ignorance can be a valuable concept to consider when crafting governance principles for artificial intelligence (AI). The researchers found that having people make decisions behind the veil of ignorance encouraged fairness-based reasoning and led to the prioritization of helping the least advantaged.

The new findings appear in the Proceedings of the National Academy of Sciences (PNAS).

The veil of ignorance is a concept introduced by the philosopher John Rawls. It is a hypothetical situation where decision-makers must make choices without knowing their own personal characteristics or circumstances. By placing themselves behind this veil, individuals are encouraged to make impartial decisions that consider the interests of all parties involved. The authors of the new research wanted to see if applying the veil of ignorance could help identify fair principles for AI that would be acceptable on a society-wide basis.

"As researchers in a company developing AI, we see how values are implicitly baked into novel technologies. If these technologies are going to affect many people from across the world, it's critical from an ethical point of view that the values which govern these technologies are chosen in a fair and legitimate way," explained study co-authors Laura Weidinger, Kevin R. McKee, and Iason Gabriel, who are all research scientists at DeepMind, in a joint statement to PsyPost.

"We wanted to contribute a part of the overall puzzle of how to make this happen. In particular, our aim was to help people deliberate more impartially about the values that govern AI. Could the veil of ignorance help augment deliberative processes such that diverse groups of people have a framework to reason about what values may be the most fair?

"Can a tool of this kind encourage people to agree upon certain values for AI? Iason Gabriel had previously hypothesized that the veil of ignorance might produce these effects, and we wanted to put these ideas to the (empirical) test," the researchers explained.

The researchers conducted a series of five studies with 2,508 participants in total. Each study followed a similar procedure, with some minor variations. Participants in the studies completed a computer-based harvesting task involving an AI assistant. Participants were informed that the harvesting task involved a group of three other individuals (who were actually computer bots) and one AI assistant.

Each participant was randomly assigned to one of four fields within the group, which varied in terms of harvesting productivity. Some positions were severely disadvantaged, with a low expected harvest, while others were more advantaged, with a high expected harvest.

The participants were asked to choose between two principles: a prioritarian principle, in which the AI sought to help the worst-off individuals, and a maximization principle, in which the AI sought to maximize overall benefits. Participants were randomly assigned to one of two conditions: they either made their choice of principle behind the veil of ignorance (without knowing their own position or how they would be affected) or they had full knowledge of their position and its impact.
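The two allocation principles participants chose between can be made concrete with a short sketch. This is an illustration of the decision rules as described in the article, not the authors' experimental code; the field names and harvest values are invented:

```python
# Illustrative sketch of the two AI-assistant policies from the study:
# a prioritarian assistant helps the worst-off player, while a
# maximizing assistant helps the player whose field yields the most.

def prioritarian_choice(expected_harvests: dict) -> str:
    """Assist the player with the lowest expected harvest (help the worst off)."""
    return min(expected_harvests, key=expected_harvests.get)

def maximization_choice(expected_harvests: dict) -> str:
    """Assist the player whose field maximizes overall benefit."""
    return max(expected_harvests, key=expected_harvests.get)

# Hypothetical group of four fields with unequal productivity.
fields = {"player_a": 3, "player_b": 12, "player_c": 7, "player_d": 9}

print(prioritarian_choice(fields))   # player_a (severely disadvantaged field)
print(maximization_choice(fields))   # player_b (most productive field)
```

Behind the veil of ignorance, a participant does not know which of these fields will be theirs, which is precisely what the study found pushes choices toward the prioritarian rule.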

The researchers found that participants in the veil of ignorance condition were more likely to choose the prioritarian principle over the maximization principle. This effect was consistent across all five studies and even when participants knew they were playing with computer bots instead of other humans. Additionally, participants who made their choices behind the veil of ignorance were more likely to endorse their choices when reflecting on them later, compared to participants in the control condition.

The researchers also investigated factors that influenced decision-making behind the veil of ignorance. They found that considerations of fairness played a significant role in participants' choices, even when other factors like risk preferences were taken into account. Participants frequently mentioned fairness when explaining their choices.

"In our study, when people looked at the question of how AI systems should behave from an impartial point of view, rather than thinking about what is best for themselves, they more often preferred principles that focus on helping those who are less well off," the researchers told PsyPost. "If we extend these findings into decisions we have to make about AI today, we may say that prioritizing building systems that help those who are most disadvantaged in society is a good starting point."

Interestingly, participants' political affiliation did not significantly influence their choice of principles. In other words, whether someone identified as conservative, liberal, or belonging to any other political group did not strongly impact their decision-making process or their support for the chosen principles. This finding suggests that the veil of ignorance mechanism can transcend political biases.

"It was really interesting to see that political preferences did not substantially account for the values that people preferred for AI from behind a veil of ignorance," the researchers explained. No matter where participants were on the political spectrum, when reasoning from behind the veil of ignorance these beliefs made little difference to what participants deemed fair, with political orientation not determining what principle participants ultimately settled on.

Overall, the study suggests that using the veil of ignorance can help identify fair principles for AI governance. It provides a way to consider different perspectives and prioritize the interests of the worst-off individuals in society. But the study, like all research, includes some limitations.

"Our study is only a part of the puzzle," the researchers said. "The overarching question of how to implement fair and inclusive processes to decide on values to encode into our technologies is still left open.

"Our study also asked about AI under very specific circumstances: in a distributional setting, specifically applying AI to a harvesting scenario where it acted as an assistant. It would be great to see how results may change when we change this to other specific AI applications. Finally, previous studies on the veil of ignorance were run in India, Sweden, and the USA; it would be good to see whether the results from our experiment replicate or vary across a wide range of regions and cultures, too."

The study, "Using the Veil of Ignorance to align AI systems with principles of justice," was authored by Laura Weidinger, Kevin R. McKee, Richard Everett, Saffron Huang, Tina O. Zhu, Martin J. Chadwick, Christopher Summerfield, and Iason Gabriel.

More here:
DeepMind scientists demonstrate the value of using the "veil of ignorance" to craft ethical principles for AI systems - PsyPost
