
The Revolution of German Farmers | Eva Vlaardingerbroek & Anthony Lee | EP 416 – The Daily Wire

The Jordan B. Peterson Podcast, Jan 22, 2024

Dr. Jordan B. Peterson speaks with German farmer Anthony Lee and journalist Eva Vlaardingerbroek. They discuss the ongoing farmers' protest, the war on efficient agriculture, what is now being panned as the failed German state, the ludicrous net-zero goals creating excess electric vehicles while cutting off the generation of power, and how a grassroots movement can make genuine change at the local and national levels.

Eva Vlaardingerbroek is a Dutch journalist and former YouTube host of the Let's Talk About It program on the channel Riks. Vlaardingerbroek has published opinion articles in newspapers such as the Dutch weekly Elsevier Weekblad and appeared on programs like Tucker Carlson's show on Fox News. She advocates for a cultural return to faith and a rejection of the globalist ideology manifested by the WEF.

Anthony Lee is a German farmer who has become outspoken in recent months as he followed news of farmers struggling all across Europe. Now that the crisis has reached his home country, he believes the time has come to take a stand, not just for German farmers and workers, but for everyone.

See the article here:
The Revolution of German Farmers | Eva Vlaardingerbroek & Anthony Lee | EP 416 - The Daily Wire

Read More..

Decentralization: The Future Of Clean Energy OpEd – Eurasia Review

Today is the first-ever International Day of Clean Energy. This important day, January 26th, 2024, ordained by the United Nations General Assembly (UNGA), is intended to encourage the transition to clean energy and ultimately inspire a transformation of the world as we know it. However, as we mark this milestone on the calendar, it is clear that the world is failing at this task. The global community remains mired in unclean energy, which creates emissions that warm the Earth. Because of the complacency surrounding this issue, the world has fallen behind on the UN's Sustainable Development Goal 7 (SDG7): affordable and clean energy for all by 2030. While the global community is lagging in its efforts, there is still hope. By implementing decentralized energy infrastructure, such as solar and wind systems, the possibility of achieving SDG7 by 2030 still exists.

Decentralized energy, also known as an autonomous energy grid (AEG), generates energy near the point of consumption and eliminates the energy lost in transport. With centralized energy, by contrast, energy use can take place up to 300 miles (480 km) from production, squandering up to five percent of produced energy. From the late 1800s, beginning with Nikola Tesla's implementation of the alternating current, through the 2000s, centralized energy was the most efficient and cost-effective way to produce energy for as many people as possible. This efficiency came from the idea that bigger is better, so having one large plant or station for a large geographic area made it easy to maintain and monitor. However, as populations grew and spread out geographically, efficiency decreased. Energy that is produced but never consumed still generates carbon emissions. Technology in the late 1800s was limited, making decentralized energy unobtainable. Today, however, decentralized energy has become an efficient and attainable alternative. In our world today, bigger is not always better.

So, how does decentralized energy production help reach the UN's clean energy goals? Clean energy technology is well suited to small-scale implementation. By deploying clean energy sources like solar panels and wind turbines on a decentralized basis, efficiency approaches one hundred percent with zero emissions, because the energy lost in distribution is eliminated.

Many countries in the Global North will vehemently oppose the transition from centralized energy because of their deeply ingrained, well-established systems, which would require enormous cost and effort to uproot. Conversely, the Global South, which has less developed energy systems, is well positioned to transition to decentralized systems to satisfy the energy needs of all citizens, especially rural ones who are not connected to the grid. Ultimately, to be successful, governments cannot be solely responsible, particularly in the Global South. Private organizations and NGOs can play a vital role by building trust through demonstrations of successful decentralized clean energy systems.

In Morocco, the High Atlas Foundation (HAF), in conjunction with Germanwatch, successfully implemented a decentralized solar energy system in the province of Youssoufia, specifically El Kdirat village in the rural Jnane Bouih commune. The residents reached out to HAF because they saw that their village and land held the potential to flourish. However, they faced socio-economic challenges as a rural and low-income community. Coming together, the community followed HAF's Imagine Program to determine their highest-priority needs. The El Kdirat village determined that a decentralized system would be most useful in powering irrigation for a tree nursery and providing running water in their school for drinking and bathrooms. The residents followed HAF's participatory labor model and successfully constructed a solar array feeding a pump system connected to the tree nursery's irrigation lines and the school's water pipes. This decentralized solar system has given the El Kdirat village true sustainability that has bolstered the community and improved the village's economy.

As a whole, Africa has the potential to change the climate surrounding decentralized energy while helping rural Africans overcome energy insecurity. Rural Africans comprise 60-80 percent of the African population; by turning to decentralized clean energy, African countries can become global leaders in clean energy. According to Statista and the World Bank, Africa has the world's highest solar potential, making this leadership possible. While Africa is spearheading this new economic energy model, the Global North must commit to transitioning away from the centralized systems it has relied on for decades to reach the attainable goal of affordable and clean energy for all.

Read more:

Decentralization: The Future Of Clean Energy OpEd - Eurasia Review

Read More..

EXCLUSIVE: Charting The Course – Crypto’s Delicate Dance Between Decentralization And Regulation By Benzinga – Investing.com UK

Benzinga - by Abbey Higginbotham, Benzinga Staff Writer.

The Crypto & Blockchain Outlook in 2024 event hosted by Benzinga spotlighted the critical issue facing the crypto world today: balancing blockchain's inherent decentralization with the emerging need for regulatory frameworks. This balance is increasingly vital as the cryptocurrency sector continues to evolve.

The Impact Of Politics On Crypto

Alex Chizhik, COO of the Chamber of Digital Commerce, brought into focus the influence of political dynamics on cryptocurrency's regulatory path. He stressed the importance of political leadership in shaping the industry's future. "The direction of cryptocurrency regulation is significantly influenced by political dynamics," Chizhik observed, highlighting policy-making's role in the crypto landscape.

A Global View On Cryptocurrency Regulation

Joey Garcia, COO of Xapo Bank, then expanded the discussion globally. He discussed the diverse regulatory approaches adopted worldwide and stressed the importance of creating adaptable rules. "Countries worldwide are trying to understand and navigate this new ecosystem," Garcia noted, emphasizing the need for regulations that align with the unique characteristics of digital assets.

Also Read: Bitcoin Pressure Eases As Profit-Taking Party Winds Down, JPMorgan Examines GBTC

Perspectives On Innovation And Regulation

Shifting the focus to the challenges faced by innovators, Stefan Russo, CEO of Truflation, expressed concerns about the impact of the current regulations on technological progress. "The regulatory framework, as it stands, might stifle innovation," Russo warned, highlighting the potential negative consequences of overregulation.

Europe's Approach To Crypto Regulation

Jesper Toft, founder of the GJU Protocol, offered a critique of the European Union's regulatory strategy. "The EU is applying outdated frameworks to a new technology," Toft argued, stressing the need for regulations that fully grasp the essence and potential of blockchain and cryptocurrencies.


Now Read: EXCLUSIVE: SEC's Green Light To Bitcoin ETF - Is 2024 A New Era For Cryptocurrencies?

Images: Growtika/Unsplash

© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.

Read the original article on Benzinga

See the article here:

EXCLUSIVE: Charting The Course - Crypto's Delicate Dance Between Decentralization And Regulation By Benzinga - Investing.com UK

Read More..

Ethereum's decentralized mantra in tatters after execution client bug – CoinGeek

Ethereum's claims of decentralization are ringing ever more hollow due to the network's lack of client software diversity, an overreliance that could pose an existential threat to ETH stakers.

Last weekend, around 8% of Ethereum proof-of-stake (PoS) transaction validators suddenly began producing invalid blocks due to a critical flaw in the Nethermind client software. The issue followed the release of Nethermind's v1.23.0 update, requiring a frantic patch to get these network nodes back in business.

While order was eventually restored, a similar scenario befell the Besu client earlier this month. Besu's share of the Ethereum execution client market was around 5% at the time and has since fallen to 4%.

Ethereum watchers soon warned that the fallout could be catastrophic if/when a similar situation impacted the Geth (Go Ethereum) client. Geth accounted for around 84% of network execution clients at the time of the Nethermind bug, but a concerted effort to sound the eggs-in-one-basket alarm has since pushed this down to a mere 79%.

Geth is client software developed by the Ethereum Foundation that supports network functions like transaction validation and smart contract execution. Geth is generally considered a more robust option than its rivals, which, along with the Foundation's stamp of approval, is why it accounts for such an outsized market share.

Regardless, such a high concentration level would be problematic in any sector. But in an industry so prone to probing by malicious actors, particularly those adept at software supply chain attacks, it's a recipe for disaster.

A Geth failure would bring Ethereum finalization to a screeching halt, but as Labrys developer Lachlan Feeney recently suggested, this vulnerability could also result in the loss of the majority of the roughly 29 million ETH currently staked by validators.

Because validators are penalized for being offline, an inactivity leak could rob validators of two months' worth of staking rewards in just two days. Should the downtime extend to five days, a year's worth of rewards would be lost. A weeklong outage could cost validators 10% of their staked ETH, while 90% of that stake could be lost if the outage extends to six weeks. While such a prolonged outage may be unlikely, it's also not impossible.
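To see how those figures compound, here is a minimal back-of-the-envelope simulation of a quadratic inactivity leak. The constants are assumptions loosely modeled on Ethereum's consensus specification (the epoch length, score bias, and penalty quotient are not taken from the article), so treat the output as an order-of-magnitude illustration rather than an exact penalty schedule:

```python
# Toy model of Ethereum's quadratic inactivity leak for a fully offline validator.
# Constants below are assumptions loosely based on the consensus spec.
EPOCHS_PER_DAY = 225                   # ~6.4-minute epochs
INACTIVITY_SCORE_BIAS = 4              # score added per missed epoch during a leak
INACTIVITY_PENALTY_QUOTIENT = 2 ** 24  # divisor that smooths the per-epoch penalty

def leaked_fraction(days_offline: float, stake_gwei: int = 32 * 10**9) -> float:
    """Approximate fraction of stake lost after `days_offline` of a leak."""
    balance, score = stake_gwei, 0
    for _ in range(int(days_offline * EPOCHS_PER_DAY)):
        score += INACTIVITY_SCORE_BIAS
        balance -= balance * score // (INACTIVITY_SCORE_BIAS * INACTIVITY_PENALTY_QUOTIENT)
    return 1 - balance / stake_gwei

for days in (2, 5, 7, 42):
    print(f"{days:>3} days offline -> ~{leaked_fraction(days):.1%} of stake leaked")
```

Run as-is, the sketch lands close to the article's numbers: roughly 0.6% after two days (about two months of rewards at 3.5% p.a.), a few percent after five days to a week, and well over 90% after six weeks, reflecting the quadratic growth of the penalty.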

Geth-based validators would likely stampede for the exits rather than watch their stake bleed away to nothing. But they'd get caught in a logjam of similarly minded validators, all of whom would continue bleeding ETH as they wait their turn to disembark this sinking ship. Given the sheer number of Geth-based validators, this bottleneck could mean that only one in 12 could exit with more than 50% of their stake.

Fork me

The potential repercussions of a serious Geth bug don't stop there. Should a Geth-based validator produce an invalid block, Geth's domination of the network could result in this invalid block being added to the chain, creating a fork that would quickly become the dominant chain. Geth validators would be blocked from the valid chain until the smaller chain is finalized.

As Feeney put it, "Because the Geth validators are stuck on the invalid chain, they are considered inactive on the non-Geth chain and will suffer the inactivity leak. No software update or bug patch to Geth will save these validators. They will be bled out until their stake represents less than one-third of the network, allowing the non-Geth chain to finalize."

Feeney estimates that this bleeding could result in an 18% reduction in the total supply of ETH. (That's definitely one method of creating artificial scarcity and thus boosting the token's fiat value.)
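The 18% figure can be roughly reconstructed from the numbers quoted in this article (an 84% Geth share of roughly 29 million staked ETH) plus one assumption not stated here, a total ETH supply of about 120 million:

```python
# Rough reconstruction of Feeney's ~18% estimate. The total-supply figure is an
# assumption; the staked-ETH and client-share numbers come from the article.
TOTAL_SUPPLY = 120_000_000   # assumed circulating ETH
STAKED = 29_000_000          # ETH staked by validators (article figure)
GETH_SHARE = 0.84            # Geth's share of execution clients (article figure)

geth_stake = STAKED * GETH_SHARE        # ~24.4M ETH stuck on the invalid chain
non_geth_stake = STAKED - geth_stake    # ~4.6M ETH on the valid chain

# The leak continues until Geth validators hold less than one-third of the valid
# chain's stake, i.e. until non-Geth validators reach the two-thirds finality threshold.
geth_stake_after_leak = non_geth_stake / 2
burned = geth_stake - geth_stake_after_leak

print(f"ETH leaked away: ~{burned / 1e6:.1f}M, "
      f"or ~{burned / TOTAL_SUPPLY:.0%} of an assumed {TOTAL_SUPPLY / 1e6:.0f}M total supply")
```

That works out to roughly 22 million ETH leaked, or about 18% of the assumed supply, matching Feeney's estimate.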

Feeney offered this warning: "Staked ETH is not risk-free yield. Would you invest a minimum of $75,000 USD [the rough value of the 32 ETH required for serving as a validator] into an instrument where the maximum potential gain is 3.5% p.a. but the potential for loss is 100% (even if that loss is unlikely)? Probably not, but this is what 84% of the Ethereum stakers are doing today."

Lido shuffle

Retail users lacking the 32 ETH necessary to stake on their own have several options for pooling their resources, but staking via a service won't necessarily protect them from the potential carnage described above.

Lido Finance, the largest staking service with around 9.4 million ETH staked, relies on Geth for most of its operations. On January 23, Lido stated that its preliminary Q4/23 data put Geth usage across Lido protocol validators at 67%, down from 76% in Q3/23 and 93% in Q3/22. Lido said client diversity is fundamental to its mission to decentralize Ethereum.

Lido added that its Lido DAO (decentralized autonomous organization) node operators are afforded high degrees of autonomy, but they have already begun to signal their commitments to reduce majority client usage, or explain how their setups avoid the possibility of being affected by supermajority bugs.

Coinbase has a plan to make a plan

The Coinbase (NASDAQ: COIN) exchange has faced similar queries this week about its reliance on Geth. On January 22, CEO Brian Armstrong personally responded to a Coinbase customer who tweeted that they had unstaked "all of the ETH I had staked with you since you offered it as a service." The user added that this single-client staking setup made it "not worth the risk of losing a large percentage of my deposit." Armstrong replied: "Taking a look."

Later that day, the Coinbase Cloud account tweeted that when it launched its ETH staking service, Geth was the only client that "met our technical requirements." Coinbase claimed that execution client diversity "is a critical concern," and thus it was conducting an updated technical assessment "with the goal of adding another execution client to our infrastructure." Coinbase promised to provide an update on its progress by the end of February.

Not everyone found this reassuring, with at least one customer suggesting that the situation "wasn't a review and plan kind of phase. This is take serious and urgent action, with informed customers phase." Coinbase was urged to set up an insurance fund or "allow us to opt in to other clients" because the risk of supermajority failure "is a *bigger* risk to your customers than minority client failure."

This probably wasn't the best week for Coinbase to run a sponsored post on Decrypt promoting the claim that its staking service "aims to be a one-stop shop for crypto staking." One stop, one point of failure ... Synergy!

Coinbase rival exchanges Binance, Kraken, and Bitfinex (among others) also rely solely on Geth to power their staking services. However, they have kept quiet regarding any plans they may have to inject a little diversity into their operations.

All in one

Decentralization theater has been Ethereum's stock-in-trade from its inception, starting with the Ethereum Foundation's oversight-bereft crowd sale that delivered the majority of ETH into the hands of a few whales. This concentration of wealth and power continues to this day via the PoS consensus mechanism that further enriches the whales who can afford to run multiple validators.

This centralization was recently cited by the U.S. Securities and Exchange Commission (SEC) as part of the reason it was delaying decisions on applications to offer Ethereum spot-based exchange-traded funds (ETFs). BlackRock, Grayscale Investments, and Fidelity are among those chomping at the bit to offer Ethereum ETFs to the public following the launch of multiple BTC spot-based ETFs earlier this month.

In approving those BTC ETFs, SEC chairman Gary Gensler stressed that the decision was limited to one non-security commodity (BTC) and should in no way signal the Commission's willingness to approve listing standards for crypto asset securities. Gensler, who has previously stated his belief that ETH is an unregistered security, reiterated his "don't expect ETH ETF approvals anytime soon" message earlier this week.

On Thursday, the SEC posted a list of ETH-related questions for public comment before making any ETF decision. For instance, the SEC wonders if there are "particular features related to ETH and its ecosystem, including its proof of stake consensus mechanism and concentration of control or influence by a few individuals or entities, that raise unique concerns about ETH's susceptibility to fraud and manipulation."

Interested parties are instructed to submit their comments within three weeks. Just remember, Ethereum Foundation members: you have to at least try to make it look like they're not all coming from the same address.

Follow CoinGeek's Crypto Crime Cartel series, which delves into the stream of groups, from BitMEX to Binance, Bitcoin.com, Blockstream, ShapeShift, Coinbase, Ripple, Ethereum, FTX and Tether, who have co-opted the digital asset revolution and turned the industry into a minefield for naïve (and even experienced) players in the market.

New to blockchain? Check out CoinGeek's Blockchain for Beginners section, the ultimate resource guide to learn more about blockchain technology.

Visit link:

Ethereum's decentralized mantra in tatters after execution client bug - CoinGeek

Read More..

This competitor to Solana (SOL) is poised for a 50x surge upon its launch – AMBCrypto News

When it comes to cryptocurrencies and blockchain technology, competition is fierce. Solana (SOL), renowned for its speed and scalability, has been a dominant force in the blockchain space. But a strong contender has emerged, aiming to challenge Solana's supremacy: Pandoshi.

Pandoshi is a blockchain project that has garnered significant attention and anticipation in the crypto community. Often referred to as the Solana Competitor, Pandoshi has set its sights on offering a unique and innovative ecosystem that promises to rival and potentially surpass Solana's achievements.

At its core, Pandoshi is driven by a set of fundamental principles that define its identity. It champions decentralization, advocates for privacy, and promotes monetary freedom, the very principles for which blockchain technology was conceived. Pandoshi stands as a paragon of decentralization: a community-driven initiative where power over its present and future lies firmly in the hands of its people.

Pandoshi's journey began with a highly anticipated presale. This phase marked the inception of Pandoshi, where the development of smart contracts paved the way for the sale of its native utility token, PAMBO. The presale was a crucial step in securing the resources needed to propel Pandoshi forward.

As of writing, Pandoshi is in Phase 4 of its presale, with PAMBO tokens priced at $0.008. This phase has seen significant investor interest, with over $639,269 raised out of the targeted $2,400,000. This translates to a sold percentage of 27%, signaling a strong and growing community backing for the project.

At the core of the Pandoshi ecosystem lies its native utility token, PAMBO. PAMBO serves as the lifeblood of the network, facilitating various functions and transactions within the ecosystem. Much like Solana's SOL, PAMBO is more than just a token; it embodies the spirit of decentralization, privacy, and monetary freedom, the principles upon which blockchain technology was founded.

One of the standout features of PAMBO is its deflationary nature, supported by a robust buy-and-burn mechanism. This mechanism involves the deliberate removal of PAMBO tokens from circulation, reducing the overall supply. The burning process continues until 80% of the total supply is removed, a strategy designed to instill scarcity and potentially drive up the token's value.

PandaChain is a crucial component of the Pandoshi ecosystem. It is a PoS (Proof-of-Stake) Layer-2 blockchain solution designed to be a cost-effective blockchain infrastructure for the Pandoshi community. The goal is to increase the burn rate of PAMBO, effectively reducing the circulating supply of the token. This reduction in supply has the potential to create scarcity and drive demand for PAMBO tokens.

PandaChain doesn't just focus on cost-efficiency; it also embraces innovation. The blockchain incorporates technologies like the PolyBFT consensus mechanism, StateSync, and Checkpoints for bridging, all of which enhance the robustness and scalability of the ecosystem. Additionally, PandaChain supports ERC standards like ERC-20, ERC-721, and ERC-1155 for token creation, providing flexibility for developers and users.

PandoshiSwap is a decentralized crypto exchange (DEX) that allows users to trade tokens directly without intermediaries. This peer-to-peer trading approach is more private and secure compared to centralized exchanges (CEXs). PandoshiSwap distinguishes itself from some competitors by committing to not arbitrarily ban assets and avoiding the integration of optional KYC codes.

PandoshiSwap not only supports multiple chains but also features an internal bridge for asset transfers across different blockchains. With a transaction fee of 0.3%, of which 70% is allocated to liquidity providers and 30% to the buy and burn of PAMBO tokens, PandoshiSwap incentivizes liquidity provision while reducing the token's supply.
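As a rough illustration of the fee split described above (the function and the example trade size are hypothetical; only the 0.3% fee and the 70/30 split come from the article):

```python
# Hypothetical sketch of the swap-fee split described in the article.
def split_swap_fee(trade_value: float, fee_rate: float = 0.003,
                   lp_share: float = 0.70, burn_share: float = 0.30):
    """Return (liquidity-provider payout, buy-and-burn budget) for one trade."""
    fee = trade_value * fee_rate
    return fee * lp_share, fee * burn_share

lp_cut, burn_cut = split_swap_fee(10_000)  # a $10,000 trade pays a $30 fee
print(f"LP payout: ${lp_cut:.2f}, buy-and-burn budget: ${burn_cut:.2f}")
```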

Security and privacy are paramount in any blockchain ecosystem. Pandoshi Wallet sets itself apart by adopting a non-custodial and highly secure approach. What makes it unique is its commitment to user privacy: Pandoshi Wallet does not collect any user data, so there is no personal data to lose in the first place.

Pandoshi has marked a significant achievement with the launch of the Pandoshi Wallet's beta version on the Google Play Store, a development it proudly announced on Twitter. This launch, coinciding with the ongoing presale, represents a crucial advancement for the project, especially in supporting Ethereum Virtual Machine (EVM)-compatible chains, with plans to add non-EVM chains in the future. An iOS version of the wallet is also in the works, aiming to expand its accessibility to a wider audience.

The introduction of the Pandoshi Wallet on the Google Play Store has played a key role in enhancing the project's reputation in the market, helping to dispel doubts and boost investor confidence in its commitment to decentralized finance (DeFi). This move is in line with Pandoshi's dedication to open-source development and a governance model led by its community, appealing to investors who prioritize privacy and decentralization. The wallet's availability has led to an increase in investor participation in the presale, as many are eager to take advantage of this promising opportunity.

Pandoshi is not merely a competitor to Solana; it is a formidable challenger with a vision and ecosystem poised for significant growth. With a successful presale and strong community backing, Pandoshi's prospects are promising. The project embodies the principles of decentralization, privacy, and monetary freedom, setting the stage for a potential 50x surge upon its launch.

As the crypto space continues to evolve, Pandoshi stands as a beacon of innovation and opportunity. Keep a close watch on this rising star, as it may very well be the next big contender in the blockchain arena.

Click Here To Take Part In Pandoshi Presale

Visit the links below for more information about Pandoshi (PAMBO):

Website: https://pandoshi.com/ Whitepaper: https://docs.pandoshi.com/

Disclaimer: This is a paid post and should not be treated as news/advice.

See the rest here:

This competitor to Solana (SOL) is poised for a 50x surge upon its launch - AMBCrypto News

Read More..

Exclusive: Samsung to launch Petabyte SSD subscription PBSSD-as-a-service is definitely not your usual cloud … – TechRadar

Samsung has announced it is exploring a new business model that's likely to get a lot of attention from partners and rivals alike.

PBSSD as a service is what the company calls a high-capacity SSD subscription service that, it says, goes beyond capacity limits. So it is neither a cloud storage service nor a cloud backup solution, at least not for now.

In a blog post on the company's website, Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung Electronics, disclosed that it is envisioned as "a business structure where customers use services instead of purchasing a server configured with SSDs." That sounds a lot like what other vendors like Pure Storage are offering.

"As a high-capacity SSD subscription service," Bae continues, "it is expected to contribute to lowering the initial investment cost of customers' storage infrastructure as well as maintenance costs by providing customers with a petabyte-scale box that functions as memory expansion."

The Petabyte SSD architecture was unveiled back in August 2023 and aimed back then to provide a petabyte-scale, ultra-high-capacity solution offering high scalability by varying the capacity depending on the application.

A few days ago, we learnt that Solidigm, one of Samsung's rivals, was selling its 61.44TB SSD for around $60 per TB, which would put the price of 1PB at approximately $60,000 (although you'd need to add the cost of the server etc).

Flash is expensive, and what Samsung is trying to do is offer a way for those looking for superfast storage to reduce their capital expenditure. Whether or not Samsung will sell these as barebones or with an additional layer of software and services (courtesy of third party partners) remains to be seen.

What we do know though is that this is not a 1PB SSD, instead it is a box that contains several SSDs (probably four of the 256TB SSDs it revealed during Flash Memory Summit 2023). If you want the real deal, then you will have to wait a bit longer. In March 2023, VP and General Manager of NAND Product Planning Group, Kyungryun Kim, revealed the company wanted to launch a 1PB (1000TB) SSD in the next decade.

We don't know when it will be released, but it will be interesting to see how it compares to Pure Storage's DFM (Direct Flash Module), currently at 75TB capacities and likely to deliver 300TB in 2026. A Pure Storage FlashBlade//E AFA storage system packs 55 DFMs to deliver 4PB of storage in a 6U rack. That's now. In two years, that's going up to 16PB, or about 2.5PB per 1U, and Samsung knows that.

And just as a comparison, provisioning 1PB of local SSD space from one of the hyperscalers (e.g. Google Cloud) costs a cool $43,000 per month when taking on a three-year commitment.
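A quick back-of-the-envelope comparison of the figures quoted in this piece; the prices are the article's, and the arithmetic (which ignores servers, power and operations) is purely illustrative:

```python
# Rough cost and density comparison using the figures quoted above.
drive_cost_per_pb = 60 * 1000        # ~$60/TB Solidigm pricing -> ~$60,000 per PB (drives only)
cloud_cost_per_pb_month = 43_000     # ~1PB of hyperscaler local SSD, 3-year commitment

break_even_months = drive_cost_per_pb / cloud_cost_per_pb_month
print(f"Raw flash pays for itself vs. rented local SSD in ~{break_even_months:.1f} months")

# Pure Storage density quoted above: 55 x 75TB modules = ~4PB in a 6U chassis today.
print(f"FlashBlade//E density today: ~{55 * 75 / 1000 / 6:.2f} PB per rack unit")
print(f"Projected density in 2026:   ~{16 / 6:.2f} PB per rack unit")
```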

Excerpt from:
Exclusive: Samsung to launch Petabyte SSD subscription PBSSD-as-a-service is definitely not your usual cloud ... - TechRadar

Read More..

Icedrive review: Slickly implemented online storage with tiers for everyone – PCWorld

At a glance: Expert's Rating

Though not the cheapest service available, Icedrive is certainly one of the most attractive, easiest, and slickest services to use. As pure storage it's as good as it gets, though we miss online file editing.

$6 per month for 1TB

There are a lot of choices in online storage these days, so users can pick and choose the easiest and most affordable. Icedrive competes well on both fronts, proving to be one of our favorites to use, though that includes some features currently still in beta. Still, color us impressed.

Further reading: See our roundup of the best online backup services to learn about competing products.

As with any online storage vendor, Icedrive's primary feature is providing an offsite repository for your files. However, it also sports collaboration features such as file sharing, public links, and file requests (others asking to see your files).

Icedrive's client-side software includes an optional virtual drive I: (Icedrive, I:, get it?) that acts as a local portal to your online files, as well as an encrypted folder (free 10GB plan excepted) that utilizes a secondary password or passphrase that you define. Icedrive doesn't have access to this passphrase, so don't lose or forget it.

iOS and Android apps are also provided so you can back up your phone or other mobile devices. My one caveat here is the lack of a Sync tab as on the Windows client. Instead, backups (it's not traditional sync) are defined under the settings tab, the last place I looked.

Speaking of such, in addition to the I: drive, Icedrive's Windows client offers two-way (mirroring) as well as one-way (local to online, and online to local) sync options. You can choose whether or not to mirror deletions during two-way sync, i.e., if you delete the online file, the local copy still remains; if you delete the local file, the online copy still remains. A nice data-safety feature.

File versioning can serve as a kind of ad hoc backup: Older files are kept just in case. Icedrive does it better than most. Instead of moving the older file to a visible sub-directory, or renaming the older file and leaving it in plain view (this can get messy with a lot of versions), it retains the files out of sight.

Simply right-click on the file for the context menu, drill down to the Icedrive sub-menu (Windows 11: Show more options) if necessary, select Version history, and you'll see a list of older versions which you may then download.

Note that versioning only appeared on the local I: drive with the version 3 beta software. Also note that simply creating a file didn't count as a version; only upon editing and saving was a version created. Beta.

Icedrive offers an online document preview feature that handles many common types. However, my more recent Office files wouldn't open, so I found the feature only useful for plain-vanilla PDF, JPEG, etc.

No editing of said documents is available, so if you're looking to work online, Icedrive is likely not your cup of tea. At least for now. That said, Icedrive doesn't make claims in this regard. As simple storage with easy access, it's aces.

Icedrive is the easiest online storage service to get up to speed with that I've tested so far. That's largely because the client software is simple, straightforward, and doesn't misidentify sync operations as backup like some, such as pCloud.

Note that my opinion takes into consideration the beta features which aren't available to all users yet. Soon, hopefully.

As mentioned, Icedrive creates a virtual Windows drive like the previously reviewed pCloud. This feature for macOS and Linux is scheduled for release later in 2024. Icedrive sent me the macOS beta, which relies on the public-domain macFUSE, a separate download. OpenDrive also relies on this macOS extension, which seems to work well.

To access your encrypted folder from the local I: drive, you must enter the passphrase online, then open the local client and, under the Mount tab, choose Crypto Lock and enter the passphrase. Simple, and after that, it's all transparent.

It's important to remember that the I: drive files exist only online and changes made to them are permanent.

Note that you can set the online storage as read-only if you want to be sure files dont get mucked up.

Icedrive has an option to suit just about anyone's budget, though the most affordable plans require a five-year commitment. The currently discounted five-year plans are 1TB/$189 ($3.15 monthly), 3TB/$399 ($6.65 monthly/$2.21 monthly per TB), and 10TB/$999 ($16.65 monthly/$1.66 monthly per TB).

There are also monthly and yearly plans. The former are $6 for 1TB, $12 for 3TB, and $30 for 10TB. Annual prices are: 1TB is $59 yearly ($4.91 monthly), 3TB is $120 ($3.33 monthly per TB), and 10TB is $299 ($2.49 monthly per TB).
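For readers comparing tiers, here is a small script that reproduces (to within rounding) the per-terabyte figures listed above from the plan prices:

```python
# Effective monthly cost per TB for the Icedrive plans listed above.
plans = {
    "monthly, 1TB":  (6,    1,  1),   # (price in $, capacity in TB, months covered)
    "monthly, 3TB":  (12,   3,  1),
    "monthly, 10TB": (30,  10,  1),
    "annual, 1TB":   (59,   1, 12),
    "annual, 3TB":   (120,  3, 12),
    "annual, 10TB":  (299, 10, 12),
    "5-year, 1TB":   (189,  1, 60),
    "5-year, 3TB":   (399,  3, 60),
    "5-year, 10TB":  (999, 10, 60),
}

for name, (price, tb, months) in plans.items():
    print(f"{name:<14} ${price / months / tb:.2f} per TB per month")
```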

All in all, Icedrive isn't the cheapest option out there, but it is certainly competitive and, as mentioned, there's a free 10GB plan (without the encrypted folder) so you can kick the tires.

When it comes to ease of use, Icedrive is as good as it gets, and the pricing is quite competitive. When all the beta features are in place, it will be in a league of its own in terms of interface. Definitely worth a look-see.

Editor's note: Because online services are often iterative, gaining new features and performance improvements over time, this review is subject to change in order to accurately reflect the current state of the service. Any changes to text or our final review verdict will be noted at the top of this article.

Here is the original post:
Icedrive review: Slickly implemented online storage with tiers for everyone - PCWorld

Read More..

Global Cloud Monitoring Market Analysis Report 2023-2030 – Market Set to Reach USD 9.37 Billion by 2030, with … – Yahoo Finance


Dublin, Jan. 24, 2024 (GLOBE NEWSWIRE) -- The "Global Cloud Monitoring Market Size, Share & Trends Analysis Report by Type (Cloud Storage Monitoring, Database Monitoring, Website Monitoring), Service Model, Enterprise Size, Industry Vertical, Region, and Segment Forecasts, 2023-2030" report has been added to ResearchAndMarkets.com's offering.

The global cloud monitoring market size is anticipated to reach USD 9.37 billion by 2030. The market is expected to grow at a CAGR of 20.9% from 2023 to 2030. Factors such as the growing complexity of cloud environments, the increasing size of datasets, and the need for better visibility and control over cloud resources are expected to drive market growth. Moreover, the growing adoption of hybrid and multi-cloud environments presents significant growth opportunities for the market.
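As a quick sanity check of the headline numbers, compounding the report's 2022 base value (USD 2.08 billion, listed in the key attributes below) at the stated 20.9% CAGR lands close to the USD 9.37 billion forecast; the small gap comes down to rounding and base-year convention:

```python
# Compound-growth check of the report's headline figures.
base_2022 = 2.08          # USD billions, estimated 2022 market value (from the report)
cagr = 0.209              # stated compound annual growth rate
years = 2030 - 2022

projected_2030 = base_2022 * (1 + cagr) ** years
print(f"Projected 2030 value: ~${projected_2030:.2f}B (report forecast: $9.37B)")
```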

Cloud computing brought a major transformation to the IT industry, which led to new challenges in managing and monitoring cloud environments. As businesses increasingly adopt cloud-based applications and infrastructure, effective cloud monitoring solutions are becoming increasingly important. Cloud computing offers several advantages to businesses of all sizes, including scalability, flexibility, cost-effectiveness, and access to a wide range of services and applications. However, this shift to cloud computing has also introduced new complexities in managing and monitoring IT infrastructure. The need for effective cloud monitoring solutions has become more crucial as critical applications and data are being entrusted to the cloud.

Many businesses operate in multi-cloud or hybrid cloud environments, combining various cloud providers or integrating with on-premises infrastructure. Cloud monitoring solutions offer an integrated view across these diverse environments, simplifying management and monitoring. Since 2020, most IT leaders have planned to distribute workloads across multiple clouds, leading to the rise of multi-cloud systems in the cloud monitoring market.

Cloud Monitoring Market Report Highlights


By type, the cloud storage monitoring segment accounted for the largest revenue share of 28.3% in 2022 and is anticipated to witness significant growth during the forecast period. Monitoring solutions for storage allow businesses to track resource usage, identify underutilized or overused resources, and optimize accordingly. It helps control costs and maximize the value of cloud investments.

In terms of service model, the IaaS segment is estimated to grow at the fastest CAGR of 22.6% over the forecast period. Growth in demand for low-cost IT infrastructure and quicker data accessibility fuels the growth of IaaS. The adoption of cloud computing in various industries is also one of the key factors driving demand for IaaS, as it offers fast data accessibility regardless of the data center's location.

In terms of enterprise size, the SME segment held the largest revenue share of 71.6% in 2022. Utilizing cloud monitoring tools enables SMEs to compete effectively by providing better services, maintaining high uptime, and delivering consistent performance to customers.

In terms of industry verticals, the BFSI segment is estimated to grow at a significant CAGR of 21.6% over the forecast period. To fulfill their security and risk management responsibilities, financial institutions are increasingly implementing monitoring controls that help them avoid relying on historical assessments. These monitoring controls include using dashboards and logging capabilities offered by cloud service providers (CSPs) and compatible solutions. Such controls help monitor operational performance and security threats effectively, leading to the growth of cloud monitoring tools used in BFSI institutions.

North America dominated the market with a share of 60.7% in 2022 and is anticipated to dominate the market over the forecast period. The ongoing evolution of IT infrastructure and adoption of hybrid and multi-cloud strategies in the region requires monitoring solutions capable of managing diverse cloud data smoothly, which fuels the demand for monitoring tools.

Key Attributes:

No. of Pages: 130

Forecast Period: 2022 - 2030

Estimated Market Value (USD) in 2022: $2.08 Billion

Forecasted Market Value (USD) by 2030: $9.37 Billion

Compound Annual Growth Rate: 20.9%

Regions Covered: Global

Key Topics Covered:

Chapter 1. Methodology and Scope

Chapter 2. Executive Summary

Chapter 3. Cloud Monitoring Market Variables, Trends & Scope
3.1. Market Lineage Outlook
3.2. Industry Value Chain Analysis
3.3. Market Dynamics
3.4. Cloud Monitoring Market Analysis Tools
3.4.1. Industry Analysis - Porter's Five Forces
3.4.2. PESTEL analysis

Chapter 4. Cloud Monitoring Market: Type Estimates & Trend Analysis
4.1. Cloud Monitoring Market: Key Takeaways
4.2. Cloud Monitoring Market: Movement & Market Share Analysis, 2022 & 2030
4.3. Cloud Storage Monitoring
4.4. Database Monitoring
4.5. Website Monitoring
4.6. Virtual Network Monitoring
4.7. Virtual Machine Monitoring

Chapter 5. Cloud Monitoring Market: Service Model Estimates & Trend Analysis
5.1. Cloud Monitoring Market: Key Takeaways
5.2. Cloud Monitoring Market: Movement & Market Share Analysis, 2022 & 2030
5.3. SaaS
5.4. IaaS
5.5. PaaS

Chapter 6. Cloud Monitoring Market: Enterprise Size Estimates & Trend Analysis
6.1. Cloud Monitoring Market: Key Takeaways
6.2. Cloud Monitoring Market: Movement & Market Share Analysis, 2022 & 2030
6.3. Large Enterprises
6.4. SMEs

Chapter 7. Cloud Monitoring Market: Industry Vertical Estimates & Trend Analysis
7.1. Cloud Monitoring Market: Key Takeaways
7.2. Cloud Monitoring Market: Movement & Market Share Analysis, 2022 & 2030
7.3. BFSI
7.4. IT & Telecom
7.5. Healthcare
7.6. Government
7.7. Retail & Consumer Goods
7.8. Manufacturing

Chapter 8. Cloud Monitoring Market: Regional Estimates & Trend Analysis

Chapter 9. Competitive Landscape
9.1. Recent Developments & Impact Analysis, By Key Market Participants
9.2. Market Participant Categorization

Amazon Web Services, Inc.

Microsoft

Alphabet Inc. (Google Cloud)

Cisco Systems, Inc.

Oracle

International Business Machines Corp.

Datadog

Dynatrace LLC.

New Relic, Inc.

LogicMonitor Inc.

Splunk Inc.

AppDynamics

Zenoss Inc.

SolarWinds Worldwide, LLC.

Sumo Logic

For more information about this report visit https://www.researchandmarkets.com/r/6h057

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.


See the article here:
Global Cloud Monitoring Market Analysis Report 2023-2030 - Market Set to Reach USD 9.37 Billion by 2030, with ... - Yahoo Finance

Read More..

Taiwan connects its first home-grown quantum computer to the internet – The Register

Taiwanese research institute Academia Sinica has connected a home-brew quantum computer to the internet.

A January 19 announcement of the connection reveals that the machine has five qubits and is available as a test bed for the university's project collaborators, with other researchers able to use it as a development platform for their own efforts using the machine's ultra-low temperature CMOS and parametric amplifiers. Collaborators include the University of California, Santa Barbara, and the University of Wisconsin-Madison, so this machine's success may assist US quantum development efforts.

The machine has already had an upgrade from three to five qubits, the announcement states, adding that qubit logic gate fidelity was measured at 99.9 percent, which suggests the computer is nicely stable.
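To put a 99.9 percent gate fidelity in perspective, a simplified estimate (assuming each gate independently fails 0.1 percent of the time, which glosses over real error correlations) shows how quickly circuit-level reliability decays with gate count, and why error correction remains essential:

```python
# Simplified illustration: with 99.9% gate fidelity and independent errors,
# the chance an entire circuit runs error-free is fidelity ** gate_count.
fidelity = 0.999
for gates in (100, 1_000, 10_000):
    print(f"{gates:>6} gates -> ~{fidelity ** gates:.1%} chance of an error-free run")
```

That works out to roughly 90 percent for a 100-gate circuit, about 37 percent at 1,000 gates, and essentially zero at 10,000 gates.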

Details of the machine's operating environment are absent in the University's statement, which focused on its strategic significance as a sign of progress in the drive to develop the island's quantum capabilities.

Taiwan has, of course, famously come to lead the world in semiconductor manufacturing. Framers of the island's aggressive industry policy will not have missed the likelihood that the rise of quantum systems may make its silicon prowess less relevant.

Indeed, local media yesterday reported that Taiwan's Semiconductor Research Institute has gone shopping for a five-qubit machine from Finland's IQM, manufacturer of a machine called the Spark which matches that spec.

The Spark is billed as "An affordably priced 5-qubit superconducting quantum computer, professionally designed and calibrated as a turnkey solution." That makes it sound a little more mature and easy to deploy than Academia Sinica's effort which, as illustrated below, has a few rough edges.

Academia Sinica's quantum computer

Original post:
Taiwan connects its first home-grown quantum computer to the internet - The Register

Read More..

Alice & Bob Advance Quantum Computing with Fewer Qubits Needed for Error Correction – HPCwire

PARIS, Jan. 23, 2024 -- Alice & Bob, a leading hardware developer in the race to fault-tolerant quantum computers, in collaboration with the research institute Inria, today announced a new quantum error correction architecture, low-density parity-check (LDPC) codes on cat qubits, to reduce hardware requirements for useful quantum computers.

The theoretical work, available on arXiv, advances previous research on LDPC codes by enabling the implementation of gates as well as the use of short-range connectivity on quantum chips. The resulting reduction in overhead required for quantum error correction will allow the operation of 100 high-fidelity logical qubits (with an error rate of 10⁻⁸) with as little as 1,500 physical cat qubits.

"Over 90% of quantum computing value depends on strong error correction, which is currently many years away from meaningful computations," said Jean-François Bobier, Partner and Director at the Boston Consulting Group. "By improving correction by an order of magnitude, Alice & Bob's combined innovations could deliver industry-relevant logical qubits on hardware technology that is mature today."

"This new architecture using LDPC codes and cat qubits could run Shor's algorithm with less than 100,000 physical qubits, a 200-fold improvement over competing approaches' 20 million qubit requirement," said Théau Peronnin, CEO of Alice & Bob. "Our approach makes quantum computers more realistic in terms of time, cost and energy consumption, demonstrating our continued commitment to advancing the path to impactful quantum computing with error corrected, logical qubits."

Cat qubits alone already enable logical qubit designs that require significantly fewer qubits, thanks to their inherent protection from bit-flip errors. In a previous paper by Alice & Bob and CEA, researchers demonstrated how it would be possible to run Shor's algorithm with 350,000 cat qubits, a 60-fold improvement over the state of the art.

LDPC codes are a class of efficient error correction codes that reduce the hardware needed to correct errors arising when information is transferred and stored. By using LDPC codes on a cat-qubit architecture, this latest work not only shows how the qubit footprint of a fault-tolerant quantum computer could be further reduced but also overcomes two key challenges for the implementation of quantum LDPC (qLDPC) codes.

Alice & Bob recently announced the tape out of a chip that would encode their first logical qubit prototype, known as Helium 1. When logical qubits with a sufficiently low error rate are implemented and using the cat qubit LDPC code technique, Alice & Bob would be capable of harnessing the computing power of 100 logical qubits with as little as 1,500 physical qubits, to run fault-tolerant algorithms.
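The headline ratios in this announcement follow directly from the figures quoted above; the short check below uses only numbers stated in the release (the comparison against Alice & Bob's own earlier 350,000-qubit design is derived, not stated):

```python
# Checking the overhead ratios implied by the announcement's figures.
physical, logical = 1_500, 100
print(f"Overhead: {physical / logical:.0f} physical cat qubits per logical qubit")

shor_cat, shor_competing = 100_000, 20_000_000
print(f"Shor's algorithm: {shor_competing / shor_cat:.0f}x fewer qubits than the 20M-qubit estimate")

previous_cat_design = 350_000
print(f"vs. the earlier 350k cat-qubit design: ~{previous_cat_design / shor_cat:.1f}x fewer qubits")
```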

With leading superconducting quantum computing manufacturers like IBM offering up to 1,121 physical qubits, outperforming classical computers in the simulation of quantum systems (quantum supremacy) is a milestone that would become attainable within current hardware capabilities using Alice & Bob's new architecture.

In previously proposed qLDPC code implementations, most notably IBM's paper last year, long-range qubit connectivity and high-weight stabilizers were required, which represents a daunting technical challenge. In contrast, Alice & Bob's combined approach of cat qubits with classical LDPC codes allows the use of short-range, local qubit interactions and low-weight stabilizers.

This simpler architecture enables for the first time the implementation of a fault-tolerant set of parallelizable logical gates without additional hardware complexity. Allowing for logical gates is a necessary step for the implementation of quantum algorithms and practical quantum computing altogether.

About Alice & Bob

Alice & Bob is a start-up based in Paris and Boston whose goal is to realize the first universal, fault-tolerant quantum computer. Founded in 2020, Alice & Bob has already raised €30M in funding, hired over 80 employees, and demonstrated experimental results surpassing those of technological giants like Google or IBM. Alice & Bob specializes in cat qubits, a technology pioneered by the company's founders and later adopted by Amazon. Demonstrating the power of its cat architecture, Alice & Bob recently showed it could reduce the hardware requirements for building a large-scale useful quantum computer by up to 200 times compared to competing approaches.

Source: Alice & Bob

Read more:
Alice & Bob Advance Quantum Computing with Fewer Qubits Needed for Error Correction - HPCwire

Read More..