
Top Trader Says One Under-the-Radar Altcoin Set To Emerge As Winner of Terra (LUNA) Collapse – The Daily Hodl

A closely tracked trader is naming one altcoin that he says can come out as the big winner following the crumbling of Terra (LUNA) and its algorithmic stablecoin TerraUSD (UST).

Pseudonymous trader Light tells his 160,600 Twitter followers that Tron (TRX) and its algorithmic stablecoin Decentralized USD (USDD) are poised to fill the vacuum left by Terra's demise.

TRX has displayed enormous relative strength, again at its March highs in an environment where 99% of alts are down 50% or more, driven by Tron's recent foray into the algorithmic stable space and a 30% yield on USDD. In bear markets, winners win.

Tron, a cryptocurrency originally designed to serve as a decentralized storage and distribution platform for social media and digital entertainment content, has recently pivoted to mirror the relationship between LUNA and UST prior to their collapse.

Says Tron founder Justin Sun,

USDD will be pegged to the underlying asset, TRX, and issued in a decentralized manner. When USDD's price is lower than one US dollar, users and arbitrageurs can send one USDD to the system and receive one US dollar worth of TRX.

When USDD's price is higher than one US dollar, users and arbitrageurs can send one US dollar worth of TRX to the decentralized system and receive one USDD. Regardless of market volatility, the USDD protocol will keep USDD stable at 1:1 against the US dollar via proper algorithms in a decentralized manner.
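
The mechanism Sun describes is a two-sided arbitrage loop rather than collateral redemption. As a rough sketch of the decision a trader faces (this is not Tron's actual contract logic; fees, slippage and execution risk are ignored):

```python
def usdd_arbitrage_action(usdd_price: float, trx_price_usd: float):
    """Illustrative only: which side of the USDD mint/redeem loop pays off.

    Not the USDD protocol's real implementation; it simply encodes the
    arbitrage described in the quote above, ignoring fees and slippage.
    """
    if usdd_price < 1.0:
        # Below the peg: buy 1 USDD cheaply, send it to the system,
        # receive one US dollar's worth of TRX, keep the difference.
        trx_received = 1.0 / trx_price_usd
        return ("redeem USDD for TRX", trx_received, round(1.0 - usdd_price, 6))
    if usdd_price > 1.0:
        # Above the peg: send one US dollar's worth of TRX to mint 1 USDD,
        # then sell that USDD on the market for more than a dollar.
        trx_sent = 1.0 / trx_price_usd
        return ("mint USDD with TRX", trx_sent, round(usdd_price - 1.0, 6))
    return ("hold", 0.0, 0.0)


print(usdd_arbitrage_action(usdd_price=0.98, trx_price_usd=0.08))
# -> ('redeem USDD for TRX', 12.5, 0.02)
```

This reflexive reliance on TRX to absorb USDD's volatility is also why Light's LUNA comparison matters.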

Light says the current state of TRX and USDD presents opportunities to traders and investors considering that the algorithmic stablecoin has a market cap of just $534 million.

[TRX plus USDD] is in its fairly early stages, with only insiders in. There is room for latecomers and eventually even retail to enter before it carries the same risks as LUNA. Timing is everything in musical chairs. USDD market cap is at only 2.5% of UST's peak.

The crypto trader also mentions a tweet from Justin Sun, where the Tron founder says that USDD has a backstop of $10 billion sitting at the Tron DAO Reserve.

According to Light, traders have now begun the process of sending their TRX to the system to mint USDD.

After the market-wide capitulation, USDD minting restarted this weekend, catalyzing a sharp decline in centralized exchange balances of TRX as it is withdrawn to mint USDD.

The popular trader concludes by saying that TRX has a history of outperforming other crypto assets during bear markets.

Tron has a history of pumping during periods of poor market conditions, where a few names soak up the lion's share of speculative flows. The risk/reward of betting on a resurgent Justin Sun, who has seen opportunity in the vacuum left by Do Kwon, is set favorably.

At time of writing, Tron is trading at $0.08, up over 14% from its seven-day low of $0.07.

Featured Image: Shutterstock/Natalia Siiatovskaia/Art Furnace


The Altcoin Market is Getting Redefined with these Powerful Coins: Polygon, Chainlink, and Mushe Token – Deccan Herald

Regardless of your familiarity with cryptocurrencies, you must be acquainted with Ethereum (ETH). It is so popular that its market cap accounts for 20% of the market. The Ethereum (ETH) network often becomes overcrowded, resulting in delayed transactions and increased network fees. This has prompted most investors to seek superior alternatives. Polygon (MATIC), Chainlink (LINK), and Mushe Token (XMU) are three highly sought-after cryptocurrencies that are altering the nature of blockchain technology. Let's investigate them.

Polygon (MATIC)

Polygon (MATIC) is a blockchain network that aims to address Ethereum's (ETH) congestion problems. As a layer 2 blockchain, it can help decentralised application developers avoid very high fees. Its native token, MATIC, is used for network payments.

The popularity of NFTs on the Polygon (MATIC) network has increased since the beginning of the year, with transaction volume up by more than 6,000 percent.

Polygon (MATIC) is ranked seventeenth on CoinMarketCap and was trading at $0.66 at the time of writing. Its very low cost makes it an accessible competitor to more expensive cryptocurrencies such as Solana (SOL) and Ethereum (ETH).

Chainlink (LINK)

Chainlink (LINK) is a decentralised Ethereum-based blockchain oracle network. The network is designed to ease the movement of tamper-proof data from off-chain sources to on-chain smart contracts.
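
To make that concrete, here is a minimal sketch of reading one of Chainlink's on-chain price feeds with web3.py. The RPC endpoint and feed address are placeholders (substitute a real node URL and a checksummed AggregatorV3Interface feed address), and the ABI below is trimmed to the two read-only calls used:

```python
from web3 import Web3

# Minimal slice of Chainlink's AggregatorV3Interface ABI: decimals() and latestRoundData().
AGGREGATOR_ABI = [
    {"name": "decimals", "inputs": [], "outputs": [{"name": "", "type": "uint8"}],
     "stateMutability": "view", "type": "function"},
    {"name": "latestRoundData", "inputs": [],
     "outputs": [{"name": "roundId", "type": "uint80"},
                 {"name": "answer", "type": "int256"},
                 {"name": "startedAt", "type": "uint256"},
                 {"name": "updatedAt", "type": "uint256"},
                 {"name": "answeredInRound", "type": "uint80"}],
     "stateMutability": "view", "type": "function"},
]

w3 = Web3(Web3.HTTPProvider("https://your-ethereum-node.example"))  # placeholder RPC endpoint
FEED_ADDRESS = "0x0000000000000000000000000000000000000000"  # placeholder: real, checksummed feed address

feed = w3.eth.contract(address=FEED_ADDRESS, abi=AGGREGATOR_ABI)
decimals = feed.functions.decimals().call()
round_id, answer, started_at, updated_at, answered_in_round = feed.functions.latestRoundData().call()

print(f"latest oracle answer: {answer / 10 ** decimals} (updated at {updated_at})")
```

On-chain smart contracts consume the same feeds through the same interface, which is the tamper-proof off-chain-to-on-chain flow described above.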

Chainlink announced that Bela Supernova won a grant from Chainlink and Filecoin. Bela Supernova is presently creating a programme that transfers medical data from the widely used hospital management system Better HMS to Filecoin.

It will securely store private health data on the Filecoin (FIL) network and make public health statistics available through the Chainlink (LINK) Network.

Chainlink and Filecoin have created a collaborative award programme to assist in the development of projects on both networks. The LINK token is used within the Chainlink ecosystem, which facilitates the integration of off-chain data into blockchains. The coin is destined for future growth and should be included in your long-term portfolio.

Mushe Token (XMU)

Mushe Token (XMU) intends to build a full ecosystem that supports fiat and cryptocurrency payments using blockchain technology. Mushe Token also intends to enable near-instant transactions at ultra-low costs, letting anybody engage in its ecosystem and access fundamental financial services such as money transfers and banking at a low cost.

To do this, Mushe Token (XMU) is building a suite of native products, including an integrated cryptocurrency wallet aptly named Mushe Wallet, and its own decentralised exchange (DEX) called Mushe Swap.

These features will be available through MusheVerse, a digital platform that will allow users to transact in any cryptocurrency of their choosing, anywhere in the world.

MusheVerse will also provide users access to various other financial services, ranging from personal banking to investment planning.

The pre-sale of XMU, Mushe's native utility token, is now underway, and the Mushe team has set a target price of $0.50 for its mid-July public exchange listing. Let's see whether they reach this objective!

The addition of the three tokens described above, MATIC, LINK, and XMU, might provide enormous profits in the future. The Mushe Token (XMU) has positioned itself strategically as a coin for a diversified ecosystem. If you can amass large quantities, it might offer you an advantage. Always develop your portfolio using the dollar-cost averaging strategy, and never invest more than you can afford to lose. You may also track Mushe Token's development using the links provided below.

Mushe Token (XMU) Presale: https://portal.mushe.world/sign-in

Website: https://www.mushe.world/


How to pick the right cloud storage – VentureBeat


Let's be honest, storage has always been a complicated subject. Teams of dedicated storage administrators would choose between block (SAN), file (NAS) or direct-attached (DAS), and then each choice led to more details such as HDD versus SSD. The cloud is supposed to make everything simple with storage-as-a-service. While the details of infrastructure have now been passed on to the magic of the cloud, there are still many choices that have a notable impact on performance, costs and scale.

Unlike buying an on-premises storage array where customers typically had to make compromises around the best storage for a workload and then live with that decision for up to five years, the rate of innovation and offerings in the cloud are nonstop.

On-premises storage decision-making was a big container that had to support many different workloads, often with serious tradeoffs. The cloud now lets you pick the best storage for every workload and even subsets of those workloads. Need ultra-low latency storage for your database? Done! Need low-cost storage for long-term retention of rarely-accessed data? The cloud has the storage for you!

Taking full advantage of cloud storage requires IT managers to stop thinking in terms of storage volumes and pivot to thinking about the data. By segmenting datasets, you can pick the right resources and, more importantly, pivot data to a better cloud resource as your cloud vendor introduces new services or if you realize your initial choice is under- or over-utilized. Let's look at the options:

Object storage is built for the cloud. It boasts unlimited scale with a global namespace, which is akin to a universal file directory that makes it seem as if all unstructured data distributed across devices and locations is in a single location. Accessible over the HTTP protocol rather than file protocols like NFS and SMB, object is perfect for web-scale access of unstructured data. Object storage is presented to applications through a URL, and storage tasks such as read, write and delete are accessed through simple commands that make it easy to consume by applications.
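
As a quick illustration of that "simple commands" point, here is a hedged sketch of the write/read/delete lifecycle against an S3-style object store using boto3; the bucket and key names are placeholders:

```python
import boto3

s3 = boto3.client("s3")  # credentials and region come from the usual AWS config chain

bucket, key = "example-analytics-bucket", "raw/2022/05/events.json"  # placeholder names

# Write: upload an object under a URL-like key.
s3.put_object(Bucket=bucket, Key=key, Body=b'{"event": "signup"}')

# Read: fetch the object back over HTTPS.
body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
print(body)

# Delete: remove the object when it is no longer needed.
s3.delete_object(Bucket=bucket, Key=key)
```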

Also known as Network Attached Storage or NAS, file storage is presented via the popular NFS and SMB protocols for unstructured data. File is often the choice for existing applications versus new, born-in-the-cloud apps. File will typically boast higher performance and lower latency versus object, with the tradeoff of limited capacity both in terms of number of files and size of volumes. File access is optimized for local or corporate networks versus the global namespace that object offers.

Object and file are abstractions on top of storage resources that can increase scale and simplicity, whereas block storage is the equivalent of a local hard disk or direct attached storage. When implemented over a network, block storage is referred to as a storage area network (SAN). Block storage provides the lowest latency and highest performance because it is dedicated to a single application or server without an abstraction layer.

Storage is a space for rapid innovation, and the choices are getting more nuanced all the time.

The true innovation of the cloud is that all these options are just a few clicks away and you can change storage as your requirements change. The key is to understand those requirements by analyzing your data both before and after you move to the cloud. Make the best decision for your workloads and data based on current usage and then monitor over time with an eye for new offerings. By optimizing your data for performance, durability and costs over the available resources you can innovate faster and save money.

Steve Pruchniewski is Director of Product Marketing at Komprise.



How to free up iCloud storage when space is low – Android Authority

Compared to other cloud storage platforms, Apple only offers a feeble 5GB of free space on iCloud. Unless you're willing to open your wallet and spend cash on upgrading your iCloud account, you will have to get creative about saving space. Here are some quick tips on how to free up iCloud storage.


QUICK ANSWER

To free up space on iCloud, first, uninstall any unused apps. Turn off any apps that don't need iCloud sync, and go through each iCloud folder, deleting the files you don't need, and moving the ones you want to keep to other storage solutions.


Uninstall unused apps

A lot of apps sync their settings to iCloud, both as a backup and to make life easier if you use that same app on another Apple device. So the first step to freeing up iCloud space is to delete any unneeded apps. Go through each one in turn and ask yourself if you really need it. If you like to install lots of apps, you may be surprised at how many unnecessary ones you can uninstall.

Now go into the iOS Settings app and tap your name at the top. On the next screen, tap iCloud.

On the next screen, you will see Apple features and other apps which sync their data and settings to iCloud. This is shown by the green toggle. Go down the list, and if anything doesn't need to be synced, turn the toggle off to grey.

You can also switch off Apple features like syncing your photo library, and automatic backups of your phone (although this is not recommended for obvious reasons). You can also see a breakdown at the top of what is hogging the most space.

Now it's time to delete all of the unnecessary backups. On the same iCloud screen, tap the option at the top called Manage Storage.

Each entry in the list shows the iCloud data stored for a particular app. If you tap one, say Backups, you will see the backups stored in your iCloud account. In my case, there was an old backup from an iPad I no longer own, so I tapped it to delete it.

The next screen will give you a Delete Backup button. Tapping that wipes it from your iCloud account.

Once you have removed any unnecessary apps and backups, it's now time to go through each iCloud folder and delete any unnecessary files.

On iOS, this includes the Files app and the Photos app.

On MacOS, you need to check the Photos app, iCloud Drive, and the Shared folder.

When you're down to the essential files that you can't delete under any circumstances, you have a few choices.


You can recover recently deleted files, but you need to go to iCloud.com in a web browser. Once you've logged in, go to iCloud Drive, and click Recently Deleted. Click Recover all to bring back everything, or select individual files and click Recover.


ServiceNow ordered a year’s worth of hardware to avoid supply chain hassles – The Register

The tech world's pandemic supply chain meltdown drove ServiceNow to place orders for a year's worth of datacenter kit in January 2022, believing that doing so was necessary to get the hardware it needed to cope with growing customer workloads.

"Pre-COVID, I could generally get stuff in 45 days," CTO Pat Casey told The Register at ServiceNow's Knowledge 22 conference in Sydney, Australia, today.

Well-publicized coronavirus-related supply challenges caused ServiceNow's lead time for some networking kit to stretch to 160 days, while servers can take 120 days to arrive.

So the company "literally placed our entire 2022 order in January," he explained.

"We did it to get in line with the supply chain. If we order it now, hardware starts landing in Q3. If I order in Q3 2022 to meet hardware demand for Q4 2022, I will get the product in Q3 2023."

ServiceNow can't afford to wait that long because the biz hosts clients on its own infrastructure; Casey finds it cheaper to do so. Startups and small companies, he said, rightly balk at paying for datacenter engineers to run their own operations. ServiceNow has reached a scale at which it can afford an infrastructure staff to manage the 200,000 or so instances it runs.

Casey also feels that Amazon Web Services offers "a generic cloud." ServiceNow prefers hardware tuned to the needs of its application, which requires servers loaded up with memory and disk.

"We run one app. I can buy gear optimized for that, which means I can stack it denser; I can often get exactly the stuff I need. The price points are there," Casey elaborated.

The CTO said ServiceNow has found that "middling CPUs" meet its needs. "There is a spectrum of chips: the fast chips with a small number of cores, and the slower you get the more cores they give you for the same amount of power. We are somewhere in the middle: we can't run all that efficiently on the core-happy but fairly slow stuff, but it is not worth it for us to pay a pile of money for something with only four cores on it to get 12 percent faster."
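
Casey's reasoning is essentially throughput-per-dollar arithmetic. The toy comparison below uses invented numbers (not ServiceNow's actual SKUs or prices) just to show why the middle of the range tends to win for a workload that also needs decent per-core speed:

```python
# Invented parts list: few fast cores, a middling option, and a core-happy but slow option.
skus = {
    "4 fast cores":  {"cores": 4,  "per_core_speed": 1.12, "price_usd": 9000},
    "16 middling":   {"cores": 16, "per_core_speed": 1.00, "price_usd": 4500},
    "64 slow cores": {"cores": 64, "per_core_speed": 0.55, "price_usd": 5500},
}

for name, sku in skus.items():
    throughput = sku["cores"] * sku["per_core_speed"]   # crude aggregate-throughput proxy
    per_1k_usd = throughput / sku["price_usd"] * 1000   # throughput per $1,000 spent
    print(f"{name:13s} per-core={sku['per_core_speed']:.2f} "
          f"throughput={throughput:5.1f} per-$1k={per_1k_usd:.2f}")

# With these made-up figures, the 4-core part costs twice as much for roughly 12 percent
# more per-core speed, while the 64-core part is cheap per unit of throughput but too slow
# per core for a latency-sensitive app; the middling SKU balances both.
```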

ServiceNow is an x86 shop, though Casey said the company has considered alternatives including IBM's Power architecture. It has not been convinced to change.

ServiceNow's servers use locally attached NVMe storage, housed on separate cards. Shared storage is used sparingly, and usually in the same rack as the servers it, well, serves.

Casey said ServiceNow was one of the first customers of Fusion-io, the storage upstart that was early to market with flash storage on PCIe cards. "It was life changing for us because it was so much better than the spinning disk arrays we had," Casey enthused. "At one point we were buying ten percent of Fusion-io's annual production. We were their number one customer. We are still a big buyer of NVMe storage."

Yet Casey still sees some hangovers from the days of mechanical hard disks in the world of software.

"A lot of the internals of a database are really designed to work around the behaviors of spinning disk arrays," he told The Register. "On NVMe it is almost not worth it. The double write buffering behaviors you see in a lot of databases, you don't need that on NVMe. They are actually counterproductive."

Casey said ServiceNow is a big user of, and investor in, MariaDB, with around 200,000 instances running. In August 2021, ServiceNow acquired German database vendor Swarm64 in the expectation its Postgres-based tech will enable rapid analytics and potentially be useful for primary storage, too. MonetDB, an open source effort led by folks in the Netherlands, also has a home at ServiceNow. Casey said it is "very, very fast, but also sort of fragile."

"You set it up, you load your column storage fast as a thief, but if you change it, it degrades," he said. "So we have to run two of them in parallel, then swap them."

Casey doesn't see any technology on the horizon that he thinks will have a positive impact comparable to that of NVMe, though he is keeping an eye on Compute Express Link (CXL) without being close to a decision. He's aware of SmartNICs but has no plan to adopt them.

One project that has commenced is the development of a backup tier. ServiceNow wrote its own backup solution and is thinking of using a standardized protocol, namely Amazon Web Services' S3, for that tech.

That's not an indication ServiceNow will adopt AWS storage. Instead, Casey said the company may deploy software that speaks the S3 protocol. ServiceNow would not be alone in doing so: AWS's cloud storage offering is so pervasive that many on-prem storage rigs and applications use its protocol to allow easier access to hybrid cloud storage. For ServiceNow, S3-compatible storage in-house therefore has value.
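
In practice, "speaking the S3 protocol" to an in-house object store usually means pointing a standard S3 client at a private endpoint. The sketch below uses boto3; the endpoint, bucket, key names and credentials are invented placeholders, not ServiceNow's actual setup:

```python
import boto3

# Standard S3 client aimed at an S3-compatible, in-house object store instead of AWS.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objects.backup.internal.example",  # placeholder private endpoint
    aws_access_key_id="PLACEHOLDER_KEY_ID",
    aws_secret_access_key="PLACEHOLDER_SECRET",
)

# Upload a backup artifact exactly as you would to AWS S3.
s3.upload_file("/var/backups/db-2022-05-26.tar.zst", "instance-backups", "prod/db-2022-05-26.tar.zst")

# List what is stored under the prod/ prefix.
for obj in s3.list_objects_v2(Bucket="instance-backups", Prefix="prod/").get("Contents", []):
    print(obj["Key"], obj["Size"])
```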

Casey's job isn't all infrastructure; he also guides development of ServiceNow's products. In that role, he said, an ongoing challenge is the user interface, both to ensure complexity does not become an issue and because end-user demands remain high. People expect the ease of consumer tech experiences replicated at work.


Preql raises $7M to build the future of data transformation – GlobeNewswire

NEW YORK, May 26, 2022 (GLOBE NEWSWIRE) -- Preql, a no-code data transformation solution, announced that it has raised $7 million in seed funding, led by Bessemer Venture Partners with participation from Felicis, and top founders in the analytics ecosystem including Taylor Brown from Fivetran, Keenan Rice from Looker, Tristan Handy from dbt Labs, Eldad Farkash from Firebolt, and Benn Stancil from Mode. Preql's platform allows business users to structure data for reporting without having to write SQL or rely on specialized data talent.

Preql builds upon the innovation of tools like Snowflake and Fivetran, which have made aspects of the analytics workflow accessible to organizations without data engineering resources. The next evolutionary step in the modern data stack is to allow business users to manage their own logic for reporting, something that's not possible today without advanced SQL and data transformation expertise.

Preql's Co-Founders and Co-CEOs, Gabi Steele and Leah Weiss, met while leading data teams at WeWork and went on to found a successful data engineering and visualization consultancy. During their time at WeWork, they experienced a disconnect between business users who need data for decision-making and the data teams who structure and prepare data for analysis. Business users have to pass along definitions to data modeling specialists, who maintain logic in code but lack sufficient business context. Even with exceptional data talent, the result of this handoff is often lack of trust in data, frustrated data teams, and costly data investments without a clear path to ROI.

Preql's funding comes at a moment when companies of all sizes are investing in data: cloud data storage and ingestion tools. The cloud storage market is growing 22.3% each year. Despite these investments in modern infrastructure, few companies have the internal resources required to shape their data for analysis. There's a misconception that simply storing data will help your organization become data-driven. Data storage is necessary, but the hard part is agreeing on what you want to measure, how you want to measure it, and then translating that business logic into code, said Leah Weiss, Co-Founder. Preql gives business users the ability to contextualize their data and customize definitions, but then abstracts away the complex work of data transformation.

Preql's technology sits on top of the data warehouse, predicts the data model required for your business, and then lets business users customize metric definitions. It compiles all of that logic and delivers reporting-ready datasets back in your warehouse, something that previously took months of manual effort from highly skilled data teams. We've seen firsthand the pain business users and data teams experience while building out a central source of truth for reporting, said Gabi Steele, Co-Founder. We are deeply committed to delivering a design-forward and intuitive solution that business users will love and understand, and that more mature data teams are grateful for because it saves them so much back and forth.
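
As a purely hypothetical illustration of the "business logic into code" step Weiss describes, the sketch below compiles a declarative metric definition into a reporting view. The function and definition format are invented for this example; they are not Preql's product or API:

```python
def compile_metric(name, table, expression, dimensions, filters="TRUE"):
    """Hypothetical: turn a declarative metric definition into warehouse SQL."""
    dims = ", ".join(dimensions)
    return (
        f"CREATE OR REPLACE VIEW reporting.{name} AS\n"
        f"SELECT {dims}, {expression} AS {name}\n"
        f"FROM {table}\n"
        f"WHERE {filters}\n"
        f"GROUP BY {dims};"
    )

# A business user's definition: net revenue by month and region, completed orders only.
print(compile_metric(
    name="net_revenue",
    table="raw.orders",
    expression="SUM(amount) - SUM(refund_amount)",
    dimensions=["order_month", "region"],
    filters="status = 'completed'",
))
```

The point the founders make is that the hard part is not this mechanical translation but everything around it: agreeing on what to measure and keeping the resulting datasets trusted as a central source of truth.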

We're excited to partner with Preql to make data capabilities more accessible to organizations and verticals that are currently underserved, said Amit Karp, Partner at Bessemer Venture Partners. We were really impressed with the unique insight the founders bring to this problem and the clarity of their vision. Viviana Faga, General Partner at Felicis, adds, We couldn't be more thrilled to partner with Gabi and Leah, who are on a mission to change the way data is transformed and accessed, better serving the needs of business users at every company.

About Preql
Preql is building automated data transformation for business users. Its technology empowers business users to access analysis-ready data in minutes without having to learn SQL or rely on a data team. Preql is backed by Bessemer Venture Partners, Felicis, and several angel investors. Learn more at preql.com.

About Bessemer Venture Partners
Bessemer Venture Partners helps entrepreneurs lay strong foundations to build and forge long-standing companies. With more than 135 IPOs and 200 portfolio companies in the enterprise, consumer and healthcare spaces, Bessemer supports founders and CEOs from their early days through every stage of growth. Bessemer's global portfolio includes Pinterest, Shopify, Twilio, Yelp, LinkedIn, PagerDuty, DocuSign, Wix, Fiverr and Toast and has $9 billion of capital under management. Bessemer has teams of investors and partners located in Tel Aviv, Silicon Valley, San Francisco, New York, London, Boston, Beijing and Bangalore. Born from innovations in steel more than a century ago, Bessemer's storied history has afforded its partners the opportunity to celebrate and scrutinize its best investment decisions (see Memos) and also learn from its mistakes (see Anti-Portfolio).

About Felicis
Founded in 2006, Felicis is a venture capital firm investing in companies reinventing core markets, as well as those creating frontier technologies. Felicis focuses on early stage investments and currently manages over $2.1B in capital across 8 funds. The firm is an early backer of more than 41 companies valued at $1B+. More than 91 of its portfolio companies have been acquired or gone public, including Adyen (IPO), Credit Karma (acq by Intuit), Cruise (acq by General Motors), Fitbit (IPO), Guardant Health (IPO), Meraki (acq by Cisco), Ring (acq by Amazon), and Shopify (IPO). The firm is based in Menlo Park, CA. Learn more at http://www.felicis.com.

A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/98753d0b-a231-4884-a213-3f20c212a9bf


The Obsessive World of Digital Music Collectors – Pitchfork

When it comes to playing the music, his sprawling setup involves multiple computers, tablets, monitors, speakers, and headphones. His collection stretches back to 2006, though he's got boxes filled with even older data CDs and hard drives, his adolescence encased in digital amber, ready to be unearthed.

Not all collectors are so chaotic. Marty Sartini Garner, a copywriter at the Los Angeles Philharmonic and music critic who has written for Pitchfork, occupies a place somewhere near the other end of the spectrum. He limits his CD rips to one per day, and moves them to an aging iPod Classic that he listens to with his AirPods and a Bluetooth adapter. This more minimal setup allows him to put down his phone, close the laptop, and narrow his focus to whatever he's listening to. He'll download records from Bandcamp, but likes the IRL experience of walking to Fingerprints, his local record shop in Long Beach, California, known for its impressive stock of new and used CDs. He still keeps a vinyl collection and an analog stereo, but his latest move necessitated a pruning of his physical collection.

My personal setup splits the difference between Nappy's maximalist strategy and Garner's more Zen-like approach. I still use the same iTunes library I started in 2003, when I bought my first iPod. And I subscribe to Apple Music, which includes a service called iTunes Match that either matches songs in my library or uploads my own copy to their servers, letting me stream or download any song in my collection from any Apple device associated with my Apple ID. I currently have about 47,000 songs over 700GB, and the library is stored on a pair of hard drives inside my desktop computer that are backed up twice internally: once to a single drive every night, and again to a Time Machine backup that saves hourly incremental backups. Every few months I swap a third backup in and out of a fireproof safe at my parents' house just in case my building burns down. This process may sound like a lot, but in reality, it's mostly self-sustaining: once I add a record to the collection, it's almost immediately available to play in any room of my home, or on any device I own, anywhere with an internet connection.

Every collector's setup is different, and what works for you should be based on what gear you already have, your budget for what you don't, and the effort you're willing to put in. Storage space has never been cheaper: even the most durable 4TB hard drives can be had for less than $100 nowadays. Most of the equipment you'll need (computer, smartphone, stereo, headphones, router) you likely already own. You might even have a dusty CD collection waiting to be ripped, old drives that can be repurposed, or a cloud storage account that can be set up for automatic backups.

You don't need thousands of dollars, a huge home, or an advanced degree to maintain a digital collection. Some collectors, including me, can be intense, but it doesn't have to be that serious to be rewarding. Everyone started with one file, one song, one album. All it takes is time, effort, and a little bit of love.



IDC expects the Indian public cloud services market to grow at a CAGR of 24% by 2026 – Wire19

According to the latest report by International Data Corporation (IDC) Worldwide, the Indian public cloud services market totaled revenue of $4.6 billion in the second half of 2021. IDC expects that by 2026, the overall Indian public cloud services market will reach $13.5 billion, growing at a CAGR of 24%.

With digital innovation leading the top business objectives for Indian organizations, cloud adoption is set to accelerate in 2022. Driven by the need for agility, flexibility, and faster access to digital technologies, cloud continues to gain momentum across segments. Additionally, the need to leverage data intelligently is supreme, and enterprises are able to do so with access to technologies that are built on a cloud foundation, says Rishu Sharma, Associate Research Director, Cloud and Artificial Intelligence, IDC India.

SaaS was the largest component of the overall public cloud services in India in 2021, followed by IaaS and PaaS. Enterprises have increased their spending on the public cloud and the top two cloud service providers are holding over 45% of the public cloud services market in India.

Public cloud adoption continued to surge in 2021 as enterprises invested in public cloud as part of their digital transformation initiatives to improve business resiliency and become a digital-first organization. The increased spend is expected to continue in the upcoming years as enterprises invest in emerging technologies like AI/ML, IoT, blockchain, etc., to automate processes and drive innovation with public cloud as the foundation. The increasing investments in areas like edge computing and IoT will drive the demand for public cloud infrastructure services, especially storage and data management, says Harish Krishnakumar, Senior Market Analyst, IDC India.

India is a rapidly growing market for public cloud service providers. The demand from large enterprises, digital natives, and small and medium businesses drives this growth. Businesses continued to invest in public cloud services in 2021 to make sure they stayed running, became more resilient and productive, and drove digital innovation. With organizations preferring a hybrid work mode in 2022, the demand for cloud-based security applications is increasing. Organizations are migrating their existing workloads to the public cloud, and the demand for cloud-native application development has also increased, all driven by the need to fulfill customer demands faster.



Microsoft urged to do more to address European cloud antitrust complaints – ComputerWeekly.com

The founding member of a coalition of tech firms that accused Microsoft of anti-competitive behaviour, based on how it sells and packages its cloud services in Europe, has claimed the software giant must do more to address the antitrust complaints being levied against it.

Microsoft published a blog post this week acknowledging the antitrust concerns that have been raised with regulators and authorities about its cloud-related business practices in Europe, which it also used to outline a series of meaningful actions it would take to address the issues raised.

As detailed in a report on Reuters last month, these concerns are known to have prompted the European Commission's antitrust authorities to send a questionnaire to Microsoft customers and competitors, asking for their views on Microsoft's cloud-related licensing deals.

The commission has information that Microsoft may be using its potentially dominant position in certain software markets to foreclose competition regarding certain cloud computing services, the questionnaire said, reported Reuters.

This information is based on complaints filed with the European Commission by several European cloud service providers, including German file sync and share software maker NextCloud and French infrastructure-as-a-service (IaaS) provider OVHcloud.

Nextcloud's antitrust complaint, filed in early 2021, takes umbrage at the way Microsoft bundles its OneDrive cloud storage service and online collaboration platform Teams in with its flagship Windows operating system. It claims this practice is aggressively pushing consumers to sign up and hand over their data to Microsoft.

Nextcloud's complaint has since won the support of more than 50 tech firms and non-profit organisations, leading to the formation of a coalition that is collectively speaking out against how Microsoft sells and packages its cloud software in Europe. The company has also filed a complaint against Microsoft of a similar nature with Germany's own antitrust authorities.

In a blog post dated 18 May 2022, Microsoft president and vice-chair Brad Smith said the company was taking meaningful action on the complaints being raised against it, including the adoption of five pledges that it claims will shape its approach to doing business in Europe in years to come.

These pledges include commitments to ensuring its public cloud meets Europe's needs and serves Europe's values, that its platforms are set up to ensure the success of European software developers and that it will provide support for European cloud providers through partnership.

The remaining two pledges made by Microsoft include a commitment to ensure our cloud offerings meet European governments' sovereign needs, in partnership with local trusted technology providers, and a vow to recognise that European governments are regulating technology, and we will adapt to and support these efforts.

According to Microsoft, these pledges mark the start of the work it is doing to address regulatory concerns, and are intended to guide all aspects of our cloud business, enhance transparency for the public, and help us to better support Europe's technology needs.

In addition, the company said it was also taking steps to ensure European cloud providers could more easily host a wider variety of Microsoft products on their cloud infrastructure.

It added: This will make European cloud providers more competitive by enabling them to better serve customers.

While these actions are broad, they are also not necessarily exhaustive, continued Smith. As I said in a video meeting a few weeks ago with the CEO of a European cloud provider, our immediate goal is to turn a long list of issues into a shorter list of issues.

In other words, let's move rapidly so we can learn quickly. Today we're taking a big step, but not necessarily the last step we will need to take, and we look forward to continuing feedback from European cloud providers, customers and regulators, he added.

Speaking to Computer Weekly, Nextcloud CEO Frank Karlitschek said the actions Microsoft was committing to take were indicative of the pressure it is feeling in the wake of the complaints, but there is still more the company should be looking to do.

The main issue here is that we have a super-dominant position from Microsoft. [It is] really dominating this whole market and this is not healthy, he said. That's not healthy for the open market, that's not healthy for privacy and it's not healthy for digital sovereignty for Europe. We want the regulators to do something against it to make sure there's fair competition and a level playing field.

In terms of the follow-up action Nextcloud and the coalition would like to see Microsoft take, Karlitschek said a commitment from the company to make parts of its cloud stack open source would be a start.

Across Europe, you have this movement towards digital sovereignty, where governments want to be in control of their data and applications. So, if you are a government or a company and you use Microsoft or Google or Amazon's service, even if it's hosted in Europe, that's still under US jurisdiction because of the CLOUD Act, he said.

This is what they're trying to solve here by giving other cloud providers the option to host this Microsoft stack, but obviously this is not enough, because you still have a dependency on Microsoft because Microsoft is not open source.

He continued: Digital sovereignty would only come with open source software. What it has proposed so far is interesting and is a move in the right direction, in response to the pressure it is under, but this is not enough.

Data from IT market watcher Synergy Research Group in September 2021 shed some light on the impact the US tech giants' growing hold on the European market was having on the fortunes of local cloud providers.

While the market itself has grown nearly fourfold since 2017 to a value of $8.8bn, European cloud providers have seen their share of the market fall from 27% to 16% during that same time period, although the revenue these firms make has doubled during that time.

Computer Weekly also contacted OVHcloud for its take on Microsoft's plans, given it has also raised an antitrust complaint against the company with regulators in the past, and received the following statement in response.

Microsoft acknowledges the merits of our complaint and we can only regret that it has to go as far as mobilising the relevant authorities to secure a level playing field in Europe, where competition is both open and fair, said the statement.

We are now waiting to see the concrete implementation conditions of these resolutions and remain committed to defending a level playing field for the European cloud ecosystem.


Verizon, AWS expand edge computing to more metro areas – ComputerWeekly.com

Verizon has announced that it is now offering, with AWS, 5G mobile edge computing (MEC) in more US metro areas, with the addition of Nashville, Tennessee and Tampa, Florida.

Through a partnership that began in August 2020, the companies can now provide mobile edge computing via AWS Wavelength Zones in 19 locations in the US, which means 75% of the US population is now within 150 miles of a Wavelength Zone.

The Verizon and AWS edge computing collaboration began with the launch of Verizon 5G Edge with AWS Wavelength. AWS Wavelength extends AWS compute and storage services to the edge of Verizon's public mobile network and provides access to cloud services running in an AWS region, thereby minimising the latency and network hops required to connect from a 5G device to an application hosted on AWS.
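
For developers, Wavelength Zones appear through the ordinary AWS APIs as extra zones attached to a parent region that must be opted into before use. A minimal hedged sketch of discovering them with boto3 (the region name is just an example):

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # example parent region

# List Wavelength Zones visible to this account, including ones not yet opted into.
response = ec2.describe_availability_zones(
    AllAvailabilityZones=True,
    Filters=[{"Name": "zone-type", "Values": ["wavelength-zone"]}],
)

for zone in response["AvailabilityZones"]:
    print(zone["ZoneName"], zone["ZoneType"], zone["OptInStatus"])
```

Instances launched into one of those zones run inside Verizon's network rather than in a distant AWS region, which is where the latency saving described below comes from.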

In August 2020, the companies announced the general availability of 5G mobile edge computing via Wavelength Zones in 10 cities across the US.

Verizon 5G Edge with AWS Wavelength is currently available in 19 locations: Atlanta, Boston, Charlotte, Chicago, Dallas, Denver, Detroit, Houston, Las Vegas, Los Angeles, Miami, Minneapolis, Nashville, New York City, Phoenix, the San Francisco Bay Area, Seattle, Tampa and Washington DC.

The relationship evolved to see the companies create technology fully integrating Verizon's private 5G networks and private 5G Edge platform with AWS Outposts, a fully managed service that is said to offer the same AWS infrastructure, services, application programming interfaces (APIs) and tools to virtually any datacentre, colocation space or on-premise facility for a consistent hybrid experience.

The benefit for users of being in closer proximity to the applications they use is faster response times: shortening the round trip that data needs to travel significantly reduces lag time, or latency, for getting data to a device from the cloud. For developers and businesses, Verizon said that using 5G Edge with AWS Wavelength allows them to build and deploy a variety of latency-sensitive applications for use cases such as immersive virtual reality (VR) gaming, video distribution and connected and autonomous vehicles.

With the ongoing expansion of our mobile edge compute infrastructure, we're enabling developers to build transformational applications that enhance consumers' experiences by moving the data and processing done by applications and services to the edge of Verizon's wireless network and closer to the end-user's device, said Verizon Business CEO Tami Erwin. By offering both public and private mobile edge compute, we are giving businesses ultimate optionality. This can transform the way companies can leverage predictive analytics, allowing them to improve operational efficiency, mitigate risk and increase revenue.

George Elissaios, director and general manager of AWS EC2 core product management at AWS, added: With the rapid expansion of AWS Wavelength Zones across the US, even more developers can innovate faster and deploy powerful cloud-based applications to the edge offering ultra-low latency, high bandwidth, and high performance for these applications. We are excited to collaborate with Verizon to bring AWS services to the edge of the Verizon 5G network across the US to help our customers transform consumer experiences.
