
Cardano can be scarce like the king cryptocurrency – The Coin Republic

According to ADA whale, a Cardano community-focused Twitter account, Cardano remains one of the few coins that comes close to Bitcoin in distribution and inflation. He also feels that ADA may become an extremely scarce asset at some point in the future, because it might follow a BTC-like path. The ADA whale thinks that Bitcoin got its dynamics right due to its fair distribution, its price staying low long enough to allow many to buy in, and its fixed supply.

Satoshi Nakamoto, the enigmatic Bitcoin creator, believed that scarcity can produce value. Hence, the maximum supply of Bitcoin was capped at 21 million coins. The 19 millionth Bitcoin was mined in April, leaving only 2 million BTC to be mined over approximately the next hundred years.

Cardano, like Bitcoin but unlike Ethereum, has a finite supply limit, with only 45 billion ADA ever to be created over the coin's existence. Presently, 33.82 billion ADA are in circulation, accounting for 75% of the max supply, and 34.27 billion ADA have been created so far, per CoinMarketCap data. However, due to continued unfavorable market conditions, the ADA whale believes that now might not be the time to load up on ADA: "This is a bear market, so be ready for possibly months of double-digit negative returns if you do. Just trying to put it into a long-term perspective."
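As a sanity check on the supply figures above, the circulating and issued amounts can be expressed as shares of the 45 billion ADA cap (a quick sketch using the numbers as reported):

```python
MAX_SUPPLY = 45_000_000_000   # ADA hard cap
CIRCULATING = 33_820_000_000  # in circulation, per CoinMarketCap (as cited)
ISSUED = 34_270_000_000       # total created so far

pct_circulating = CIRCULATING / MAX_SUPPLY * 100
pct_issued = ISSUED / MAX_SUPPLY * 100
print(f"circulating: {pct_circulating:.1f}% of max supply")  # ~75.2%
print(f"issued:      {pct_issued:.1f}% of max supply")       # ~76.2%
```

The circulating share rounds to the roughly 75% figure cited in the article.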

According to the latest weekly report by Cardano parent company IOHK, the closed Vasil testnet has already been launched to assess its functionality with a cluster of dApps and users. The Cardano team continues to work on consensus-specific enhancements in anticipation of the Vasil Hard Fork Combinator (HFC) event in June.

In addition, IOHK provided a chart with network growth information. Currently, 986 projects are building on Cardano, up from 943 previously. A total of 88 projects have recently launched on Cardano, while the number of NFT projects has increased to 5,727. For the week, Github commits totaled 3,028, while the number of ADA wallets stood at 4.9 million. Also, the number of Plutus scripts was 2,745.


One Smart Choice and 2 Regrets From a Young Cryptocurrency Investor – Business Insider

Cryptocurrency investor Ariel Fox, 29, doesn't have many regrets about getting into buying and selling coins. Unperturbed by the recent crash, Fox's portfolio is still worth more than what she initially invested, and she finds buying and selling cryptocurrency preferable to being a retail investor in the stock market.

"There are a lot of people who have access to information about stocks in the market that the average retail investor may not have," said Fox. "But if you get in at certain crypto projects at the right time, you are able to profit from that. It is somewhat speculative, just like stocks but it's also very interesting to me. It's exciting, it's growing."

Fox first heard about Bitcoin all the way back in 2009 when it was first released, but she didn't start investing in crypto until late 2020. She said that she's learned a lot along the way.

Fox said that the smartest thing that she did with her coins was invest them across different websites in order to earn more interest on them.

She originally kept her Bitcoin holdings primarily on Coinbase, but then "scattered it across some different marketplaces in order to invest in other coins. Eventually I collected it all together and put it into BlockFi where I earn interest on those holdings."

Fox added that her returns on interest are why she keeps a big chunk of her portfolio in the Gemini stablecoin (GUSD), which is backed by and pegged to the US dollar. Currently, she steadily earns anywhere between 7% and 9% interest on her GUSD holdings.
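In dollar terms, a 7% to 9% APY works out as follows (the balance here is hypothetical, purely for illustration; actual rates on such platforms were tiered and variable):

```python
def yearly_interest(balance: float, apy: float) -> float:
    """Simple interest earned over one year at the given APY."""
    return balance * apy

balance = 10_000  # hypothetical GUSD holding
low = yearly_interest(balance, 0.07)
high = yearly_interest(balance, 0.09)
print(f"${low:,.0f} to ${high:,.0f} per year on ${balance:,} GUSD")
```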

One choice that Fox regrets making with her cryptocurrency portfolio is selling certain coins early because they spiked in price, not realizing how much higher they would later go. If she could do things differently, she would have "held some back for myself, just to watch and see how it fluctuated."

That said, Fox added that it's really hard to time the market in general, and that she doesn't recommend it. Sometimes the chips fell in her favor for selling early. "There are other times where I sold, and the price dropped not too long after, and that was a smart sell for me," she said.

Another regret of Fox's is that she didn't learn about "gas fees" sooner. In cryptocurrency trading, a gas fee is a cost an investor incurs when they are trading or converting coins on the Ethereum blockchain. It requires a lot of computational energy to convert coins on this blockchain, and the gas fee at any given time can vary depending on many factors.
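The fee itself is roughly the gas a transaction consumes multiplied by the prevailing price per gas unit, which is why timing matters. A sketch with made-up but plausible figures (actual gas usage and prices vary per transaction and per moment):

```python
def tx_cost_eth(gas_used: int, gas_price_gwei: float) -> float:
    """Fee in ETH: gas units consumed times price per unit (1 ETH = 1e9 gwei)."""
    return gas_used * gas_price_gwei / 1e9

GAS_FOR_SWAP = 100_000  # illustrative gas usage for a token conversion
print(tx_cost_eth(GAS_FOR_SWAP, 150))  # busy network at 150 gwei -> 0.015 ETH
print(tx_cost_eth(GAS_FOR_SWAP, 20))   # quiet network at 20 gwei  -> 0.002 ETH
```

At identical gas usage, converting during a quiet period is several times cheaper, which is exactly the timing Fox wishes she had known about.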

"When I was investing in a project, I had to convert Ethereum to this other coin and I had to pay a fee to convert it," said Fox. "You can time it so that your gas fee is lower. I didn't know."

Fox added that she "spent so much money on gas fees that I did not have to spend because I did way too many transactions, and I did it in a very busy time at the market."

A.J. Jordan

Personal Finance Reporter


Learn More About The Ecosystems Of Cryptocurrency Shiba Inu (SHIB), Binance Coin (BNB) and Mushe (XMU) – TechCabal

Cryptocurrency is a unique sector of the finance industry that has greatly shaped the way we look at money. Almost everybody has digital assets these days; do you?

Cryptocurrency is a form of digital currency that allows users to transfer currency in a digital setting and is different from regular digital transfer products and applications.

The primary purpose of cryptocurrency is to provide a digital currency system that is not owned by only one central body. Thus, cryptocurrency is built and designed to be decentralised.

This way, the network participants run software and technology that connects them to other nodes, so that information can be shared conveniently. Crypto uses cryptographic techniques to make sure transactions between users remain secure.

Due to the security and lack of a central body, otherwise termed decentralisation, cryptocurrency has become a widely known investment choice for many newbies and veteran investors worldwide.

In the next few years, digital finance will be one of the biggest financial sectors in the world, as many companies are now integrating cryptocurrency into their payment platforms.

Different cryptocurrency projects emerge in different niches to solve different problems across the crypto space.

However, there are some cryptocurrency tokens with huge ecosystems that work across different parts of the crypto space all at once. Some of them include Shiba Inu (SHIB), Binance Coin (BNB), and Mushe (XMU).

Although some critics have called it a useless investment, Shiba Inu was one of the most successful investments in 2021.

Shiba Inu is a cryptocurrency meme token founded based on a Doge meme of a Japanese dog breed, Shiba Inu.

SHIB follows in the footsteps of the top meme coin, Dogecoin. It has even been called the Dogecoin killer.

The network functions as an ecosystem of a community-driven crypto project. This means that the decentralised Shiba Inu community is in charge of what happens to the cryptocurrency and also its development.

This community is known as the ShibArmy. The Shiba Inu ecosystem is built on Ethereum to enable it to conveniently run smart contracts.

The Shiba Inu ecosystem is divided into different tokens and sectors that run the entire ecosystem. The first is SHIB, which is the foundational and main token of the ecosystem. SHIB is used as a medium of exchange and can be traded. The others are LEASH and BONE, used for staking rewards and voting rights, respectively.

Binance Coin (BNB) is another cryptocurrency token that runs on a huge ecosystem that offers different services.

Binance Coin was created by Binance in 2017 as an ERC-20 token on the Ethereum platform. The Binance Exchange ecosystem is a wide one.

The major function of this ecosystem centers on the Binance crypto exchange platform, which is very popular in the cryptocurrency world as a platform for traders to store and trade crypto.

Binance Coin (BNB) is the utility token of the platform. It is used by users to get discounts while paying for trading fees and also to pay for travelling expenses, virtual gifts, shopping, etc.

The Mushe ecosystem is a decentralised peer-to-peer contract system created to aid interaction and governance and give its users rewards.

The Mushe (XMU) ecosystem was created to break the barriers between ecosystems and improve the interoperability between different blockchains.

XMU plans to improve the adoption of blockchain technology by providing digital access to these blockchains for everyone, whether newbies or veteran investors, while growing the social effect through teaching how digital currencies and economic management intersect.

The utility token for this ecosystem is XMU. It is an ERC-20 token, as it is built on the Ethereum network. The XMU token rewards its users for staking and is a major exchange medium in the ecosystem.

Cryptocurrency is constantly growing, and with the increase of these ecosystems that try to fix the loopholes in the cryptocurrency sector, it will soon become top-notch. You can join the Mushe presale or learn more about it through the links below.

Join Mushe's Presale:

https://www.mushe.world/

https://portal.mushe.world/sign-up

https://t.me/MusheWorldXMU



Critical Emerging Technology: Claiming and Disclosing Blockchain, Fintech and Cryptocurrency – IPWatchdog.com

A blockchain is a digital ledger comprised of so-called blocks. Every piece of new information uploaded to the digital ledger is a block holding a set of data. Once these blocks are linked (that is, every time new information is uploaded via a block), they become part of the digital ledger permanently; the blocks cannot be edited, deleted or modified, even by the company or person who initially created the blockchain. Because the history and genesis of blockchain data cannot be altered or deleted, blockchains are a valuable tool for identifying the provenance of an item and tracking the path from its original source to its ultimate destination.

A first example of blockchain usage is that blockchains can be implemented for understanding the genesis and lifespan of electric vehicle (EV) batteries. EV batteries are comprised of battery modules and battery cells, which can be either recycled or refurbished through multiple uses. Imagine the scenario in which an EV owner would like to know whether a battery that has been used over a certain lifetime can now be refurbished or partially refurbished in order to save cost. To do so, the owner wants to check if it is possible to replace some of the modules in the battery rather than the entire thing, which is far more expensive. Alternatively, the dealership may want to check whether an EV battery can be recycled for environmental purposes. But in order to make these kinds of decisions, a thorough diagnosis of the battery involving a comprehensive history of the battery and its physical components is desired.

By utilizing blockchain, each time the battery is charged, new data can be inputted to a blockchain for the battery. Other uploading events can include whether the battery has died or when the vehicle turns OFF and ON, indicating initiation or termination of use, etc. All of these events stimulate an automatic uploading of battery state information to the blockchain. Over time, the compendium of this information can be used to deduce the charge capacity of the battery components and how that has changed over its lifespan. This way, a decision can be made as to whether the battery needs full refurbishment or partial refurbishment, or whether the battery should be recycled or just tossed out. This will save money and lead to environmental benefits when implemented on a large scale.
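The append-only ledger described above can be sketched as a simple hash chain, where each block stores the hash of its predecessor (the event fields are invented for the example; a real blockchain adds consensus, signatures, and distribution across nodes):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Canonical serialization so the same block always hashes identically
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, event: dict) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "event": event})

def verify(chain: list) -> bool:
    """Recompute every link; editing any earlier block breaks all later links."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
append_block(chain, {"type": "charge", "state_of_charge": 0.95})
append_block(chain, {"type": "ignition", "state": "OFF"})
assert verify(chain)

chain[0]["event"]["state_of_charge"] = 1.0  # try to falsify the battery history
assert not verify(chain)                     # tampering is immediately detectable
```

This is what makes the diagnosis trustworthy: a dealership checking the chain can detect any retroactive edit to the battery's charge history.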

As a patent practitioner, how would you claim the system just described? The first thing to think about when confronted with this question is to imagine the components involved for someone (like a potential infringer) to practice this type of invention. Specifically, what are the physical components needed? With the EV battery example, you can claim this as a system claim having at least one EV battery. Because the purpose of the entire system is to understand the chargeability capabilities of the battery over time, an electronic control unit (ECU) with a processor and computer memory that stores the state of charge information is needed. The ECU will likely be part of a vehicle that has the battery, a mobile device with an app for the EV battery ledger, or it can be a central processing unit (CPU) on a cloud server. The system will also need a wireless communication unit to upload the information to the blockchain. An alternative to a system claim is an apparatus claim directed to a vehicle equipped with at least one EV battery, an electronic control unit and a wireless unit for uploading information to the blockchain.

Blockchain can also be used in FINTECH (financial technology). Banks can use blockchain to request confirmation of security information with immediate response using blockchain in order to expedite transactions. When thinking about what components to claim with FINTECH, software example components (e.g., parts of a computer or a computer program) can be algorithms and applications for a processor, the processor or microprocessor itself, or virtual reality (VR) trading platforms. All computers have some kind of storage that stores transitory or long-term computer-readable media. You can also think about claiming external storage such as memory that is not part of the computer the user is using (i.e., the blockchain or a cloud). Wireless communicators will be needed to upload the information to the blockchain so you can claim receivers and transceivers or Bluetooth capabilities.

When considering hardware components to claim, always think about what parts will be used by the user, what is gathering information or input. Think about claiming scanners (such as eye scanners or fingerprint scanners), sensors of all types, cameras, etc. Also think about whether the information to be processed will be inputted manually by a user or an operator, such as via a keyboard or a touchscreen. With mobile banking, the user is likely using an app like Venmo on a mobile device and will be inputting information via the touchscreen. Also, there has to be a display screen for the user to receive messages or information whether that be on a mobile device, an ATM or a computer screen. So a display screen is a physical component that can be claimed as well.

Recently, the Southern District of New York granted a motion to dismiss in favor of Block Inc. against AuthWallet's patent assertion on the grounds that AuthWallet's asserted claims are patent-ineligible subject matter under the Alice doctrine. AuthWallet owned a patent for a method and system for processing financial transaction data that involves a process of confirming authorization for transactions by the user. AuthWallet's system included a processor, a storage component, a communications module and a stored value module. The court found that the claims only recited generic computer functions to be carried out by conventional computer components. What is the court saying here? The court is essentially saying that there is no technical improvement recited in these claims. Since there is no special purpose computer but only generic computer processes in AuthWallet's claims, the claim needs to recite some kind of technical improvement to the way the computer functions. End user benefits are not considered technical improvements either at the Federal Circuit or at the Patent Trial and Appeal Board. Here, arguably a human brain can be a storage component, a communications module or a stored value module, as claimed in AuthWallet's patent. It can be difficult to see the technical improvement in these kinds of claim recitations.

To learn more, watch the latest IP Practice Vlog here.


Cloud Computing Redefined the IT Industry: The Israeli Cloud Related Landscape – Geektime

Originally conceived in the 1960s and made widespread in the mid-2000s, cloud computing continues to evolve, serving as a key foundation for the rapidly advancing world of software and technology. If software is eating the world, the cloud is enabling software with an all-you-can-eat buffet.

Before the advent of cloud architecture, IT infrastructure essential to application development was largely managed on-premises. If more servers were needed, IT practitioners would be responsible for procurement and integration. When systems and software needed repairs or updates, IT practitioners would meticulously handle the process while avoiding any service interruption or data loss. The economic waste from idle or unused resources was not only accepted but expected. As a result of the inefficiencies of managing IT infrastructure, application development processes were slower, more expensive, and more prone to outages, ultimately hindering the scale and speed of development teams.

Today, with the on-demand delivery of shared computing resources via the internet, or the cloud, organizations can store and access their data on remote servers that are fully managed by third-party cloud service providers. Compute, networking, storage, servers, and other IT resources can easily be provisioned as needed and terminated when no longer in use. Organizations can enjoy the many benefits of leveraging cloud computing such as scalability, elasticity, and cost-effectiveness. Cloud computing has completely redefined the IT industry and has not only become the norm but a necessity for modern development teams.
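A toy cost comparison illustrates the elasticity point (all figures are hypothetical): capacity that is billed by the hour and shut down when idle can undercut an always-on server that sits unused most of the day.

```python
def on_demand_cost(rate_per_hour: float, hours_used: float) -> float:
    """Cloud model: pay only for hours actually consumed."""
    return rate_per_hour * hours_used

def fixed_cost(monthly_rate: float) -> float:
    """On-premises model: the same bill whether the server is idle or busy."""
    return monthly_rate

# A batch workload that needs a server 8 hours a day, 30 days a month.
cloud = on_demand_cost(0.50, 8 * 30)  # $0.50/hour for 240 hours
onprem = fixed_cost(300)              # flat $300/month equivalent
print(cloud, onprem)  # 120.0 300
```

The gap is exactly the "economic waste from idle resources" the on-premises era accepted as normal.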

As cloud and cloud-native technologies are shifting from nice-to-have to must-have, the global cloud market continues to grow without any signs of slowing down. Gartner predicts that global cloud revenue will reach $544 billion by 2022 and top $900 billion by 2025.
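Taken at face value, those two forecasts imply a compound annual growth rate of roughly 18%, a quick back-of-the-envelope computation on the cited figures:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values `years` apart."""
    return (end / start) ** (1 / years) - 1

growth = cagr(544, 900, 3)  # $544B in 2022 to $900B in 2025
print(f"implied growth: {growth:.1%} per year")  # roughly 18% annually
```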

Aside from the many obvious benefits of cloud computing, the sheer growth in adoption can also be attributed to the ongoing Cloud Wars, or the heated contest over the massive cloud market share. With the market largely controlled by AWS, Azure, and GCP (in that order), cloud service providers are pouring tens of billions of dollars annually into R&D as they race to stay competitive by constantly improving and developing new features. This, coupled with the evidently perpetual increase in end-user demands, has created a unique environment where the possibilities are seemingly endless. Despite the overwhelming benefits, these continuous innovations also create additional complexity, which has spawned a wave of companies working to help end-users harness the powerful yet enigmatic world of cloud computing.

Behind every application is a stack of technologies supporting the development, production, and ongoing maintenance of said application. Modern cloud technologies such as containers, Kubernetes, serverless, and others have enriched application development while adding new complexities at the same time.

As the essential elements of the underlying infrastructure for cloud deployments (i.e., compute, networking, servers, storage, etc.) are becoming increasingly commoditized due to the domination of the public cloud, organizations are better positioned to focus on new technologies that help manage today's complexities. The modern cloud stack depicted here attempts to both identify and lay out these new technologies in an easy-to-understand structure. From the bottom up, the core development stack includes layers such as infrastructure management, databases and data management, and developer tools, which are key to building and running applications. On the periphery, tools that run across all layers and augment the operation of the core stack include CloudOps, AIOps, Monitoring & Observability, Data Analytics, and Security.

Israel's innovative reputation is well known across many of today's hottest sectors, and cloud technology is no exception. With billions of dollars already invested in this space, Israeli cloud technology shows no signs of cooling off as investor appetite remains strong and promising startups are continuously emerging from stealth. Here is additional context on each area of the modern cloud stack along with a cloud landscape highlighting the Israeli startups operating in each area.

(Note that Data/Analytics (AI/ML/BigData/Data Analytics) and Cloud Security are also integral elements of cloud-native development but were left out of this landscape since they are significant areas that deserve their own individual landscapes.)

Infrastructure Management: Infrastructure management is proving to be an increasingly difficult task. Key elements of the core infrastructure needs are being distributed across multiple service providers and data centers, while technologies such as microservices and serverless are only adding to the complexity. Tools that can help automate the management and provisioning of infrastructure from a single pane of glass are now more important than ever.

Databases and Data Management: Data for applications is like gasoline for cars. It's needed to get things up and running, and there needs to be a constant flow to the right places at the right times. As our daily lives continue to become increasingly infused with the digital world, the amount of generated data is growing exponentially. Databases that can make sense and order of the overwhelming surge of data, and even provide a layer of analytics, are now mission critical. Concurrently, tools that ensure the correct flow of data between data sources, databases, and applications are no less crucial.

DevOps and Dev Tools: At its core, DevOps is a set of practices which aims to accelerate software delivery timelines by optimizing the method by which organizations develop and release code. Companies in this subsegment are providing solutions that directly help organizations successfully embrace DevOps practices. Dev Tools assist developers with their daily activities, such as automation of common tasks, no/low code tooling, and project management software.

Monitoring and Observability: As modern infrastructure becomes more complex and distributed, gaining visibility and control over your environments and systems is an increasingly daunting task. Monitoring tools track the overall health and performance of a system and can alert you when issues arise. On the other hand, observability tools aim to provide deeper insight into the why, such as why there was a service disruption or where the performance bottlenecks are. Observability relies on three main elements: logs, metrics, and traces. Using this data, observability tools enable IT teams to view their complex systems through a single pane of glass and be better positioned to optimize performance and remediate issues.
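The three pillars are typically stitched together by a shared trace ID, so a tool can jump from a latency spike (metric) to the offending request (trace) to its error messages (logs). A toy illustration (the record fields are invented for the example):

```python
import uuid

# One trace ID threads a single request through all three data types.
trace_id = uuid.uuid4().hex

log    = {"trace_id": trace_id, "level": "ERROR", "msg": "checkout failed"}
metric = {"trace_id": trace_id, "name": "request_latency_ms", "value": 842}
span   = {"trace_id": trace_id, "service": "payment", "duration_ms": 790}

# Correlating by trace_id is what turns three separate signals into one
# answer to "why": the slow span and the error log belong to the same
# request as the latency spike.
records = [log, metric, span]
related = [r for r in records if r["trace_id"] == trace_id]
print(len(related))  # 3
```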

AIOps: Modern cloud technologies are great for increasing the rate of innovation but also create a flood of new data, making it extremely complex for IT and operations teams to manage incident response. This can cause performance issues and outages leading to poor customer experiences, lost revenue, customer churn, and other negative business consequences. AIOps aggregates data from all data sources and environments, normalizes the data, and uses AI and machine learning to streamline IT operations by automating event correlation, root cause analysis, and incident response.
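The event-correlation step can be sketched as grouping alerts that fire close together in time, a crude stand-in for the ML-driven correlation real AIOps platforms perform:

```python
def correlate(events: list, window: int = 60) -> list:
    """Group events whose timestamps fall within `window` seconds of the
    previous event; each group is a candidate single incident."""
    events = sorted(events, key=lambda e: e["ts"])
    incidents, current = [], []
    for e in events:
        if current and e["ts"] - current[-1]["ts"] > window:
            incidents.append(current)  # gap too large: close the incident
            current = []
        current.append(e)
    if current:
        incidents.append(current)
    return incidents

alerts = [
    {"ts": 0, "src": "db"}, {"ts": 20, "src": "api"}, {"ts": 45, "src": "web"},
    {"ts": 500, "src": "cache"},
]
print(len(correlate(alerts)))  # 2: one burst around t=0, one isolated alert
```

Collapsing the burst of three alerts into one incident is the noise reduction that lets operators chase one root cause instead of three pages.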

CloudOps and FinOps: With increasingly complex infrastructure needs, immense volumes of generated data, and multitudes of tools being used throughout organizations, a segment termed CloudOps has emerged, which aims to provide a holistic view of the entire cloud stack and better manage and optimize performance. FinOps is solely focused on managing cloud costs, enabling organizations to get the most business value out of their cloud operations. As more and more organizations across virtually all industries are shifting to the cloud, we can assume that the world of cloud technology is only in its infancy and the pace of innovation will increase dramatically.

Organizations continue to embrace new technologies that enable them to fully leverage the cloud in a simple yet sophisticated manner. As Israeli technology has already contributed considerably to this space, and with some of the brightest minds in the industry, we expect Israeli entrepreneurs to feature significantly among the next order derivatives of this landmark innovation, as the modern cloud stack becomes home to ever larger and more critical companies.

Written by Raz Mangel (principal) and Meir Cohen (investor) at Greenfield Partners


Cloud Computing Expert, Broadus Palmer: The New Approach To Cloud Career Coaching That Changes Everything – LA Progressive

Cloud computing and cloud engineering are at the core of technological advancements that are fast being implemented by companies worldwide. Over recent years, founder of Level Up In Tech, Broadus Palmer has witnessed more and more companies in need of the support that he can provide in the field of cloud career coaching.

Cloud computing includes the use of servers, storage, databases, networking, software, analytics, and intelligence, all under the umbrella of computing services. These services are delivered over the internet, which, intangible as it is, is referred to as "the cloud." Cloud computing offers companies and brands more efficient innovation and flexible resources.

When Broadus Palmer decided to leave his job as a banker behind after 14 years in the industry, the field of cloud computing and cloud engineering was even less established than it is today. Now it is strongly regarded as a lucrative career path with a trajectory of only upward and expanding possibilities.

After undertaking his own due diligence, Broadus Palmer learned everything that he needed to know to successfully become qualified and land a 6-figure job. His journey inspired the birth of his company Level Up In Tech which offers a comprehensive approach to cloud career coaching.


"Employers want to see experience, and that's what we help to create. We provide hands-on projects that can be used to juice up your resume and set you apart from other applicants. We provide real-world projects to help you expand your skills," shares Broadus.

Throughout his own career journey, Broadus was able to navigate the twists and turns. Outside of the certifications that he needed to be deemed qualified in cloud engineering, he was quick to realize that what made him stand out even further was his mindset and unique approach to problem solving. This ultimately led him to identify the potential of what he could offer to companies, and how Level Up In Tech was born.

What Broadus brings to the table through Level Up In Tech is a combination of the aforementioned, but also the strong desire to help others, to passionately want to see others thrive in the field, and to see them achieve their goals.

The care and consideration for the human element in the learning and hiring process put Broadus and Level Up In Tech in a strong-standing position. This unique standpoint helps towards the development of the minds of future tech workers who will be at the forefront of what the cloud computing industry has to offer and what it is set to become.

Although Broadus is changing lives indirectly through motivational content across multiple social media platforms, he firmly believes in the power of one-on-one Cloud Career Coaching. Now highly regarded as a pillar of the cloud community, Broadus tirelessly works to restore people's self-belief with his can-do attitude and to encourage more individuals to follow their hearts into the ever-evolving field of cloud computing.


3 Reasons Why We’ve Stopped Talking About Private Cloud Computing – ITPro Today

If you were to list the buzzwords dominating the cloud computing industry today, terms like "multicloud," "hybrid cloud," and "alternative cloud" would probably top it.

But here's one term that may not even make the list: "private cloud."

Related: A Guide to Cloud Architectures: Single Cloud, Multicloud, Poly Cloud and Beyond

Although private cloud architectures were once all the rage, you hear very little about them today. Likewise, major private cloud computing platforms, like OpenStack, don't tend to make many headlines these days.

That's interesting because it's not as if private clouds have gone away. On the contrary, they are alive and well. Platforms such as OpenStack also remain under quite active development, with new features being unveiled regularly.

Related: Lines Between Public, Private, Hybrid Cloud Architectures Are Blurring

So, if the lack of buzz surrounding private cloud isn't due to the death of the private cloud ecosystem, what does explain it? Why have we suddenly stopped talking very much about private cloud computing?

A few factors are likely at play. Let's take a look at each one to gain perspective on the state of private cloud computing as of 2022.

One reason why you hear less and less about private cloud computing today is that the big public clouds haven't found a way to sell private cloud services and so they haven't promoted private cloud in the way they promote other third-party solutions.

It wouldn't make much sense, for example, for public cloud providers to offer something like OpenStack as a service. And, although you can install OpenStack in a public cloud if you want, there are few use cases where you would want to do so. You'd be losing out on many of the privacy and cost benefits of running a private cloud on your own infrastructure, and you'd be duplicating a lot of the features that you could get by using the public cloud directly.

Compare OpenStack in this sense to, say, Kubernetes, an open source platform that the public clouds have monetized quite successfully using a SaaS service model, and it's easy to see why there is so much hype surrounding Kubernetes in the modern cloud computing industry, but little attention to classic private cloud solutions.

A second major reason why there is not a lot of buzz around private cloud computing at present, I suspect, is that private cloud platforms never really went head-first into the cloud-native computing realm.

What I mean by this is that most private cloud platforms were designed, and remain, solutions primarily for running workloads on bare-metal or virtual servers. They don't target containers, serverless functions, or other more "modern" types of workloads.

To be fair, this is probably because private clouds were conceived before containerization really took off. At the time that systems such as OpenStack were being designed, it was hard to envision a world where so many things would run as microservices and containers.

You certainly can run containers on a private cloud if you want via approaches like Kubernetes integration with OpenStack. But like running OpenStack in a public cloud, there are few obvious use cases for this practice. It would make more sense in most situations just to set up a Kubernetes cluster on its own, without OpenStack.

To put all of this another way, private cloud computing has sort of been left behind as the cloud computing industry as a whole has pivoted toward cloud-native architectures (that is, containerized, microservices-based architectures) over the past several years. That's not due to any fault of the private cloud ecosystem. It's just what happened.

Ten or so years ago, the major talking points for running a private cloud instead of using a public cloud were that public clouds were less secure and they gave users less control.

Those points were mostly valid in the earlier 2010s. Since then, however, public clouds have evolved to become much more secure and flexible. They have built more extensive networking services, enhanced their access control frameworks, rolled out data privacy and compliance offerings, and created sophisticated monitoring and auditing solutions, all of which help secure public cloud workloads. They have also introduced many more types of cloud services, and many more configuration options, than they had in the days when public clouds mostly consisted of VMs, databases, and storage as a service.

Relatedly, the public clouds have also gone head-first into the hybrid cloud world, providing even more possibilities for users to build highly secure and flexible cloud environments using public cloud services and infrastructure.

In these ways, public cloud has become a much more obvious choice even for use cases with high security requirements or bespoke configuration needs. As a result, private cloud is no longer at the center of conversations about cloud security and control.

Private cloud computing is by no means dead. But the days are over when it's a major source of discussion, or when choosing between "private cloud versus public cloud" is a key issue for many businesses. That's mostly because private cloud just hasn't kept up with the other trends dominating the cloud computing industry, and it's hard to see that changing.

3 Reasons Why We've Stopped Talking About Private Cloud Computing - ITPro Today


Recession-fearing investors keep slashing the fastest-growing cloud stocks – CNBC

Nima Ghamsari, co-founder and chief executive officer of Blend, speaks during the Sooner Than You Think conference in New York on Oct. 16, 2018.

Alex Flynn | Bloomberg | Getty Images

Tech investors finally got some relief this past week, as the Nasdaq broke a seven-week losing streak, its worst stretch since the dot-com bust of 2001.

With five months in the books, 2022 has been a dark year for tech so far. Nobody knows that more than investors in cloud computing companies, which were among the darlings of the past five years, particularly during the stay-home days of the pandemic.

Paradoxically, growth remains robust and businesses are benefiting as economies re-open, but investors are selling anyway.

Bill.com, Blend Labs and SentinelOne are all still more than doubling their revenue year over year, growing at 179%, 124% and 120%, respectively. Yet all three are worth around half of what they were at the end of 2021. The market has taken a sledgehammer to the entire basket.

Byron Deeter of Bessemer Venture Partners, an investor in cloud start-ups and one of the most vocal cloud-stock commentators, observed earlier this month that the revenue multiples for the firm's BVP Nasdaq Emerging Cloud Index had fallen back to where they were in 2017.
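
The revenue multiples Deeter tracks are straightforward to compute: market value divided by revenue. A minimal Python sketch of that arithmetic, using purely hypothetical figures (not any company mentioned here):

```python
def revenue_multiple(market_cap: float, trailing_revenue: float) -> float:
    """Price-to-sales style multiple: market capitalization / trailing revenue."""
    if trailing_revenue <= 0:
        raise ValueError("trailing_revenue must be positive")
    return market_cap / trailing_revenue

# Hypothetical company: $10B market cap on $500M of trailing revenue.
multiple = revenue_multiple(10_000_000_000, 500_000_000)
print(multiple)  # 20.0 -- the stock trades at 20x sales
```

Multiple compression, the phenomenon the index captures, is the market cutting that ratio even while the revenue denominator keeps growing.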

One of Deeter's colleagues at Bessemer, Kent Bennett, isn't sure why the fastest growers aren't getting a pass on the slashing across the cloud category. But he has an idea.

"You can absolutely imagine in a moment like this it would go from revenue to, 'Holy crap, get me out of this market,' and then settle back into efficiency over time," said Bennett, who sits on the board of restaurant software company Toast, which itself showed 90% growth in the first quarter. The stock is now down 52% year to date.

Toast disclosed declining revenue in 2020 as in-person restaurant visits lightened up, leading to less intense use of the company's point-of-sale hardware and software. Then online ordering took off. Now people are increasingly dining in again, and Toast is seeing stronger demand for its Go mobile point-of-sale devices and QR codes that let people order and pay on their own phones, CEO Chris Comparato said in an interview with CNBC earlier this month.

Now that the company has recovered from its Covid stumble, investors are telling the company to "paint a better path toward profitability," he said.

Management is telling all teams to be very diligent about their unit economics, but Comparato said he's not ready to tell investors when exactly the company will break even.

What Toast did offer up is new information on margins. On Toast's first-quarter earnings call earlier this month, finance chief Elena Gomez said guidance implies that its margin for earnings before interest, tax, depreciation and amortization will be 2 points higher in the second half of 2022 than in the first half, as the company works to bolster margins.

"A few investors pushed, and they want a little bit more detail, certainly," Comparato said. "But many of them are like, 'Okay, this was a different tone, Chris, thank you. Chris and Elena, please keep executing on this vision.'"

Other cloud companies are getting the message, too.

Data-analytics software maker Snowflake, which just ended a two-and-a-half-year streak of triple-digit revenue growth, is "not a growth-at-all-costs company," CEO Frank Slootman declared on a call with analysts on Wednesday.

Zuora, which offers subscription-management software, is "focused on building a successful long-term company, delivering durable and profitable growth for years to come," CEO Tien Tzuo said on his company's quarterly analyst call. The company reported a $23.2 million net loss on $93.2 million in revenue, compared with a $17.7 million loss in the year-ago quarter.

Even across the wider software industry, there is a re-acknowledgment of the old-fashioned view that software should make money. Splunk, whose software helps corporate security teams amass and analyze data, included a slide in its shareholder presentation called "Growing Profitability With Scale." It charted the past few years of Splunk's performance against the "Rule of 40," a concept stipulating that a company's revenue growth rate and profit margin should add up to 40%. Splunk called for 35%, the closest it will have been in three years, in the current fiscal year.
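
The Rule of 40 described above is just an addition check: growth rate plus margin, measured in percentage points. A minimal sketch in Python; the 30%/5% split below is hypothetical, not Splunk's actual breakdown:

```python
def rule_of_40_score(revenue_growth_pct: float, profit_margin_pct: float) -> float:
    """Year-over-year revenue growth plus profit margin, both in percentage points."""
    return revenue_growth_pct + profit_margin_pct

def meets_rule_of_40(revenue_growth_pct: float, profit_margin_pct: float) -> bool:
    """True if the combined score clears the 40% bar."""
    return rule_of_40_score(revenue_growth_pct, profit_margin_pct) >= 40.0

# Hypothetical breakdown summing to Splunk's guided 35%: 30% growth + 5% margin.
print(rule_of_40_score(30.0, 5.0))   # 35.0
print(meets_rule_of_40(30.0, 5.0))   # False -- short of the 40% threshold
```

Note that the rule rewards trade-offs in either direction: a company growing 45% at a -5% margin scores the same 40 as one growing 10% at a 30% margin.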

The emphasis on efficiency isn't completely absent at Bill.com, whose software helps small and medium-sized businesses manage bills and invoices, but it's easier to miss because the company's revenue is growing so much faster than at most businesses. Even before the software selloff began in November, executives were touting the company's healthy unit economics.

Blend Labs, which gives banks software they can draw on for mortgage applications and other processes, has been more active in repositioning itself for the new market reality, but it's also one-seventeenth the size of Bill.com by market capitalization.

Despite enjoying hypergrowth, Blend cut its headcount by 10% in April. Nima Ghamsari, the company's co-founder and head, told analysts the company was conducting a "comprehensive review to align our cash consumption and market realities near-term, while charting a clear course toward stronger product and operating margins that will lead to Blend having long-term profitability."

SentinelOne, which sells cybersecurity software that detects and responds to threats, has been busy working on its cost structure. Co-Founder and CEO Tomer Weingarten turned analysts' attention to its margin improvement during a March conference call, and he said the company aims to make more progress over the next year.

The comments, and the better-than-expected results in general, were well received by analysts. But many still lowered their price targets on SentinelOne stock anyway.

"While we are increasing our growth estimates on S, we reduce our PT to $48/share due entirely to a reduction in software multiples," analysts at BTIG wrote to clients. In other words, the category was getting crushed, and SentinelOne was not exempt.

By that point the WisdomTree Cloud Computing Fund, an exchange-traded fund tracking Bessemer's index, had tumbled 47% from its Nov. 9 high. The decline hasn't stopped as the Federal Reserve has reiterated plans to fight inflation with higher interest rates.

That leaves cloud observers wondering when the downward pressure will ease up.

"It's going to take us a couple of months to get through this," said Jason Lemkin, founder of SaaStr, a company that holds cloud-centric conferences. He likens the decline to a hangover, after Covid got investors drunk on cloud stocks. "We haven't got through our Bloody Marys and Aspirins," he said.

Two of the biggest divas in the Covid cloud set, Shopify and Zoom Video Communications, saw the triple-digit growth go away last year as stores began to reopen and in-person social engagements began to return. If anything, that's when investors should have grasped that the demand boom was largely in the past, Lemkin said.

"We're reverting to the mean," he said.

The reset might not be uniform, though. Cloud companies that adhere to the Rule of 40 are showing considerably healthier revenue multiples than those that don't, said Mary D'Onofrio, another investor at Bessemer. Companies showing free cash flow margins above 10% are also enjoying higher multiples these days, she said, with investors fearing a recession.

"The market has rotated to where cash is king," D'Onofrio said.

-- CNBC's Ari Levy contributed to this report.


Recession-fearing investors keep slashing the fastest-growing cloud stocks - CNBC


What happened to the IBM cloud? – Analytics India Magazine

IBM has signed a Strategic Collaboration Agreement (SCA) with Amazon Web Services to offer its software catalogue as Software-as-a-Service (SaaS) on AWS. As a result, AWS will now have over 100 resources across IBM Consulting, IBM Software, and Red Hat. "By deepening our collaboration with AWS, we're taking another major step in giving organisations the ability to choose the hybrid cloud model that works best for their own needs," said Tom Rosamilia, senior vice president of IBM Software.

Last year, CEO Arvind Krishna said IBM is betting big on hybrid cloud, automation and AI. "IBM is all-in on hybrid cloud and AI, determining years ago that our clients' only feasible path to rapid digital transformation is through a hybrid cloud strategy. Public cloud is an integral piece of that strategy," IBM said in a statement.

In the quarter ended March, IBM reported revenues of USD 14.2 billion, up 7.7 percent. Red Hat's revenues rose 18 percent to around USD 1.41 billion, accounting for about a tenth of IBM's revenue. In October 2020, IBM announced it would spin off its infrastructure services business unit into a new company, Kyndryl. For the quarter ended March 31, 2022, Kyndryl reported revenues of USD 4.4 billion, a year-over-year decline of 7 percent.

In November 2013, IBM's then-CEO Ginni Rometty said IBM's top innovation, Watson, would run on the company's Power chips inside SoftLayer, the cloud-computing firm IBM had just acquired. By then, Amazon and Microsoft had spent years building efficient cloud infrastructure: AWS launched the Amazon Elastic Compute Cloud in August 2006, and Microsoft's Windows Azure became available in early 2010.

In 2013, IBM bought Dallas-based SoftLayer for USD 2 billion. SoftLayer was one of the largest privately held cloud computing firms. With SoftLayer's support, IBM hoped its cloud services would generate USD 7 billion a year by 2015. By September 30, 2015, IBM's cloud services revenues had reached USD 9.4 billion, and SoftLayer's revenues grew in the double digits.

SoftLayer's cloud infrastructure was designed for smaller market segments that preferred cheaper software, rather than for bigger organisations focused on cloud features. For instance, SoftLayer's data centres (it was already operating 13 of them in 2013) were built around off-the-rack servers. Meanwhile, the likes of Amazon and Microsoft were designing their own servers to meet enterprise-grade performance and reliability criteria.

IBM soon realised they needed to build a cloud infrastructure to serve clients' needs. For example, big organisations demanded resiliency features like availability zones and large application deployments from their data centres. AWS could meet such demands, but IBM's SoftLayer could not. The latter could also not provide the virtual private cloud technology AWS had introduced in 2009.

The company's executives hired people from Verizon's cloud services business to rebuild the cloud.

Big Blue later acquired Verizon's cloud services business. John Considine, Verizon's then-CTO, became the General Manager of Cloud Infrastructure Services at IBM, a position he held until 2019. His job was to replace SoftLayer's approach and build a new cloud architecture. Considine worked on a project code-named Genesis, an attempt to build an enterprise-grade cloud system from scratch to accelerate the delivery of products on the web with next-generation infrastructure (NGI). IBM envisioned the end product as a fabric computer incorporating a 3D torus topology and a single large, expensive disk. The company hoped to reduce latency to less than 20 milliseconds.

Apart from the Genesis project, another group, led by a team from IBM Research, designed a separate infrastructure architecture called GC. The GC team hoped to use the original SoftLayer infrastructure design to scale the cloud and turn it into a virtual private cloud.

In 2017, Genesis was scrapped. In parallel with the GC effort, an IBM team started working on a new architecture project, the NG. The IBM teams worked on two different cloud infrastructures for two years, leading to internal conflicts and breeding confusion. Both architectures became available in 2019 and ran for a few years until the GC was scrapped.

By the time IBM's cloud infrastructures were up in 2019, Amazon, Microsoft, and GCP had already cornered the market.

When Arvind Krishna took over IBM Cloud in January 2019, he aimed to end the double-track infrastructure design strategy and focus on a single cloud approach. "My approach is straightforward: I am going to focus on growing the value of the company. This includes better aligning our portfolio around hybrid cloud and AI to meet the evolving needs of the market," Krishna said during the company's first-quarter 2020 earnings presentation.

This quarter, IBM's Cloud Paks, its AI-powered software designed for the hybrid cloud landscape, saw a 100 percent net retention rate. "Today we're a more focused business, and our results reflect the execution of our strategy," said Arvind Krishna. "We are off to a solid start for the year, and we now see revenue growth for 2022 at the high end of our model." At CNBC's Transform conference, Krishna said he wanted to take advantage of IBM's Red Hat acquisition and help customers manage a growing hybrid cloud world. IBM's 2020 earnings showed cloud and cognitive software revenues were down 4.5 percent to USD 6.8 billion. In the latest quarterly report, IBM's cloud revenue stood at USD 5 billion.

What happened to the IBM cloud? - Analytics India Magazine


South Africa's explosion of cloud computing providers – MyBroadband

South Africa's cloud computing space has seen an influx of major players launching services in the country over the past three years.

These include Amazon, Microsoft, and Huawei, which have all launched cloud regions in South Africa.

Cloud providers have spent billions of rands establishing their infrastructure locally.

The launch of Amazon and Microsoft cloud nodes in South Africa meant that companies and developers could host their applications and data within the country.

In addition to lower latencies, this made it easier for government-linked entities and companies with strict data protection requirements to use such services.

Amazon Web Services (AWS) opened its Africa region, based in Cape Town, in April 2020.

The region is named Africa (Cape Town) with the label af-south-1.

AWS told MyBroadband that the arrival of its infrastructure in South Africa would assist a range of organisations. It would also help developers start businesses and build new products and services.

Microsoft launched its South African Azure region in March 2019.

The company recently launched Azure Availability Zones in its South Africa North region.

Microsoft said this would bring higher availability and asynchronous replication across Azure regions for recovery protection.

Oracle's Johannesburg region went live in January 2022. It is the cloud computing provider's first region on the continent.

The company initially announced its intentions to launch 20 new cloud data centres, including one in South Africa, by the end of 2020, but the Covid-19 pandemic delayed its plans.

Oracle said the new facilities would make it easier for businesses in the region to improve performance and protect data.

Huawei began offering its commercial cloud services in South Africa in March 2019 when it launched its Johannesburg data centre. It aims to help African governments, carriers, and organisations across various industries.

The company said South Africa was one of the most diverse and promising emerging markets globally.

It plans to launch another data centre in Cape Town and has hinted at launching an availability zone in Durban.

Acronis unveiled its Cyber Cloud Data Centre in Johannesburg, one of 111 such facilities being deployed by the company, in late January 2022.

The data centre in Johannesburg will provide local organisations with a location within South Africa to store critical business and client data.

"A local presence is a necessity for modern cloud businesses," the company said when it launched.

Along with various global cloud computing players launching availability zones in South Africa, data centre infrastructure providers have been investing billions into their local facilities.

Vantage Data Centres announced in October 2021 that it had begun construction on its Waterfall data centre. The first phase is expected to come online in the last quarter of 2022.

Africa Data Centres is investing R7.2 billion in data centres throughout Africa, including one in South Africa.

Teraco Data Environments announced in October 2021 that it had completed the first phase of its new data centre in Brackenfell, Cape Town.

Open Access Data Centres (OADC) revealed that it had deployed its first open-access edge data centre in South Africa in early May 2022.

OADC said the facility would support 5G rollouts, network extensions, and the internet service provider and fibre network operator infrastructure in new areas.

The company also promised that the facility would contribute to improved latency and improve end-user experiences.

South Africa's explosion of cloud computing providers - MyBroadband
