Hive: Distributed Cloud Computing Company Raises €12 Million – Pulse 2.0
SC Ventures (Standard Chartered's ventures arm) is leading a €12 million (USD $13 million) Series A funding round for distributed cloud provider Hive, which aims to increase businesses' and individuals' access to sustainable and high-powered computing resources. OneRagtime, a French venture capital fund that led Hive's Seed round, and a collection of private investors joined the round.
Hive is reinventing the cloud, moving from a centralized model built on expensive physical servers to a distributed cloud infrastructure that aggregates the unused hard drives and computing capacity of individual devices.
Hive's model helps businesses efficiently manage their cloud-related expenses, reduce dependency on a handful of cloud providers, and significantly reduce cloud energy use.
Since October 2023, Hive has had over 25,000 active users and contributors from 147 countries. These users store their files on hiveDisk and contribute unused hard drives to hiveNet to lower their subscription costs and effectively build the distributed cloud.
The computing capacity contributed to hiveNet also powers hiveCompute, allowing companies to run workloads such as GenAI inference, video processing, and 3D modeling. HiveNet's architecture provides access to additional CPU, GPU, or NPU capacity when needed, delivering much-needed computing power on demand. Companies looking for more control can build a private hiveNet, where IT managers retain full control over the devices.
In December, Hive unveiled a Joint Development Partner (JDP) initiative, working closely with key partners to reshape the cloud landscape for businesses leveraging GenAI LLM computations.
Hive is a champion of sustainable technological progress, offering a practical solution to the challenges posed by traditional cloud computing models. With its latest funding round, Hive is set on growing its team and global footprint to address the enterprise market, starting with startups and SMBs. The team prioritizes several business areas, including product development, building an engaged community of contributing Hivers, and sales and marketing efforts to reach users at scale.
KEY QUOTES:
"Hive is addressing the pressing need for a new cloud paradigm that democratizes access, lowers financial barriers, and encourages innovation. With over 70% of the computing power available in our devices and billions of devices connected to the Internet, Hive's community-driven model builds The Right Cloud to offer a greener, more resilient, and secure alternative that also promotes a more equitable cloud solution. We thank our investors, as well as INRIA and Bpifrance, for their continuous support as we look to achieve our ambitious goals."
David Gurlé, Hive Founder
"We are big believers in Hive's distributed cloud technology, which will enable cheaper and more efficient access to computing power and storage, a critical point when most of our ventures may have an AI component requiring increasing amounts of computing power. In addition to our investment, our ventures will be leveraging Hive's services."
Alex Manson, who heads SC Ventures
"Cloud technology has opened up horizons of innovation, but it also comes with challenges in terms of costs, security, data privacy, and environmental impact, heightened by the increasing demand for computing resources, especially for artificial intelligence. Hive, with its pioneering approach to distributed cloud, makes cloud access more secure, affordable, and efficient for everyone, and enables the sharing of computational power resources. As an early investor and believer, OneRagtime is particularly excited to support Hive's vision and team."
Stéphanie Hospital, Founder & CEO at OneRagtime
Cloud computing trends – Enterprise License Optimization Blog
The thirteenth annual Flexera 2024 State of the Cloud Report (previously known as the RightScale State of the Cloud Report) highlights the latest cloud computing trends and statistics, including strategies, challenges, and initiatives from a broad cross-section of industries and organizations. The report explores the thinking of 753 IT professionals and executive leaders from a survey conducted in late Q4 2023 and highlights the year-over-year (YoY) changes to help identify trends. The respondents, global cloud decision-makers and users, revealed their experiences with cloud migration, cloud computing, and their thoughts about the public, private, and multi-cloud market.
Select highlights of the report on cloud computing are included below.
This marks the second year in a row that managing cloud spending is the top challenge facing organizations; as in previous years, a lack of resources/expertise follows close behind. More than a quarter of respondents (29%) spend over $12 million a year on cloud, and nearly a quarter (22%) spend that much on SaaS.
Respondents saw a slight increase in multi-cloud usage, up from 87% last year to 89% this year.
Sixty-one percent of large enterprises use multi-cloud security, and 57% use multi-cloud FinOps (cost optimization) tools.
The top two multi-cloud implementations are apps siloed on different clouds and DR/failover between clouds. Apps siloed on different clouds increased the most (up to 57% from 44% YoY). Data integration between clouds increased to 45% from 37% YoY as organizations looked for the best fit for applications and data analysis.
Adoption grew for Amazon Web Services (AWS), Microsoft Azure and Google Cloud. Forty-nine percent of respondents reported using AWS for significant workloads, while 45% reported using Azure and 21% reported using Google Cloud Platform. In contrast, Oracle Cloud Infrastructure, IBM and Alibaba Cloud usage is substantially lower and relatively unchanged compared to the previous year.
SMBs remain the heaviest cloud adopters but fell off slightly from the previous year, with 61% of workloads (down from 67% last year) and 60% of data (unchanged year over year) in the public cloud.
Nearly all platform-as-a-service (PaaS) offerings saw a gain in usage, with the most prominent being in the data warehouse (up to 65% from 56% YoY). Container-as-a-service (52%) and serverless (function-as-a-service) (48%) are both up nine percentage points this year. Machine learning/artificial intelligence (ML/AI) had a modest gain at 41%, up from 36% last year. However, ML/AI is the PaaS offering getting the most attention from companies experimenting (32%) or planning to use it (17%).
Forty-eight percent of respondents say they already have defined sustainability initiatives that include tracking the carbon footprint of cloud usage. When asked how sustainability compares to cost optimization, 59% prioritized cost optimization, though an additional 29% say that both cloud cost optimization and sustainability are equally prioritized.
The world has experienced extraordinary disruption in the past few years, and while organizations of all sizes are prioritizing every dollar of spend, the cloud and technology will weather economic storms. Enterprises that remain focused on digital transformation, seizing new opportunities and evolving strategic initiatives through a cost-conscious lens will be better positioned for success than their competitors.
Get the latest insights in cloud computing trends and cloud migration statistics by viewing the complete survey results here.
Bitmovin to run Live Encoder on Akamai cloud – Televisual
Bitmovin, a leading video streaming software solutions provider, is launching its Live Encoder running on Akamai Cloud Computing.
By reducing data transfer out (DTO) costs, the combination of Bitmovin Live Encoding and Akamai can significantly help to lower operation costs.
Running Bitmovin Live Encoder on Akamai Cloud Computing is intended to help streaming services deliver better live viewing experiences across a host of use cases, including sports/eSports, news, online fitness, eLearning, religious services, large-scale events, corporate communications, and political campaigns, among others. Bitmovin's Live Encoder also supports several ad monetization models, including 24/7 linear television channels and Free Ad-supported Television (FAST) channels.
"Bitmovin can help its live-streaming customers deliver higher-quality viewing experiences, and reduce and better control costs, by running Live Encoder on Akamai Cloud Computing," said Dan Lawrence, Vice President of Cloud Computing at Akamai. "Placing and executing Live Encoder's critical compute functions closer to end users can realize lower-latency streaming while maintaining the high quality of service that consumers have come to expect and demand from streaming providers. It can also help dramatically reduce DTO fees in many cases. Collectively, we believe this meets the industry's desire to continue raising the standards of live streaming, provide lower and more predictable operational costs, and create more opportunities to monetize content."
Bitmovin's Live Encoder has a user interface designed to make it easy for users of all levels to set up live streams quickly, while Bitmovin's API gives developers control over every aspect of the encoding pipeline. Live Encoder is pre-integrated with Akamai Media Services Live to support live-to-VOD and live clipping, which is part of Akamai Connected Cloud to support secure and efficient streaming at massive scale across Akamai's global content delivery network (CDN).
Customers who run Bitmovin's Live Encoder on Akamai will also benefit from pre-integrated third-party solutions for their video streaming workflows, including Videon's LiveEdge contribution encoders; Grass Valley's Agile Media Processing Platform (AMPP) for live production; Zixi for secure transport and ingest; EZDRM for multi-DRM and content encryption; Yospace for Server-side Ad Insertion (SSAI); and more.
"Our Live Encoder elevates live streaming, eliminating sub-par image and audio quality so audiences can enjoy truly immersive live experiences," said Stefan Lederer, CEO and co-founder of Bitmovin. "It's a huge honor to announce our Live Encoder is running on Akamai Cloud Computing, which will help organizations of every size accelerate the quality of their live streaming workflows and deliver world-class viewing experiences."
Bitmovin's Live Encoder runs on Akamai by way of Bitmovin joining the Akamai Qualified Computing Partner (QCP) Program. The program is designed to make solution-based services that are interoperable with Akamai Cloud Computing services easily accessible to Akamai customers. The services are provided by Akamai technology partners that complete a rigorous qualification process to ensure they are readily available to deploy and scale across the globally distributed Akamai Connected Cloud.
Bitmovin will demonstrate its Live Encoding on Akamai Cloud Computing at the 2024 NAB Show in Las Vegas, April 14-17 (Bitmovin exhibitor stand W3013, Akamai meeting space W235LMR).
Cloud Provider Vultr Has Bone To Pick After Reddit Post – CRN
'We do think this person knows better,' chief marketing officer Kevin Cochrane tells CRN. 'We're HIPAA compliant. If our terms of service meant we owned your data, we wouldn't be HIPAA compliant. We're GDPR compliant. If we owned your data, we wouldn't be GDPR compliant.'
Private cloud provider Vultr is clearing the air after a widely viewed Reddit post claimed the company had changed its terms of service in a way that would give it ownership of all of the data stored or used on its network.
"We do think this person knows better," Vultr Chief Marketing Officer Kevin Cochrane told CRN Thursday. "We're HIPAA compliant. If our terms of service meant we owned your data, we wouldn't be HIPAA compliant. We're GDPR compliant. If we owned your data, we wouldn't be GDPR compliant."
Cochrane said that the terms of service cited in the Reddit post referred specifically to content posted to a public message board that has not been active in some time.
"The content that you deploy on Vultr servers is wholly owned by you," said Cochrane.
He said West Palm Beach, Fla.-based Vultr did update its terms of service. However, it was only to notify customers that the company will suspend accounts that have been dormant for two years.
"That's the reason everyone is having to click on this," Cochrane said.
Cochrane said the company believes the Reddit post was designed to spread misinformation after Vultr was among the first cloud providers to offer customers the ability to use Nvidia's GH200 Grace Hopper Superchip with their workloads.
"This is why the terms of service is such a concern for us. We specifically challenge the hyperscalers and other public clouds for using your private data for other business purposes," he said. "Our statement has always been: your private data is your private data."
The Reddit account that created the post has been active for just five days. The post has 1,500 upvotes, which gives it added credibility on Reddit's platform. In it, the original poster claimed that Vultr changed its terms of service to state:
You hereby grant to Vultr a non-exclusive, perpetual, irrevocable, royalty-free, fully paid-up, worldwide license (including the right to sublicense through multiple tiers) to use, reproduce, process, adapt, publicly perform, publicly display, modify, prepare derivative works, publish, transmit and distribute each of your User Content, or any portion thereof, in any form, medium or distribution method now known or hereafter existing, known or developed, and otherwise use and commercialize the User Content in any way that Vultr deems appropriate, without any further consent, notice and/or compensation to you or to any third parties, for purposes of providing the Services to you.
Cochrane said this portion of Vultr's terms of service relates just to messages and content shared on a public discussion forum that Vultr hosts and is not related to the data and apps that customers use on Vultr systems.
"The specific language in the post is, if you post content on one of our public mediums. It was specific to when we had a forum. So if you are posting content on a forum, that forum is owned by us because we have to publicly publish it so other people can see the posts."
He compared the language to tech debt that is no longer needed but was carried forward through newer iterations. To avoid confusion, he said, Vultr is stripping the language from its terms moving forward.
CRN has reached out to the person who penned the Reddit post but had not heard back at press time.
Vultr, a privately held cloud computing platform, has 1.5 million customers across 185 countries. It offers cloud computing infrastructure and resources spanning from bare metal options to GPU compute available on demand.
Backed by parent company Constant, Vultr provides shared and dedicated CPU, block and object storage, Nvidia cloud GPUs, as well as networking and Kubernetes solutions. The company's mission is to make high-performance cloud computing easier to use, affordable, and locally accessible.
Vultr is consistently expanding its data center footprint. In May 2023, it launched Vultr Talon, which offers customers accelerated computing by enabling GPU sharing so multiple workloads can run on a single Nvidia GPU.
Cloud Email Filtering Bypass Attack Works 80% of the Time – Dark Reading
Computer scientists have uncovered a shockingly prevalent misconfiguration in popular enterprise cloud-based email spam filtering services, along with an exploit for taking advantage of it. The findings reveal that organizations are far more open to email-borne cyber threats than they know.
In a paper to be presented at the ACM Web Conference 2024 in Singapore in May, the authoring academic research team noted that services in wide use from vendors such as Proofpoint, Barracuda, Mimecast, and others could be bypassed in at least 80% of major domains that they examined.
The filtering services can be "bypassed if the email hosting provider is not configured to only accept messages that arrive from the email filtering service," explains Sumanth Rao, a doctoral student at the University of California, San Diego, and lead author of the paper, entitled "Unfiltered: Measuring Cloud-based Email Filtering Bypasses."
That might seem obvious, but setting the filters to work in tandem with the enterprise email system is tricky. The bypass attack can happen because of a mismatch between the filtering server and the email server, specifically in how Google and Microsoft email servers react to a message coming from an unknown IP address, such as one that would be used by spammers.
Google's servers reject such a message during its initial receipt, while Microsoft's servers reject it during the "DATA" command, by which point the message content has already been transmitted. This affects how the filters should be set up.
The stakes are high, given that phishing emails remain the initial access mechanism of choice for cybercriminals.
"Mail administrators that don't properly configure their inbound mail to mitigate this weakness are akin to bar owners who deploy a bouncer to check IDs at the main entrance but allow patrons to enter through an unlocked, unmonitored side door as well," says Seth Blank, CTO of Valimail, an email security vendor.
After examining Sender Policy Framework (SPF)-specific configurations for 673 .edu domains and 928 .com domains that were using either Google or Microsoft email servers along with third-party spam filters, the researchers found that 88% of Google-based email systems were bypassed, while 78% of Microsoft systems were.
The risk is higher when using cloud vendors, since a bypass attack isn't as easy when both filtering and email delivery are housed on premises at known and trusted IP addresses, they noted.
The paper offers two major reasons for these high failure rates. First, the documentation for properly setting up both the filtering and email servers is confusing and incomplete, and is often ignored, misunderstood, or not easily followed. Second, many corporate email managers err on the side of making sure that messages reach recipients, for fear of deleting valid ones if they institute too strict a filter profile. "This leads to permissive and insecure configurations," according to the paper.
Not mentioned by the authors, but an important factor, is that truly effective spam blocking requires configuring all three of the main email security protocols: SPF, Domain-based Message Authentication, Reporting and Conformance (DMARC), and DomainKeys Identified Mail (DKIM). But that isn't easy, even for experts. Add that to the challenge of making sure the two cloud services for filtering and email delivery communicate properly, and the coordination effort becomes extremely complex. To boot, the filter and email server products are often managed by two separate departments within larger corporations, introducing yet more potential for errors.
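The permissive configurations the paper describes often show up in the SPF record itself: a record that ends with the softfail "~all" or neutral "?all" qualifier treats unauthorized senders far more leniently than a hard-fail "-all" terminator. As a rough illustration (the domain and include names below are hypothetical placeholders, not any real vendor's records), a minimal strictness check might look like this:

```python
def spf_is_strict(txt_record: str) -> bool:
    """Return True only if an SPF TXT record ends with the hard-fail
    "-all" qualifier, which tells receivers to reject mail from
    senders the record does not authorize."""
    return txt_record.strip().endswith("-all")

# Hypothetical records for illustration; "filtervendor.example" is a placeholder.
strict_record = "v=spf1 include:_spf.filtervendor.example -all"
permissive_record = "v=spf1 include:_spf.filtervendor.example ~all"

print(spf_is_strict(strict_record))      # True
print(spf_is_strict(permissive_record))  # False
```

A real audit would, of course, also verify that the record's include mechanisms actually point at the filtering service, not just check the terminator.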
"Email, like many legacy Internet services, was designed around a simple use case that is now out of step with modern demands," the authors wrote.
The documentation provided by each filtering vendor varies in quality, according to the researchers. The paper points out that the instructions for the filtering products from Trend Micro and Proofpoint are particularly error-prone and can easily produce vulnerable configurations. Even vendors with better documentation, such as Mimecast and Barracuda, still see high rates of misconfiguration.
While most vendors did not respond to Dark Reading's request for comment, Olesia Klevchuk, a product marketing manager at Barracuda, says, "Proper setup and regular 'health checks' of security tools is important. We provide a health-check guide that customers can use to help them identify this and other misconfigurations."
She adds, "most, if not all, email-filtering vendors will offer support or professional services during deployment and after to help ensure that their solution works as it should. Organizations should periodically take advantage and/or invest in these services to avoid potential security risks."
Enterprise email administrators have several ways to strengthen their systems and prevent these bypass attacks from happening. One way, suggested by the paper's authors, is to specify the filtering server's IP address as the sole origin of all email traffic, and to ensure that it can't be spoofed by an attacker.
"Organizations need to configure their email server to only accept email from their filtering service," the authors wrote.
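In practice, "only accept email from their filtering service" amounts to an IP allow-list on the inbound mail gateway. The sketch below shows the core check in Python; the network ranges are documentation placeholders, not any real vendor's published addresses, which a real deployment would substitute in:

```python
import ipaddress

# Placeholder ranges standing in for the filtering service's published
# egress networks (RFC 5737 documentation addresses, not real vendor IPs).
FILTER_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def accept_inbound(peer_ip: str) -> bool:
    """Accept an SMTP connection only if it originates from the
    filtering service, closing the unmonitored side door that the
    bypass attack walks through."""
    addr = ipaddress.ip_address(peer_ip)
    return any(addr in net for net in FILTER_NETWORKS)

print(accept_inbound("203.0.113.45"))  # True: traffic relayed by the filter
print(accept_inbound("192.0.2.10"))    # False: direct, unfiltered sender
```

Production mail servers express the same rule in their own configuration (connection filters, firewall rules, or transport restrictions) rather than application code, but the logic is identical.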
Microsoft's documentation lays out email defense options and recommends setting a series of parameters to enable this protection for Exchange Online deployments, for example. Another step is to ensure that SPF, DKIM, and DMARC protocols are correctly specified for all domains and subdomains used by an enterprise for email traffic. As mentioned, that can be a challenge, particularly for larger companies or organizations that have acquired numerous domains over time and have forgotten about their use.
Finally, another solution, says Valimail's Blank, "is for the filtering application to include Authenticated Received Chain (RFC 8617) email headers, and for the inner layer to consume and trust these headers."
The Shocking Power Problem Behind Cloud Computing and Artificial Intelligence – Channelnomics
Demand for data center capacity is outstripping the supply of electric power, which has the potential to slow the development of cloud computing and artificial intelligence services.
By Larry Walsh
A passing press release at the beginning of the year received little attention, as is often the case, despite predicting substantial sales growth for the vendor and its partners over the next two years. The press release was issued by Vertiv, a power systems manufacturer that provides equipment essential for running servers in data center racks. In touting the success of its acquisitions of E&I Engineering and Powerbar Gulf, Vertiv stated that it expects to more than double its switchgear, busbar, and modular solutions capacity in the next two years.
This is a bold claim, especially considering the shift of most enterprises toward cloud-based infrastructure. While sales of data center products (servers, switches, storage) are projected to increase by about 10% this year, cloud computing sales are expected to jump 22%, and artificial intelligence technologies are forecast to soar by more than 50%.
However, it's the increasing demand for AI and cloud computing that's driving the sales of basic, seemingly conventional technologies such as power conditioning and backup systems. The construction of data centers, whether on-premises or for cloud services, necessitates products like those offered by Vertiv, Eaton, and Schneider Electric.
Yet Vertiv's optimistic outlook reveals a startling problem lurking behind the trend of cloud computing and AI: a lack of power.
Insatiable Power Demand

Earlier this month, the energy industry's leaders convened in Houston for the annual CERAWeek by S&P Global, an event typically centered on electric generation and its associated inputs (oil, gas, coal, renewables). This year's attendees included tech elites such as Microsoft's Bill Gates and Bill Vass, vice president of engineering at Amazon Web Services, who were there to sound the alarm over diminishing electrical capacity and the urgent need for more data centers.
At the event, Vass remarked that the world is adding three new data centers every day, each consuming as much energy as possible, with demand being insatiable. Over the next decade, the United States alone could require new capacity exceeding 100 gigawatts, sufficient to power 82 million homes. This figure doesn't even account for the capacity needed to power new homes and offices, as well as electric-vehicle fleets.
The enormous capacity requirements to power the next-generation cloud and AI era explain why OpenAI CEO Sam Altman proposed a $7 trillion fund to build data center capacity globally.
The U.S. already faces a challenge with electrical production and distribution. The Grid, as it's commonly referred to, is based on century-old technology. Significant portions of the distribution network are decades old and in need of repair, with parts of the country already experiencing brownouts and periodic disruptions because capacity can't keep up with demand.
Other developed regions encounter similar issues. Germany, for instance, became heavily reliant on Russian gas after decommissioning all of its nuclear power plants. Now, with the war in Ukraine disrupting energy supplies, Germany is compelled to reactivate conventional power plants to meet power demands.
AI Is Making the Problem Worse

The development of AI will only intensify this issue. Data centers, already known as heat blooms due to their high energy consumption, will become furnaces as massive computational processes consume more electrons. Manufacturers of servers and storage hardware are already cautioning partners and customers about the pitfalls of low-cost but power-intensive alternatives.
At CERAWeek, Gates, an advocate for sustainability, stated that the success and profitability of a data center hinge on the cost of its inputs. If electricity costs are too high, data centers will struggle to turn a profit without passing costs onto consumers.
Given that vendors sell cloud and Software-as-a-Service (SaaS) contracts on a multi-year basis, passing on costs is complicated. If a series of data centers proves unprofitable, costs will rise across the board to compensate.
Constructing more data centers isn't a straightforward solution. Real estate services firm CBRE Group reports that data center construction timelines are being delayed two to six years due to electrical capacity issues.
Cloud vendors are establishing new data centers near sustainable energy sources, which doesn't always align with population or commercial needs. Moreover, building new power-generation facilities (conventional or sustainable) is slowed by regulatory reviews and local opposition.
Sustainability: A Solution?

Balancing new data center capacity with electrical consumption needs will become a contentious issue for the technology industry, which is largely committed to sustainability goals. Microsoft aims to be carbon-neutral by 2035, and many other technology vendors are pursuing similar objectives to reduce their carbon footprint, which includes minimizing their consumption of materials and resources such as electricity.
While sustainable energy sources such as wind and solar may appear to be the solution, constructing the necessary infrastructure is as challenging as establishing a coal-fired power plant.
The power issue underlying cloud computing and AI is alarming and could hinder sales and growth, at least in the short term. Over time, vendors will improve the power efficiency of AI systems.
In the interim, vendors and partners must emphasize sustainability and efficiency as key selling points to their cloud computing, AI, and infrastructure customers. Power consumption deserves a prominent role in the total-cost-of-ownership equation, illustrating to customers the full expense of opting for cheaper but less efficient product options.
In the long term, the technology and energy sectors are likely to find a solution to this power dilemma. The remarkable aspect of technology is that it often becomes the solution to the problems it creates. For the present, though, vendors and solution providers must manage market expectations carefully.
The technology industry has excelled at convincing everyone that anything is possible with cloud computing. It's now doing the same with AI. Anything is indeed possible, provided the lights remain on.
Larry Walsh is the CEO, chief analyst, and founder of Channelnomics. He's an expert on the development and execution of channel programs, disruptive sales models, and growth strategies for companies worldwide.
The 3 Best Cloud Computing Stocks to Buy in Q2 2024 – InvestorPlace
If you're considering the best cloud computing stocks to buy, you might want to read a February article from InfoWorld contributor David Linthicum that details why companies are leaving the cloud.
Linthicum is a computer industry expert who has not only written numerous books about it (his latest, published in 2023, focuses on cloud computing) but also has extensive industry experience, having served in roles such as CTO and CEO of several software companies.
His argument about why UK companies are moving their cloud-based workloads back to on-premises infrastructure is very straightforward.
"The cloud is a good fit for modern applications that leverage a group of services, such as serverless, containers, or clustering. However, that doesn't describe most enterprise applications," Linthicum wrote on Feb. 9.
While public cloud providers are losing business as enterprises return to on-premises infrastructure, they'll gain hugely from generative AI applications and data.
With this in mind, here are three of the best cloud computing stocks to buy.
Microsoft (NASDAQ:MSFT) is one of the world leaders in cloud computing through Azure, the company's cloud computing platform. In Q2 2024, Microsoft's Cloud revenue was $33.7 billion, 24% higher than in Q2 2023, with a fair chunk of it from Azure. For example, Microsoft's Azure AI had 53,000 customers at the end of the second quarter, CEO Satya Nadella said during its conference call, one-third of them completely new to Azure.
I'm pretty confident that Linthicum is 500 times brighter than I am regarding anything computer-related. However, based on what he wrote in his InfoWorld commentary, I would bet dollars to donuts (and he'd agree) that Microsoft will be one of the long-term beneficiaries of generative AI and the public cloud.
I've read articles about Microsoft making lots of money while customers gain little.
Fortune reported comments from GitHub COO Kyle Daigle in early February about AI.
Daigle said the companies finding the most success implementing Copilot are those that are integrating it into their workflows and not just adding an AI button or chatbot window because it's the hot thing to do, Fortune contributor Sage Lazzaro wrote on Feb. 1.
Microsoft will figure out how to make its clients happy.
Amazon (NASDAQ:AMZN) has a massive cloud business through AWS. Nashville-based eWeek.com recently rated the top 20 generative AI companies of 2024, and Amazon made the list. Not surprisingly, many of the names were private companies; expect many of them to go public or be acquired in the next few years.
eWeek said Amazon was the best for generative AI as a service.
"AWS's customers for generative AI range from small startups to major enterprises and brands like Intuit, Nasdaq, Adidas, and GoDaddy. In addition to its managed services and knowledgeable in-house support specialists, customers can benefit from a diverse partner network and the AWS Marketplace," eWeek stated on March 14.
It offers four AI solutions: Amazon Bedrock, Amazon SageMaker, Amazon Q, and Amazon CodeWhisperer. I won't be checking out the last one anytime soon, but I digress.
Amazon also invested another $2.5 billion in Anthropic, bringing its total investment in the AI startup to $4.0 billion. Anthropic uses AWS for the cloud and some of the company's specialized computing chips. Expect more developments in the future.
What I love about Amazon is that it's always looking for large revenue generators for the long haul. It's willing to spend obscene money if it sniffs a massive market. They don't get much bigger than generative AI.
Just as it's turned advertising into a significant revenue generator, it will likely do the same with AI sooner than investors realize.
When I first saw the name C3.ai (NYSE:AI), I thought it was a fly-by-night operation. Investors still doubt whether it's got the right stuff when it comes to AI. This apprehension is reflected in the share price. It's down more than 5% in 2024 and up just 6% over the past year, despite AI being the hottest thing since sliced bread.
Even when it reported better-than-expected third-quarter results at the end of February, it couldn't hang on to the 25% single-day gain on the news. The day before announcing earnings, its share price closed at $29.69. It's down 8.8% in the month since.
However, Siebel's got the company focused squarely on generative AI. The company mentions the two words 18 times in its Q3 2024 press release.
The company's press release stated, "C3 Generative AI continues to gain traction with organizations that rely on technology solutions to produce accurate information and process highly sensitive data."
Long story short, it continues to grow revenues by double digits each quarter while losing millions on a non-GAAP basis.
It's the riskiest of the three cloud computing stocks. If you're an aggressive investor, the risk/reward proposition suggests it's worth a small bet.
On the date of publication, Will Ashworth did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.
Will Ashworth has written about investments full-time since 2008. Publications where he's appeared include InvestorPlace, The Motley Fool Canada, Investopedia, Kiplinger, and several others in both the U.S. and Canada. He particularly enjoys creating model portfolios that stand the test of time. He lives in Halifax, Nova Scotia.
The 3 Best Cloud Computing Stocks to Buy in Q2 2024 - InvestorPlace
Federal IT Budget for cloud computing hits $8.3bn in FY 2025 – report – DatacenterDynamics
Cloud computing spending in Fiscal Year (FY) 2025 could reach $8.3bn for federal civilian agencies.
According to research published by government contract database firm GovWin IQ as part of its Federal Market Analysis, the total budgets for cloud computing technology have almost doubled since 2020.
Actual cloud spending in 2020 reached $4.4bn, while the requested budget for 2025 is as much as $8.3bn.
A notable jump occurred between FY 2023 and FY 2024. Previously, budgets had been growing by around $400 million per year; in this period, however, the total jumped by $2.2bn.
GovWin IQ noted that federal civilian agencies stopped reporting on their spend on cloud computing "several years ago," but that when still published, the results tended to be "wildly inconsistent" with the verified cloud contract spending data that the Federal Market Analysis would find.
To get around this lack of transparency, GovWin IQ ran all of its "cloud market keywords" against the program descriptions listed in the IT portfolio which can be accessed via the government's IT Dashboard. Through this process, those civilian agency programs using cloud technology or planning to in the next year can be viewed.
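In spirit, this keyword-screening methodology is a simple text filter over program descriptions. A minimal sketch of the idea, using entirely hypothetical keywords and program records rather than GovWin IQ's actual keyword list or the IT Dashboard data, might look like:

```python
# Hypothetical sketch of screening program descriptions for cloud-related terms.
# The keywords and records below are illustrative, not GovWin IQ's actual data.
CLOUD_KEYWORDS = ["cloud", "saas", "iaas", "paas", "aws", "azure"]

programs = [
    {"agency": "Treasury", "description": "Migrate tax systems to a cloud platform"},
    {"agency": "VA", "description": "Modernize scheduling software"},
    {"agency": "HHS", "description": "Host analytics workloads on AWS"},
]

def uses_cloud(description: str) -> bool:
    """Return True if any cloud keyword appears in the description (naive substring match)."""
    text = description.lower()
    return any(keyword in text for keyword in CLOUD_KEYWORDS)

cloud_programs = [p for p in programs if uses_cloud(p["description"])]
for p in cloud_programs:
    print(p["agency"])
```

A naive substring match like this can miss programs that never mention the cloud at all, which is exactly the blind spot GovWin IQ describes below in the VA's case.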
Across FY 2023 to 2025, the civilian agency with the largest cloud computing budget was the Treasury, at $5.054bn, followed by the Department of Health and Human Services (HHS) at $2.686bn. Of the ten agencies shared in the research, the Department of Veterans Affairs had the smallest budget at $718m.
GovWin IQ remarked on this smaller budget: "The VA has been among the civilian agencies spending the most on cloud computing annually for the last several years, and yet here was the VA coming in tenth compared to smaller agencies such as the Social Security Administration."
GovWin suggested that this demonstrates the limitations in the reported data by federal agencies, and that there could be "dozens of VA investments" that use cloud technology yet fail to mention the cloud or other solutions known to be cloud-based, thus not being shown in the results.
Similarly, GovWin IQ pointed to the Treasury leading the pack as a surprising outcome, as the Treasury's cloud journey has been "long and slow." That has changed in the last couple of years, according to GovWin IQ.
"The Treasury ramped up its budget for cloud services from $515m in FY 2023 to $2.2bn in FY 2024," noted the report.
"This jump partially explains the rise in the total market from FY 2023 to 2024. In FY 2025, the Treasury then requested an additional $2.4bn, illustrating how it is making a full-on enterprise push into the cloud."
Overall, despite gaps in reporting, the data suggests massive moves toward the cloud across federal agencies.
SC Ventures Leads Hive’s €12M Series A, Enabling Sustainable Distributed Cloud Computing for the Masses – PR Newswire
Hive's technology aims to transform 70% of the world's unused device capacity into a global supercomputer
GENEVA, March 29, 2024 /PRNewswire/ -- SC Ventures, Standard Chartered's innovation, fintech investment and ventures arm, is leading a €12 million (US$13 million) Series A round for distributed cloud provider Hive, to increase access to sustainable, high-powered computing resources for businesses and individuals. OneRagtime, a French venture capital fund that led Hive's Seed round, and a collection of private investors also joined the round.
Hive is reinventing the cloud from a centralized model that uses expensive physical servers to a distributed cloud infrastructure that aggregates individual devices' unused hard drive and computing capacities. Hive's model helps businesses efficiently manage their cloud-related expenses, reduce dependency on a select few cloud providers, and significantly reduce cloud energy use. In 2023, global data centres, which power the world's cloud, required 7.4 gigawatts of power, a 55% increase from 2022. Currently, data centres account for up to 3% of global electricity consumption, with projections suggesting this could rise to 4% by 2030.
"Hive is addressing the pressing need for a new cloud paradigm that democratizes access, lowers financial barriers, and encourages innovation," said David Gurlé, Hive Founder. "With over 70% of the computing power available in our devices and billions of devices connected to the Internet, Hive's community-driven model builds 'The Right Cloud' to offer a greener, more resilient network and secure alternative that also promotes a more equitable cloud solution. We thank our investors, as well as INRIA and Bpifrance, for their continuous support as we look to achieve our ambitious goals."
Since October 2023, Hive has amassed over 25,000 active users and contributors from 147 countries, who store their files on hiveDisk and contribute a portion of their unused hard drive to hiveNet to lower their subscription costs and effectively build the distributed cloud. The computing capacity contributed to hiveNet also powers hiveCompute, allowing companies to manage workloads such as running GenAI inference, video processing, and 3D modelling. HiveNet's architecture provides access to additional CPU, GPU, or NPU capacity when needed, boosting much-needed computing power. Companies seeking more control can also build their own private hiveNet, where IT managers retain full control over the devices.
In December, Hive unveiled a Joint Development Partner (JDP) initiative, working closely with key partners to innovate the cloud landscape for businesses leveraging GenAI LLM computations.
"We are big believers of Hive's distributed cloud technology that will enable cheaper and more efficient access to computing power and storage, a critical point when most of our ventures may have an AI component requiring increasing such computing power," said Alex Manson, who heads SC Ventures. "In addition to our investment, our ventures will be leveraging Hive's services."
"Cloud technology has opened up horizons of innovation, but it also comes with challenges in terms of costs, security, data privacy, and environmental impact, heightened by the increasing demand for computing resources, especially for artificial intelligence," said Stéphanie Hospital, Founder & CEO at OneRagtime. "Hive, with its pioneering approach to distributed cloud, makes cloud access more secure, affordable, and efficient for everyone, and enables the sharing of computational power resources. As an early investor and believer, OneRagtime is particularly excited to support Hive's vision and team."
Hive is a champion of sustainable technological progress, offering a practical solution to the challenges posed by traditional cloud computing models. With its latest funding round, Hive's sights are set on growing its team and global footprint, with a focus on addressing the enterprise markets starting with startups and SMBs. The team is prioritizing several areas of the business, including product development, building an engaged community of contributing Hivers, and sales and marketing efforts to reach users at scale.
About Hive
Hive is revolutionizing the digital world by bringing the power of distributed cloud computing directly to the masses, providing everyone with the tools to innovate, secure their data, and contribute to a greener planet. Through hiveNet, hiveDisk, hiveCompute, and many other applications to come, Hive is aggregating latent computing resources from a community of interconnected computers to provide the market with easy-to-implement solutions that will unleash the power of distributed cloud to the masses, all while minimizing environmental impact. Hive's Joint Development Partner (JDP) program develops innovations in the cloud computing landscape for businesses leveraging GenAI LLM computations. Together, Hive is shaping a technological future that equitably uplifts every community.
To learn more about Hive, please visit http://www.hivenet.com
About SC Ventures
SC Ventures is a business unit that provides a platform and catalyst for Standard Chartered to promote innovation, invest in disruptive financial technology and explore alternative business models.
For more information, please visit http://www.scventures.io and follow SC Ventures on LinkedIn.
About Standard Chartered
We are a leading international banking group, with a presence in 53 of the world's most dynamic markets and serving clients in a further 64. Our purpose is to drive commerce and prosperity through our unique diversity, and our heritage and values are expressed in our brand promise, here for good.
Standard Chartered PLC is listed on the London and Hong Kong Stock Exchanges.
For more stories and expert opinions please visit Insights at sc.com. Follow Standard Chartered on Twitter, LinkedIn, Instagram and Facebook.
About OneRagtime
OneRagtime is a venture capital platform founded by Stéphanie Hospital and Jean-Marie Messier. Since 2017, OneRagtime has sourced, financed, and scaled 40+ start-ups. With its unique platform model, OneRagtime allows its investor community to invest in the most-promising French and European startups through its funds or dedicated club deals, while giving entrepreneurs an unparalleled network and business acceleration.
Photo - https://mma.prnewswire.com/media/2375182/Hive.jpg Logo - https://mma.prnewswire.com/media/2375183/Hive_Logo.jpg
SOURCE Hive
Kingswood Wealth Advisors LLC Invests $264000 in First Trust Cloud Computing ETF (NASDAQ:SKYY) – Defense World
Kingswood Wealth Advisors LLC bought a new position in shares of First Trust Cloud Computing ETF (NASDAQ:SKYY) in the fourth quarter, according to the company's most recent 13F filing with the Securities & Exchange Commission. The fund bought 3,013 shares of the company's stock, valued at approximately $264,000.
A number of other institutional investors and hedge funds also recently made changes to their positions in the company. Raymond James Financial Services Advisors Inc. raised its position in shares of First Trust Cloud Computing ETF by 21.4% during the fourth quarter. Raymond James Financial Services Advisors Inc. now owns 138,730 shares of the company's stock worth $12,162,000 after acquiring an additional 24,473 shares during the last quarter. Raymond James & Associates raised its position in shares of First Trust Cloud Computing ETF by 5.7% during the fourth quarter. Raymond James & Associates now owns 147,672 shares of the company's stock worth $12,946,000 after acquiring an additional 7,914 shares during the last quarter. Premier Path Wealth Partners LLC acquired a new stake in shares of First Trust Cloud Computing ETF during the fourth quarter worth $244,000. City Holding Co. grew its stake in shares of First Trust Cloud Computing ETF by 4.2% during the fourth quarter. City Holding Co. now owns 15,875 shares of the company's stock worth $1,392,000 after purchasing an additional 635 shares during the period. Finally, Quad Cities Investment Group LLC acquired a new stake in shares of First Trust Cloud Computing ETF during the fourth quarter worth $220,000.
Shares of First Trust Cloud Computing ETF stock opened at $95.60 on Friday. First Trust Cloud Computing ETF has a 12 month low of $60.65 and a 12 month high of $97.78. The business has a fifty day simple moving average of $93.85 and a 200 day simple moving average of $85.01. The stock has a market cap of $3.18 billion, a PE ratio of 20.90 and a beta of 1.06.
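For readers unfamiliar with the metric, a simple moving average is just the mean of the last N closing prices, recomputed each day as the window slides forward. A quick sketch, using made-up prices rather than actual SKYY data:

```python
# A simple moving average (SMA) is the mean of the most recent `window` closes.
# Prices below are made-up for illustration, not actual SKYY closing prices.
def simple_moving_average(closes, window):
    """Mean of the last `window` entries in `closes`."""
    if len(closes) < window:
        raise ValueError("not enough data points for this window")
    return sum(closes[-window:]) / window

closes = [90.0, 92.0, 94.0, 96.0]  # hypothetical daily closes, oldest first
print(simple_moving_average(closes, 2))  # mean of the last two closes: 95.0
```

A 50-day SMA above the 200-day SMA, as in the figures quoted here, simply means recent closes have averaged higher than the longer-term average.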
The First Trust Cloud Computing ETF (SKYY) is an exchange-traded fund based on the ISE Cloud Computing index. The fund tracks an index of companies involved in the cloud computing industry. Holdings are modified equal-weighted, with individual stocks capped at 4.5%. SKYY was launched on Jul 5, 2011 and is managed by First Trust.