Category Archives: Cloud Hosting

AWS is investing billions in one of its biggest US cloud regions – TechRadar

Top cloud storage and hosting giant Amazon Web Services (AWS) has pledged $11 billion towards the construction of several new datacenters in the US Midwest, in what the company is calling the largest capital investment in the state's history.

In a press release announcing the move, AWS said the deal is set to include $7 million worth of funding for community projects in and around St Joseph County, outside South Bend, Indiana, already an established base for Amazon's cloud division, which has 26,000 employees in the state.

The extent of the work is as yet unclear, but, as The Register points out, hyperscale data centers typically cost in the region of $1 billion, suggesting that AWS's operation is about to see considerable expansion.

Yes, it's likely that Amazon is planning to integrate some form of AI into its data centers, with CEO Andy Jassy discussing the generative AI capabilities of its AWS Trainium2 chips at length in his letter to shareholders earlier in April 2024. But when everyone's at it, that's not interesting.

What interested (and pleasantly surprised) us is the extent of Amazon's community investment in the area. Beyond simple things like renovating roads and bringing STEAM programmes to schools in the area, it also plans to bring its Fiber Optic Fusion Splicing Workshop, which trains students to install fibre cabling and connects them with local employers, to the area.

It also plans to start an Information Infrastructure Workshop which, it claims, is designed to help students, educators and workforce leaders better understand the physical layer of cloud computing and our information economy.

On its own, this could be seen as PR bluster, but Indiana's Governor, Eric Holcomb, seems optimistic about Amazon's role in developing the economy of, *checks notes*, the Hoosier state.


"Amazon has long been an important economic partner in Indiana, and we are excited to welcome AWS," he said.

"This significant investment solidifies Indiana's leadership position in the economy of the future, and will undoubtedly have a positive ripple effect on the town of New Carlisle, the north central region and the state of Indiana for years to come."


As cloud computing evolves, Amazon’s AWS looks for its role in generative AI – TechSpot

One of the intriguing aspects of the generative AI phenomenon is the vast range of vendors that are offering solutions to leverage the new technology. Companies eager to deploy GenAI face a complex landscape filled with foundation model suppliers, AI platform companies, data management vendors, and model customization tool providers.

Surprisingly, the big cloud computing companies that have dominated the IT landscape for the last decade haven't played as central a role as many initially expected. At least, not yet.

But there are signs that situation could be changing. Google recently held its Cloud Next event, where it unveiled a wide range of new AI integrations for Google Workspace, a GenAI tool for video creation and editing, and other enhancements aided by its Gemini 1.5 Pro large language model.

Now it's Amazon AWS taking the wraps off a host of new features and improvements for its Bedrock GenAI managed service, which are designed to make the process of selecting and deploying the right tools for GenAI applications much easier. Amazon is adding the ability to import customized foundation models into the service and then allow companies to leverage the capabilities of Bedrock across those custom models.

For example, companies that have trained an open-source model like Llama or Mistral with their own data, possibly using Amazon's SageMaker, can now integrate that customized model along with the existing standardized models within Bedrock. This integration allows the use of a single API to build applications that utilize both customized and existing Bedrock model options, including those from AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon's Titan models.
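The "single API" point above can be sketched in code. The helper below builds the request for a `bedrock-runtime` `invoke_model` call; the application code is identical whether the model ID names a built-in Bedrock model or an imported custom one. The model IDs, account ARN, and request-body fields here are illustrative placeholders, not endpoints confirmed by the article.

```python
import json

def build_invoke_request(model_id: str, prompt: str, max_tokens: int = 256) -> dict:
    """Build the kwargs for a bedrock-runtime invoke_model call.

    The body schema varies by model family; this uses a Llama-style payload
    as an illustrative assumption.
    """
    return {
        "modelId": model_id,
        "contentType": "application/json",
        "accept": "application/json",
        "body": json.dumps({"prompt": prompt, "max_gen_len": max_tokens}),
    }

# Same application code, two different model IDs: one built-in model,
# one hypothetical imported custom model (placeholder ARN).
requests = [
    build_invoke_request("meta.llama3-8b-instruct-v1:0",
                         "Summarize our Q3 results."),
    build_invoke_request("arn:aws:bedrock:us-east-1:123456789012:imported-model/my-tuned-llama",
                         "Summarize our Q3 results."),
]

if __name__ == "__main__":
    # With AWS credentials configured, each request would be sent as:
    #   import boto3
    #   client = boto3.client("bedrock-runtime")
    #   response = client.invoke_model(**requests[0])
    for r in requests:
        print(r["modelId"])
```

The point of the pattern is that swapping models becomes a one-string change rather than a new integration.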

Amazon also introduced version 2 of its Titan Text Embeddings model, which has been specifically optimized for RAG (Retrieval-Augmented Generation) applications, and announced the general availability of its Titan Image Generator model.

The ability to import custom models into Bedrock supports the integration of RAG functionalities, facilitating continuous fine-tuning of models with new data. Bedrock's serverless infrastructure also supports scalable performance across AWS instances, which aids in managing real-time demands. Furthermore, Bedrock includes tools for developing AI-powered agents capable of performing multi-step tasks. Agents are currently one of the hottest discussion topics in GenAI, so these kinds of capabilities are bound to be of interest to those organizations that want to stay on the cutting edge.
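The RAG pattern mentioned above reduces to a simple loop: retrieve the stored document most relevant to a query, then prepend it to the prompt sent to the model. The toy sketch below uses a bag-of-words similarity as a stand-in for a real embeddings model (such as Titan Text Embeddings); the documents and scoring are illustrative.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in for a real embeddings model: a bag-of-words vector.
    tokens = text.lower().replace("?", " ").replace(".", " ").split()
    return Counter(tokens)

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str]) -> str:
    """Return the stored document most similar to the query."""
    q = embed(query)
    return max(docs, key=lambda d: cosine(q, embed(d)))

def build_rag_prompt(query: str, docs: list[str]) -> str:
    # Augment the prompt with retrieved context before calling the model.
    context = retrieve(query, docs)
    return f"Context: {context}\n\nQuestion: {query}"

docs = [
    "Bedrock is a managed service for foundation models.",
    "SageMaker is used to train and fine-tune models.",
]
print(build_rag_prompt("What is Bedrock?", docs))
```

In a production deployment the retrieval step would query a vector store of embedded documents, and the built prompt would be passed to a Bedrock model; the structure of the loop is the same.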

Additionally, Amazon has introduced new Guardrails for Bedrock, which add enhanced filtering features to prevent the creation and dissemination of inappropriate content and sensitive information. While existing models already incorporate basic content filtering, these new Guardrails offer an extra layer of customizable protection.
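A guardrail layer of the kind described can be thought of as a customisable filter sitting between the application and the model's own moderation. The sketch below shows the general shape: deny-listed topics plus pattern-based redaction of sensitive strings. The topics and patterns are illustrative assumptions, not Bedrock's actual rules or API.

```python
import re

# Illustrative deny-list and redaction pattern; a real guardrail service
# exposes these as configurable policies rather than hard-coded values.
DENIED_TOPICS = {"weapons", "malware"}
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def apply_guardrails(text: str) -> tuple[bool, str]:
    """Return (blocked, sanitized_text).

    Blocks outright on denied topics; otherwise redacts sensitive
    patterns such as US Social Security numbers.
    """
    lowered = text.lower()
    if any(topic in lowered for topic in DENIED_TOPICS):
        return True, ""
    return False, SSN_PATTERN.sub("[REDACTED]", text)

print(apply_guardrails("My SSN is 123-45-6789"))
```

Running the same check on both the user's input and the model's output is what gives the "extra layer" on top of a model's built-in filtering.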

Amazon's Model Evaluation tool within Bedrock, now generally available, assists organizations in selecting the most suitable foundation model for their specific needs. This tool compares various models based on accuracy and robustness, and allows customization to evaluate how different models perform with user-specific data and prompts.

As organizations navigate the early stages of GenAI deployments, many are realizing the importance of locating their GenAI software and services near their data sources. With a significant amount of data hosted on AWS, the new features in Bedrock could prove particularly attractive to companies looking to enhance their GenAI capabilities.

We may also witness the emergence of multi-platform GenAI deployments. Just as companies have found benefits in using multiple cloud providers, they are likely to adopt a similar strategy with GenAI platforms, using different platforms for various applications.

The race is still on, but it's clear that all the major cloud computing providers want to be (and will be) important entrants in the GenAI landscape as well.

Bob O'Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on Twitter @bobodtech.


Akash Network to Be Listed on Upbit – TradingView

Coindar

Upbit will list Akash Network (AKT) on April 23rd.


AKT Info

Akash Network is a decentralized cloud platform for hosting web applications and other technology solutions. The project was created to eliminate the problems associated with centralized cloud services such as Amazon Web Services (AWS), Google Cloud and others.

Compared to traditional cloud services, Akash Network is more energy-efficient. The network's consensus mechanism is based on Proof-of-Stake, which is considered to be more environmentally friendly than Proof-of-Work used by many other blockchain networks.

Akash Network's global marketplace ensures that AI developers and researchers worldwide have equal access to GPU resources, irrespective of their geographical location.

By providing an efficient, secure, and cost-effective alternative for AI hosting through its GPU marketplace, Akash Network is not only revolutionizing cloud computing but also making a substantial impact on the rapidly growing field of artificial intelligence.

In the Akash network, cloud hosting resources are distributed among network participants. Participants can offer their unused computing resources such as bandwidth, memory, and CPU time, which can then be rented for hosting applications and services. This makes cloud services more accessible and democratizes access to cloud computing.

AKT is the native cryptocurrency token of Akash Network. It is integral for securing the network, executing transactions and contracts, and incentivizing community participation through staking and rewards.


Explained | What is Project Nimbus and why are Google employees protesting against it? – Deccan Herald

Google has fired around 50 employees so far after some of its staff participated in protests against the company's cloud contract with the Israeli government called Project Nimbus.

What is Project Nimbus?

Project Nimbus is a $1.2 billion contract awarded to Google and Amazon.com in 2021 to supply the Israeli government with cloud services. The aim of the project is to provide Israel with public cloud services in order to address challenges in various sectors of the country like healthcare, transportation, and education.

The execution of this project will allow Israel to perform large-scale data analysis, AI training, database hosting, and other forms of powerful computing using Google's technology, as per a report in Business Today.

Meanwhile, government employees and senior leaders of the country will also get training to enhance their digital skills.

Why are Google employees in the US protesting?

Google employees organised sit-ins at Google offices in New York City, California and Seattle against the company's contract with the Israeli government.

The protests are being led by No Tech For Apartheid, which has been organising demonstrations against Project Nimbus since 2021. The Google employees are opposing their employer's ties with Israel, which is currently at war in Gaza.

The protesting faction says that Project Nimbus supports the development of military tools by the Israeli government.

In a statement on Medium, Google workers affiliated with the No Tech for Apartheid campaign called it a "flagrant act of retaliation" and said that some employees who did not directly participate in the protests were also among those Google fired.

"Google workers have the right to peacefully protest about terms and conditions of our labor," the statement added.


How has Google reacted?

Google CEO Sundar Pichai issued a public warning to those protesting in the company and urged employees not to engage in politics.

"We have a culture of vibrant, open discussion that enables us to create amazing products and turn great ideas into action. That's important to preserve. But ultimately we are a workplace and our policies and expectations are clear: this is a business, and not a place to act in a way that disrupts coworkers or makes them feel unsafe, to attempt to use the company as a personal platform, or to fight over disruptive issues or debate politics. This is too important a moment as a company for us to be distracted," he wrote in his blog.

In its statement, Google maintained that the Nimbus contract "is not directed at highly sensitive, classified, or military workloads relevant to weapons or intelligence services."

(With Reuters inputs)

(Published 24 April 2024, 13:23 IST)


You Can Get a Lifetime of iBrave Web Hosting on Sale for $80 Right Now – Lifehacker

We may earn a commission from links on this page. Deal pricing and availability subject to change after time of publication.

You can get a lifetime subscription to iBrave Cloud Web Hosting on sale for $79.99 (reg. $899.10) using the promo code ENJOY20 through April 16 at 11:59 p.m. PT. iBrave's lifetime web hosting subscription gives you access to a control panel equipped with 80 one-click install apps for platforms like WordPress, Magento, and Joomla, and it comes with daily backups, unlimited SSD storage, monthly bandwidth, MySQL databases (limited to 1024 megabytes), and custom email addresses. It doesn't include domain names, but you can buy a new domain or use an existing one you already own.



Future-proof your business: cloud storage without the climate cost – CloudTech News

With over half of all corporate data held in the cloud as of 2022, demand for cloud storage has never been higher. This has triggered extreme energy consumption throughout the data centre industry, leading to hefty greenhouse gas (GHG) emissions.

Worryingly, the European Commission now estimates that by 2030, EU data centre energy use will increase from 2.7% to 3.2% of the Union's total demand. This would put the industry's emissions almost on par with pollution from the EU's international aviation.

Despite this, it must be remembered that cloud storage is still far more sustainable than the alternatives.

Why should we consider cloud storage to be sustainable?

It's important to put the energy used by cloud storage into context and consider the savings it can make elsewhere. Thanks to file storage and sharing services, teams can collaborate and work wherever they are, removing the need for large offices and everyday commuting.

As a result, businesses can downsize their workspaces as well as reduce the environmental impact caused by employees travelling. In fact, it's estimated that working from home four days a week can reduce nitrogen dioxide emissions by around 10%.

In addition, cloud storage reduces reliance on physical, on-premises servers. For small and medium-sized businesses (SMBs), having on-site servers or their own data centres can be expensive, whilst running and cooling the equipment requires a lot of energy, which means more CO2 emissions.

Cloud servers, on the other hand, offer a more efficient alternative. Unlike on-premises servers that might only be used to a fraction of their capacity, cloud servers in data centres can be used much more effectively. They often operate at much higher capacities, thanks to virtualisation technology that allows a single physical server to act as multiple virtual ones.

Each virtual server can be used by different businesses, meaning fewer physical units are needed overall. This means less energy is required to power and cool, leading to a reduction in overall emissions.
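A back-of-the-envelope calculation makes the consolidation argument concrete. The utilisation and power figures below are illustrative assumptions, not measurements from any provider: the point is only that packing lightly-loaded workloads onto well-utilised virtualised hosts cuts the number of physical machines, and hence the power draw, by a large factor.

```python
import math

ON_PREM_UTIL_PCT = 15    # assumed share of capacity an on-prem box actually uses
CLOUD_UTIL_PCT = 75      # assumed target utilisation of a virtualised cloud host
WATTS_PER_SERVER = 400   # assumed draw per physical server

def servers_needed(workloads: int, load_pct: int, target_pct: int) -> int:
    """Physical hosts needed to carry `workloads` at `target_pct` utilisation."""
    return math.ceil(workloads * load_pct / target_pct)

workloads = 100  # 100 businesses, each with one lightly-loaded server's worth of work
on_prem_servers = workloads  # one dedicated box per business
cloud_servers = servers_needed(workloads, ON_PREM_UTIL_PCT, CLOUD_UTIL_PCT)

print(f"on-prem:     {on_prem_servers} servers, {on_prem_servers * WATTS_PER_SERVER} W")
print(f"virtualised: {cloud_servers} servers, {cloud_servers * WATTS_PER_SERVER} W")
```

Under these assumptions, 100 dedicated boxes collapse to 20 shared hosts, a five-fold reduction in machines to power and cool.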

Furthermore, on-premises servers often have higher storage and computing capacity than needed just to handle occasional spikes in demand, which is an inefficient use of resources. Cloud data centres, by contrast, pool large amounts of equipment to manage these spikes more efficiently.

In 2022, the average power usage effectiveness of data centres improved. This indicates that cloud providers are using energy more efficiently and helping companies reduce their carbon footprint with cloud storage.

A sustainable transition: three steps to create green cloud storage

Importantly, there are ways to further improve the sustainability of services like cloud storage, which could translate to energy savings of 30-50% through greening strategies. So, how can ordinary cloud storage be turned into green cloud storage? We believe there are three fundamental steps.

Firstly, businesses should carefully consider location. This means choosing a cloud storage provider that's close to a power facility, because distance matters: if electricity travels a long way between generation and use, a proportion is lost. In addition, data centres located in cooler climates or underwater environments can cut down on the energy required for cooling.

Next, businesses should quiz green providers about what they're doing to reduce their environmental impact. For example, powering their operations with wind, solar or biofuels minimises reliance on fossil fuels, thereby lowering GHG emissions. Some facilities will house large battery banks to store renewable energy and ensure a continuous, eco-friendly power supply.

Last but certainly not least, technology offers powerful ways to enhance the energy efficiency of cloud storage. Some providers have been investing in algorithms, software and hardware designed to optimise energy use. For example, introducing frequency scaling or AI and machine learning algorithms can significantly improve how data centres manage power consumption and cooling.

For instance, Google's use of its DeepMind AI has reduced its data centre cooling bill by 40%, a prime example of how intelligent systems can work towards greater sustainability.

At a time when the world is warming up at an accelerating rate, selecting a cloud storage provider that demonstrates a clear commitment to sustainability can have a significant impact. In fact, major cloud providers like Google, Microsoft and Amazon have already taken steps to make their cloud services greener, such as by pledging to move to 100 per cent renewable sources of energy.

Cloud storage without the climate cost

The cloud's impact on businesses is undeniable, but our digital growth risks an unsustainable future with serious environmental consequences. However, businesses shouldn't have to choose between innovation and the planet.

The answer lies in green cloud storage. By embracing providers powered by renewable energy, efficient data centres, and innovative technologies, businesses can reap the cloud's benefits without triggering a devastating energy tax.

The time to act is now. Businesses have a responsibility to choose green cloud storage and be part of the solution, not the problem. By making the switch today, we can ensure the cloud remains a convenient sanctuary, not a climate change culprit.




On Cloud Computing And Learning To Say No – Hackaday

Do you really need that cloud hosting package? If you're just running a website, no matter whether large or very large, you probably don't and should settle for basic hosting. This is the point that [Thomas Millar] argues, taking the reader through an example of a big site like Business Insider and their realistic bandwidth needs.

From a few stories on Business Insider, the HTML itself comes down to about 75 kB compressed, so for their approximately 200 million visitors a month they'd churn through 30 TB of bandwidth for the HTML, assuming two articles read per visitor.

This comes down to 11 MB/s of HTML, which can be generated dynamically even with slow interpreted languages, or, as [Thomas] says, would allow for the world's websites to be hosted on a system featuring a single 192-core AMD Zen 5-based server CPU. So what's the added value here? The reduction in latency and, of course, increased redundancy from having the site served from 2-3 locations around the globe. Rather than falling into the trap of edge cloud hosting and the latency of inter-datacenter calls, databases should ideally be located on the same physical hardware and synchronized between datacenters.
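The arithmetic above is easy to check directly (assuming a 30-day month and decimal units):

```python
KB = 1000
TB = 1000**4
SECONDS_PER_MONTH = 30 * 24 * 3600

page_bytes = 75 * KB            # compressed HTML per article
visits_per_month = 200_000_000  # monthly visitors
pages_per_visit = 2             # articles read per visitor

monthly_bytes = page_bytes * visits_per_month * pages_per_visit
print(f"{monthly_bytes / TB:.0f} TB/month")                   # 30 TB/month
print(f"{monthly_bytes / SECONDS_PER_MONTH / 1e6:.1f} MB/s")  # 11.6 MB/s sustained
```

The sustained rate of roughly 11.6 MB/s is an average; real traffic is bursty, but even generous peak factors keep the figure well within what a single modern server can serve.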

In this scenario [Thomas] also sees no need for Docker, scaling solutions and virtualization, massively cutting down on costs and complexity. For those among us who run large websites (in the cloud or not), do you agree or disagree with this notion? Feel free to touch off in the comments.


Google is now authorized to host classified data in the cloud – Nextgov/FCW

Google Public Sector achieved a major milestone Tuesday for its U.S. government customers, announcing Defense Department authorization for its cloud platform to host secret and top secret classified data.

The accreditation instantly makes Google's cloud offering more competitive with rivals Amazon Web Services, Microsoft and Oracle as they vie for billions of dollars' worth of business within the Defense Department and intelligence agencies.

"We're thrilled to announce another significant milestone for Google Public Sector: the authorization of Google Distributed Cloud Hosted to host Top Secret and Secret missions for the U.S. Intelligence Community, and Top Secret missions for the Department of Defense," Leigh Palmer, the company's vice president of delivery and operations, said at the Google Cloud Next conference in Las Vegas. "This authorization underscores Google Public Sector's commitment to empowering government agencies with secure, cutting-edge technology."

Google Distributed Cloud is the company's air-gapped solution built to meet the U.S. government's most stringent security standards. The suite of accredited tools includes capabilities like compute and storage, data analytics, machine learning and artificial intelligence, and does not need to be connected to the public internet to function.

According to Palmer, Google developed its air-gapped cloud with a security-first approach, leveraging zero trust principles, Google best practices and the latest federal guidelines in application and hardware security, cryptography and cybersecurity.

The accreditation represents the culmination of a pivot back to defense work for Google, which in 2018 opted not to continue controversial AI work it was doing under a Pentagon program called Project Maven in part over employee concerns. In 2022, the tech giant formed a new division, Google Public Sector, in part to target a growing government market that spends more than $100 billion on technology each year.

The Defense Department and intelligence agencies represent a significant portion of that spending, and the ability to host secret and top secret government data now allows Google Public Sector to compete for task orders against Amazon Web Services, Microsoft and Oracle on two multi-billion dollar contracts: the Central Intelligence Agency's C2E contract and the Pentagon's Joint Warfighting Cloud Capability contract.

Even before the accreditation, Google Public Sector performed work for the Army, Defense Innovation Unit and Air Force.

"Google Cloud is committed to being a trusted partner and enabling public sector agencies to achieve their goals with the highest levels of security and innovation," Palmer said.


Seekr finds the AI computing power it needs in Intel's cloud – CIO

Intel's cloud gives developers access to thousands of the latest Intel Gaudi AI accelerator and Xeon CPU chips, combined to create a supercomputer optimized for AI workloads, Intel says. It is built on open software, including Intel's oneAPI, to support the benchmarking of large-scale AI deployments.

After it began evaluating cloud providers in December, Seekr ran a series of benchmarking tests before committing to the Intel Developer Cloud and found it resulted in 20% faster AI training and 50% faster AI inference than the metrics the company could achieve on premises with current-generation hardware.

"Ultimately for us, it comes down to, 'Are we getting the latest-generation AI compute, and are we getting it at the right price?'" Clark says. "Building [AI] foundation models at multibillion-parameter scale takes a large amount of compute."

Intel's Gaudi 2 AI accelerator chip has previously received high marks for performance. The Gaudi 2 chip, developed by Habana Labs, which Intel acquired, outperformed Nvidia's A100 80GB GPU in tests run in late 2022 by AI company Hugging Face.

Seekr's collaboration with Intel isn't all about performance, however, says Clark. While Seekr needs cutting-edge AI hardware for some workloads, the cloud model also enables the company to limit its use to just the computing power it needs in the moment, he notes.

"The goal here is not to use the extensive AI compute all of the time," he says. "Training a large foundation model versus inferencing on a smaller, distilled model take different types of compute."


Competition under threat as cloud giants selectively invest in startups, watchdog says – TechRadar

In a recent address at the 72nd Antitrust Law Spring Meeting in Washington DC, UK Competition and Markets Authority (CMA) CEO Sarah Cardell delved into the potential impact of the current AI landscape on competition and consumer protection.

Emphasizing AI's transformative benefits, Cardell implied that tech giants like Amazon, Google, and Microsoft have been selectively investing in specific startups.

Her speech, recorded via speaker's notes, highlighted the need for proactive measures to ensure fair, open, and effective competition in the AI landscape.

Reflecting on the CMA's ongoing scrutiny of the cloud and AI industry, Cardell outlined a series of risks that current practices pose.

Concerns were raised about tech giants controlling critical inputs (such as compute and data) for foundation model development, potentially restricting access for other companies. Such restriction could lead to incumbent firms protecting their existing positions from disruption, which Cardell fears might even lead to market power in other markets beyond AI.

The CMA's CEO also noted that partnerships involving key players in the AI landscape, such as the big three, could reinforce their existing positions of market power and dominance, making it even harder for smaller companies to reach the top.

To address these concerns, the CMA has already committed to enhancing its merger review process to assess the implications of partnerships and arrangements and to monitor current and emerging partnerships more closely, including that of Microsoft and OpenAI.


Finally, the CMA has plans to examine AI accelerator chips and their impact on the foundation model value chain.

As the AI landscape continues to evolve, it's clear that the CMA remains committed to its existing investigations into dominant companies and to encouraging competition.

Read the rest here:
Competition under threat as cloud giants selectively invest in startups, watchdog says - TechRadar