Category Archives: Cloud Servers
I went to paradise to see the future of AI, and I’m more confused than … – The Verge
I'm here to see the future of computing. But at the moment, I'm trying to coax a butterfly onto a nectar-dipped stick.
I feel like I'm bothering the insects, but the monarch butterfly caretakers accompanying us in a 10-foot screened-in box insist that it's okay, so I follow their instructions and keep gently prodding the feet of whichever butterfly is closest, willing it to hold on.
As we each gradually work on securing a butterfly, one of the butterfly experts asks our small group politely how our product launch is going. There's a brief, collective silence. None of us have the energy to explain that it's not our launch; we're just here to cover and analyze it. But rather than explain this deeply boring backstory, someone in our group mercifully pipes up, "It's going great."
After many failed attempts, I finally get one of the little guys to hang on. There's a rush of pride as I turn to the rest of the group and announce, "Look, I got one!" And then there's nothing to do except stand awkwardly, wondering what comes next.
Qualcomm CEO Cristiano Amon presented his keynote speech with all of his usual vigor.
Qualcomm's annual Snapdragon Summit is weird like that. Every year, the company invites a lot of industry partners, analysts, and members of the press to Hawaii to bear witness to its next flagship chip announcement. I'll tell you right now that industry partners, analysts, and members of the press are largely indoorsy people who are wholly unaccustomed to tropical climates.
By the end of day two, I'd sweated through every item of clothing I packed and started doing laundry in my hotel room sink. On the plus side, my room's patio is so hot that my clothes are bone-dry in a few hours. (Per The Verge's ethics policy, we don't accept paid trips. With the exception of a few prearranged group meals, Vox Media paid for my travel, lodging, food, and other expenses.)
Our butterfly encounter is part of a circuit of demo stations designed to show off the capabilities of the company's latest tech. The stations are all positioned outside in the midday tropical sun, and by the time we get to the butterfly area, we are looking generally unwell and quite damp. Qualcomm has done the conscientious thing of incorporating elements of traditional Hawaiian culture into each station alongside its technology demos. Some are loosely connected; we learn the history of slack-key guitar while we try out a new audio switching technology.
Our first stop included a demonstration of how poi, a traditional Hawaiian food staple, is made.
Others don't tie in as neatly, and an hour into the session, I'm not clear on what the monarch butterflies have to do with the next generation of mobile computing, but I'm too hot to care. After a while, our butterfly guides show us how to gently grasp a butterfly by holding its closed wings between two fingers, and we're instructed to take them out of the enclosure and release them en masse as we each make a wish. My mind flips rapidly through about a half-dozen wishes, from thoughts of peace and healing for the people of Maui, where we are visitors, to "I'd like to get out of the sun as quickly as possible."
With our butterflies free, we step over to the tech demo station and see one of the features I've been waiting for: generative photo expansion. It's a feature supported by the Snapdragon 8 Gen 3, the mobile chipset Qualcomm has just announced. You pinch and zoom out of an image and watch as generative AI fills in the borders in a matter of seconds.
The concept is neat; the demo itself is a mixed bag. It handles some preloaded scenes quite well, but when challenged to fill in part of a picture of a face, things don't go so well. Later on, I would see similar results: sometimes it's incredibly impressive, but one time, it adds a disembodied sexy leg alongside a landscape. Other demos throughout the summit are a similar mix of impressive and not-quite-right. A couple of onstage demonstrations of on-device text generation go slightly sideways: what starts with a request to plan a trip from San Diego to Seattle shifts mid-demo to a trip from Maui to Seattle. Impressive, until it isn't.
And that kind of sums up my feelings about the vision of a generative AI future I was shown over the week. The most optimistic scenario is the picture Qualcomm executives painted for me through keynotes and a series of interviews: that on-device AI is truly the next turn in mobile computing. Our phones won't be the annoying little boxes of apps that they've turned into; AI will act as a more natural, accessible interface and a tool for all the things we want our devices to do. We'll regain the mental and emotional overhead we spend every day on tapping little boxes and trying to remember what we were doing in the first place as we get lost in a sea of unscheduled scrolling.
Impressive, until it isn't
AI could also be a real dumpster fire. There's all the potential for misuse that could undo the very fabric of our society: deepfakes, misinformation, you know, the real bad stuff. But the AI we're probably going to encounter the most just seems annoying. One of the demos we're shown features a man talking to a customer service AI chatbot about his wireless plan upgrade options, a totally pleasant exchange that also sounds like a living nightmare. You better believe that AI chatbots are about to start showing up in a lot of places where we're accustomed to talking to a real person, while the barriers to letting you just TALK TO A REPRESENTATIVE grow ever higher.
To someone who isn't constantly immersed in the whirling hot tub that is the consumer tech news cycle, this latest Coming of AI might sound thoroughly unimpressive. Hasn't AI been around for a while now? What about the AI in our phone cameras, our assistants, and ChatGPT? The thing to know, and the thing that Qualcomm takes great pains to emphasize over the course of the week, is that when the AI models run on your device and not in the cloud, it's different.
If you had to wait 15 or 20 seconds for confirmation every time you asked Google to set a timer, you'd never use it again
The two keywords in this round of AI updates are generative and on-device. Your phone has already been using software trained with machine learning to decide which part of your photo is the sky and how blue it should be. This version of AI runs the machine learning models right on your phone and uses them to make something new: a stormy sky instead of a blue one.
Likewise, ChatGPT introduced the world to generative AI, but it runs its massive models in the cloud. Running smaller, condensed models locally allows your device to process requests much faster, and speed is crucial. If you had to wait 15 or 20 seconds for confirmation every time you asked Google to set a timer, you'd never use it again. Cutting out the trip to the cloud means you can reasonably ask AI to do things that often involve several follow-up requests, like generating image options from text again and again. It's private, too, since nothing leaves your phone. Using a tool like Google's current implementation of Magic Editor requires that you upload your image to the cloud first.
Generative AI as a tool has well and truly arrived, but what I'm trying to understand on my trip to the tropics is what it looks like as a tool on your phone. Qualcomm's senior vice president of technology planning, Durga Malladi, provides the most compelling, optimistic pitch for AI on our phones. It can be more personal, for one thing. When I ask for suggested activities for a week in Maui, on-device AI can take into account my preferences and abilities and synthesize that information with data fetched from the cloud.
Beyond that, Malladi sees AI as a tool that can help us take back some of the time and energy we spend getting what we want out of our phones. "A lot of the time you have to think on its behalf, learn how to operate the device." With AI at your disposal, he says, it's the other way around. Big if true!
The advanced speech recognition possible with on-device language models means you can do a lot more by just talking to your phone, and voice is a very natural, accessible user interface. "What AI brings to the table now is a much more intuitive and simple way of communicating what you really need," says Malladi. It can open up mobile computing to those who have been more or less shut out of it in the past.
It's a lovely vision, and to be honest, it's one I'd like to buy into. I'd like to spend less time jumping from app to app when I need to get something done. I'd like to ask my phone questions more complex than "What's the weather today?" and feel confident in the answer I get. Outsourcing the boring stuff we do on our phones day in and day out to AI? That's the dream.
Honor CEO George Zhao presented a glimpse of what on-device AI will look like when it reaches, you know, actual devices.
But as I am reminded often on my trip, Qualcomm is a horizontal solutions provider, meaning it just makes the stuff everyone else builds on top of. Whatever AI is going to look like on our phones is not ultimately up to this company, so later in the week, I sat down with George Zhao, CEO of Honor, to get the phone-maker's perspective. In his view, on-device AI will and should work hand in hand with language models in the cloud. They each have technical limitations: models like ChatGPT's are massive and trained on a wide-ranging data set. Conversely, the smaller AI models that fit on your phone don't need to be an expert on all of humanity; they just need to be an expert on you.
Referencing an example he demonstrated onstage earlier in the day, Zhao said an on-device AI assistant with access to your camera roll can help sort through videos of your child and pick out the right ones for a highlight reel; you don't need to give a cloud server access to every video in your library. After that, the cloud steps in to compile the final video. He also reiterates the privacy advantage of on-device AI, and that its role in our lives won't be to run all over our personal data with wild abandon; it will be a tool at our disposal. "Personal AI should be your assistant to help you manage the future, the AI world," he says.
It's a lovely vision, and I think the reality of AI in the near future lies somewhere between dumpster fire and a new golden age of computing. Or maybe it will be both of those things in small portions, but the bulk of it will land somewhere in the middle. Some of it really will be revolutionary, and some of it will be used for awful things. But mostly it'll be a lot of yelling at chatbots to refill your prescription or book a flight, or asking your assistant to coordinate a night out with friends by liaising with their AI assistants.
It strikes me that the moments I appreciated the most on my trip to Maui weren't in the tech demos or keynotes. They were in the human interactions, many of them unexpected, in the margins of my day. Talking about relentless storms on the Oregon coast with Joseph, my Uber driver. The jokes and in-the-trenches humor shared with my fellow technology journalists. The utter delight and surprise shared with other swimmers as a giant sea turtle cruised by just under the waves. (A real thing that happened!) The alohas and mahalos as I pay for my groceries and order my coffee.
Just a happy monarch butterfly chompin' on a treat.
Sandra, another Uber driver, has printed lists of recommended restaurants and activities in her car. One comes with a tip to "Tell them Sandy sent you," and there's a directive to check under the passenger seat for a notebook with more suggestions. I'd rather walk into a restaurant and say "Sandy sent me" than "My AI personal assistant sent me."
I don't think we're headed for a future where AI replaces all of our cherished human interactions, but I do think a future where we all have a highly personalized tool to curate and filter our experiences holds somewhat fewer of these chance encounters. Qualcomm can set the stage and paint rosy pictures of an inclusive AI future, but that's the job of a tech company organizing an annual pep rally in the tropics to talk about its latest chips. What happens next will likely be messy and at times ugly, and it will be defined by the companies that make the software that runs on those chips.
Qualcomm got the butterfly onto the stick. Now what?
Photography by Allison Johnson / The Verge
Read more here:
I went to paradise to see the future of AI, and I'm more confused than ... - The Verge
Reclaiming Control Through Repatriation for Cloud Optimization – Sify
The role of a capable partner in guiding an organisation through this intricate maze of optimisation cannot be overstated.
The corporate world has been split by cloud computing. While it has undoubtedly generated compelling value propositions for global organisations, it has also produced a number of concerns, and the debate over its merits remains heated: the benefits are undeniable, but so are the challenges.
In this post, we'll explore the concept of cloud repatriation: the process of moving data and applications from the public cloud back to on-premises or private cloud environments. We'll delve into the reasons why organisations are considering this strategy and how it can help them regain control and optimise their operations in the cloud era.
The corporate world finds itself at a crossroads when it comes to cloud computing. Undoubtedly, cloud technology has ushered in an era of unparalleled opportunities for global organisations. The promise of cloud-hosted digital environments, with their unmatched scalability, flexibility, and cost savings, has drawn forward-thinking businesses into its orbit. However, beneath the surface, a host of challenges have emerged, compelling many organisations to explore the concept of cloud repatriation as a means to optimise their operations.
So, what exactly is cloud repatriation, and why is it increasingly considered a worthwhile endeavour? In essence, repatriation is the process of moving data and applications from the public cloud back to an organisation's on-premises data centre, private cloud, or a trusted hosting service provider. It's a strategic decision that isn't taken lightly and is not a one-size-fits-all solution. The ultimate goal is to discover and implement the most optimised architecture that seamlessly aligns with a company's unique business demands and objectives.
Managing cloud costs can be a daunting challenge if not executed efficiently. The Flexera 2023 State of the Cloud Report reveals that a staggering 82% of businesses identify managing cloud costs as their foremost obstacle. This challenge encompasses a web of factors, including data transfer costs, storage expenses, underutilised resources resulting from infrastructure sprawl, and the complexities of maintaining regulatory compliance.
Cloud security is a prominent concern for businesses, with 79% expressing reservations. Repatriating data or applications to on-premises infrastructure offers companies a greater degree of control over their security posture. This control extends to critical aspects like physical security measures, encryption techniques, network configurations, and access restrictions.
Navigating the cloud landscape can be as challenging as finding your way through a new city without a map or local guide. Its no wonder that 78% of businesses admit to struggling with a lack of resources and cloud-related skills.
Vendor lock-in adds another layer of complexity, as businesses become overly reliant on a single cloud provider for their infrastructure, services, or applications. Migrating data and applications becomes challenging, leading businesses to opt for repatriation as a strategy to avoid vendor lock-in.
In today's business environment, data security, compliance with location-specific data laws, and risk mitigation are paramount. Distant cloud environments can compromise data sovereignty and may not adhere to local data protection regulations. Businesses can lose control over how their data is processed and stored in various jurisdictions.
Increasingly, repatriated workloads are finding their rightful place in near-edge or on-premises edge locations. These locations offer benefits such as reduced latency, support for Internet of Things (IoT) use cases, and on-site data processing capabilities for real-time applications.
In light of these challenges, cloud repatriation emerges as a strategic choice for organisations seeking to regain control and optimise their cloud presence. It involves the movement of files and applications from the public cloud to a private cloud, hosting service provider, or an organisations on-site data centre.
As businesses undertake this journey to maximise their cloud presence, they inevitably find themselves evaluating the architecture of their existing security solutions and rethinking their network infrastructure. Thus, having a capable partner to navigate this complex terrain becomes invaluable.
The primary goal of cloud optimization is to utilise cloud computing resources in the most cost-effective and efficient manner possible. Repatriation can take on various forms, including hosted private clouds, multi-tenanted private clouds, and alternative deployment methods.
Recent research conducted by IDC reveals that clients are increasingly drawn to private cloud environments for both existing workloads and new projects born in the cloud, as opposed to public cloud settings. In response to this trend, system providers are now offering unified management platforms with administration, provisioning, and observability features. These platforms provide companies with access to specialised infrastructure that mirrors the user experience offered by public clouds. Projections from this research suggest that by 2024, the percentage of mission-critical applications operating in traditional dedicated data centres will decrease from 30% to 28%, while the percentage of updated versions of similar applications operating in private clouds will rise to 26%.
Modern businesses have the capacity to migrate some operations to the cloud without compromising data security while maintaining others on-site. This strategic flexibility allows them to harness the benefits of both environments.
To determine the optimal cloud optimization strategy, businesses must carefully evaluate their specific needs. However, its important to note that repatriation is a complex process. As organisations navigate this journey to maximise their cloud presence, they will inevitably need to assess the architecture of their existing security solutions and reconfigure their network infrastructure. Therefore, the role of a capable partner in guiding an organisation through this intricate maze cannot be overstated.
See the article here:
Reclaiming Control Through Repatriation for Cloud Optimization - Sify
AI in the Biden Administration's Crosshairs: Summarizing the … – JD Supra
On October 30, 2023, President Biden announced a sweeping new Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (EO). The EO signals an all-hands-on-deck approach, with roles for agencies across the federal government, proposed requirements and/or guidance that will apply both to companies that offer artificial intelligence (AI)-related services and those that consume such services, and still-unfolding implications for the legal operation of such businesses.
Highlights of the EO for providers and consumers of AI products and services follow, with our 10 top takeaways for private sector investors and companies immediately after:
Highlights
Ten Top Takeaways for AI Builders, AI Investors, and AI Users
In sum: keep watching this space. Affected companies should carefully monitor the implementation of this executive order and any follow-on actions by agencies under the EO.
[1] More specifically defined as an AI model that is trained on broad data; generally uses self-supervision; contains at least tens of billions of parameters; is applicable across a wide range of contexts; and that exhibits, or could be easily modified to exhibit, high levels of performance at tasks that pose a serious risk to security, national economic security, national public health or safety, or any combination of those matters [. . .]
[2] In January 2023, NIST released an Artificial Intelligence Risk Management Framework intended to provide a resource to organizations designing, developing, deploying, or using AI systems to manage risks and promote trustworthy and responsible development and use of AI systems. See our previous alert for more details.
[3] As one example, in an effort to slow China's development of advanced AI technologies, the DoC recently issued an array of semiconductor and supercomputer-related export controls. See recent client alerts here and here for a discussion of these export controls. As another, see our recent client alert on proposed outbound investment rules to restrict U.S. support for AI innovation in China here.
[4] Id.
Read more from the original source:
AI in the Biden Administration's Crosshairs: Summarizing the ... - JD Supra
What’s the Difference Between a Web Developer and a Software … – Dice Insights
Web developers and software engineers are popular roles within tech. Is there a lot of overlap between them? If not, what are the key differences? We're going to break down the differences between a web developer and a software engineer, and highlight what makes both roles unique.
In simplest terms, web developers build and maintain websites, web applications, and services. Depending on their interests and specialization, they may focus on the front-end (i.e., what web users see and do), the back-end (i.e., the servers and other components that actually keep websites and services running), or both. In the course of a typical day, this can mean engaging in tasks such as:
According to Lightcast (formerly Emsi Burning Glass), which collects and analyzes millions of job postings from across the country, companies frequently ask for the following skills in web developer job postings:
Those web developers who opt to focus on the front end will generally need to have a grasp of the following:
Meanwhile, those who want to concentrate their efforts on the back end will need to master skills including (but certainly not limited to):
Anyone who wants to become a master at full-stack web development will need to know how to use all of the above skills. That might seem like a lot, but keep in mind that many organizations will opt to hire a full-stack developer over someone who specializes exclusively in the front- or back-end; the difficulty of mastering the core concepts is commensurate with the opportunities out there.
Software engineers have a broad scope of responsibilities, and their daily tasks can vary wildly depending on their respective organizations' goals. In general, software engineering involves:
Lightcast's necessary skills for a software engineer, based on job postings, include:
But that can also depend on the organizations needs; for instance, a software engineer tasked with mobile development will absolutely need to know the programming languages involved in building apps and services for iOS and Android, including Objective-C, Swift, Java, and Kotlin.
Although the technical skills utilized by a software engineer might differ considerably from those needed by a web developer, the professions do share some commonalities. Specifically, both web developers and software engineers need to understand the principles of software design, and they need effective soft skills (such as communication and teamwork) in order to accomplish their goals and work with other stakeholders.
Both web developers and software engineers are in high demand, and those who want to jump from one role to the other will find a lot that's familiar, especially when it comes to using programming languages to build services and apps.
The core difference between web developers and software engineers is obviously focus: web developers work on websites and applications, whereas software engineers can focus on anything from desktop and mobile software to cloud infrastructure. They use different tools and programming languages to achieve their respective ends.
Dice's latest Tech Salary Report suggests software engineers can earn quite a bit, especially with specialization and seniority. For example, a principal software engineer can earn $153,288, while a cloud engineer can pull down $145,416. Back-end software engineers earn slightly lower ($129,150), just ahead of data engineers ($122,811) and systems engineers ($120,800).
Meanwhile, the Tech Salary Report puts the average web developer salary at $87,194, but keep in mind that number can climb far higher with specialization and experience. (The Report also puts the average tech professional salary at $111,348, up 2.3 percent year-over-year.)
See the article here:
What's the Difference Between a Web Developer and a Software ... - Dice Insights
How the UK crime agency repurposed Amazon cloud platform to … – ComputerWeekly.com
The UK's National Crime Agency (NCA) repurposed its cloud-based data analytics platform to help identify threats to life in messages sent by suspected criminals over the encrypted EncroChat phone network.
After placing a software implant on an EncroChat server in Roubaix, investigators from France's digital crime unit infiltrated the encrypted phone network in April 2020, capturing 70 million messages.
The operation, supported by Europol, led to arrests in the Netherlands, Germany, Sweden, France and other countries of criminals involved in drug trafficking, money laundering and firearms offences. More than 1,100 people have been convicted under the NCA's investigation into the French EncroChat data, Operation Venetic, which has led to more than 3,000 arrests across the UK, and more than 2,000 suspects being charged.
UK police have seized nearly six and a half tonnes of cocaine, more than three tonnes of heroin and almost 14 and a half tonnes of cannabis, along with 173 firearms, 3,500 rounds of ammunition and £80m in cash from organised crime groups.
Europol supplied British investigators with overnight downloads of data gathered from phones identified as being in the UK, through Europol's Large File Exchange, part of its Siena secure computer network.
With an estimated 9,000 UK-based EncroChat users, the NCA needed to quickly process a large volume of potentially incriminating data, so tasked its National Cyber Crime Unit (NCCU) with categorising it for human investigators to analyse. To automate the preprocessing of data once it had received the EncroChat material, NCCU staff added pre-built capabilities from Amazon Web Services (AWS) to its cloud data platform, including machine learning software with the capability to extract text, handwriting and data from EncroChat text messages and photographs.
"For us, it's about preventing harm and protecting the public," said an NCCU spokesperson, quoted in a technology company case study. "We had a flood of unstructured data and had to operate swiftly to reduce harm to the public. Our data scientists could probably have devised ways of analysing this data themselves. But when we have more than 200 threats to life, we can't afford to spend time doing that. Using off-the-shelf services from AWS enabled us to go from a standing start to a full capability in the space of hours. If we were to build it ourselves from scratch, that might have taken over a month of effort."
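The NCA has not named the specific AWS services it used, but the capability it describes, pulling text and handwriting out of messages and photographs, matches managed services such as Amazon Textract. As a rough, hedged sketch only, calling such a service through the boto3 SDK might look like the following; the region and file name are placeholders, not details from the investigation.

```python
# Illustrative sketch only: extracting printed or handwritten text from an image
# with Amazon Textract via boto3. The NCA has not confirmed which AWS service it
# actually used; region and file name below are placeholders.
import boto3

textract = boto3.client("textract", region_name="eu-west-2")

def extract_lines(image_bytes: bytes) -> list[str]:
    """Return each line of text Textract detects in the supplied image."""
    response = textract.detect_document_text(Document={"Bytes": image_bytes})
    return [block["Text"] for block in response["Blocks"] if block["BlockType"] == "LINE"]

with open("example_photo.jpg", "rb") as f:
    for line in extract_lines(f.read()):
        print(line)
```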
The NCCU was able to scale up its existing data analysis platform from tens of users in the NCA to 300 within two weeks of being informed of the EncroChat investigation.
Once the historic messages extracted from EncroChat's in-phone database, called Realm, and live text messages sent from thousands of phones were processed, the NCA sent intelligence packages in the form of CSV files to Regional Organised Crime Units; the Police Service of Northern Ireland; Police Scotland; the Metropolitan Police; Border Force; the Prison Service; and HM Revenue & Customs.
These organisations were then responsible for analysing the data for further indications of threats to life, the drugs trade and other criminal activity.
The NCCU had been developing a cloud-based platform to analyse data for over three years before the EncroChat operation. Digital transformation consultancy Contino won the contract to build the platform on AWS.
By shifting from its on-premises infrastructure to the cloud, the NCCU said it has been able to spend more time on investigations, and less time on procuring and maintaining hardware and managing IT infrastructure.
"Previously, we had on-premises infrastructure, which required a lot of management and prevented us from doing the data science we wanted to do," said an NCCU spokesperson. "Our small tech team spent a considerable amount of time building and managing infrastructure.
"This was a problem, because our recruitment and retention are based on providing people with engaging and challenging work fighting cyber crime, not administering IT."
Within a year of beginning its pilot of the analytics platform, which used services including Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Relational Database Service (Amazon RDS), the NCCU introduced more advanced data processing capabilities.
This included the Amazon EMR big data platform, which helps scale and automate data processing, and AWS Glue, a serverless data integration service that can combine and organise data from a wide range of sources.
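Neither the NCA nor AWS has published the platform's code, but as a loose illustration of what a serverless Glue job of this kind looks like in practice, the sketch below reads a hypothetical Data Catalog table of ingested messages, filters it, and writes CSV output. Every database, table and bucket name here is invented for the example rather than taken from the NCCU's system.

```python
# Hypothetical AWS Glue job sketch; all names are illustrative, not the NCCU's.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw message records registered in a (hypothetical) Glue Data Catalog table.
messages = glue_context.create_dynamic_frame.from_catalog(
    database="message_ingest", table_name="daily_download"
)

# Keep only rows an upstream step has categorised as needing urgent review,
# then write them out as CSV intelligence packages for human analysts.
flagged = messages.filter(lambda row: row["category"] == "threat_to_life")
glue_context.write_dynamic_frame.from_options(
    frame=flagged,
    connection_type="s3",
    connection_options={"path": "s3://example-intel-packages/threat-to-life/"},
    format="csv",
)

job.commit()
```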
As a law enforcement agency that handles sensitive and therefore potentially harmful data, the NCA and NCCU also needed the platform to be secure, so used Amazon GuardDuty to monitor network activity to shield it from malicious activity.
"Moving data outside of our perimeter is not a decision we take lightly," said an NCCU spokesperson. "The transparency of AWS, its shared security model, and the access we had to documentation and experts assisted us on that journey considerably."
At the start of May 2021, the Netherlands Forensic Institute (NFI) announced that its forensic big data analysis (FBDA) team had similarly modified a computer model it had previously developed to scan for drug-related messages sent between suspected criminals in large volumes of communications data, as part of a research and development project.
The NFI told Computer Weekly at the time that the drug-talk software was developed in-house before being modified for threat-to-life detection and passed on to the police.
Using deep learning techniques, the FBDA team initially trained the models neural network in generic language comprehension by having it read webpages and newspaper articles, before introducing it to the messages of suspected criminals, so it could learn how they communicate.
"The team then began using similar techniques to develop a model to recognise life-threatening messages," said the NFI in a statement. "That model was ready when the chats from EncroChat poured into the police in Driebergen on 1 April."
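The NFI has not released its model, but the general pattern it describes, taking a language model pretrained on web and news text and then fine-tuning it to flag messages of interest, is a standard text classification workflow. A minimal sketch using the open source Hugging Face libraries is below; the base model, labels and example messages are all illustrative assumptions, not anything the NFI has said it used.

```python
# Illustrative fine-tuning sketch; model choice, labels and data are hypothetical.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
from datasets import Dataset

# Start from a model that already "knows" general language from web/news text.
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=2  # 0 = benign, 1 = possible threat to life
)

# Tiny made-up labelled set; in practice investigators would supply real labelled messages.
examples = Dataset.from_dict({
    "text": ["see you at the cafe at 8", "he talks to police, deal with him tonight"],
    "label": [0, 1],
})
tokenized = examples.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="threat-model", num_train_epochs=3),
    train_dataset=tokenized,
)
trainer.train()
```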
Continue reading here:
How the UK crime agency repurposed Amazon cloud platform to ... - ComputerWeekly.com
Data Management Workloads Drive Spending on Compute and … – IDC
NEEDHAM, Mass., November 2, 2023 – Structured Databases/Data Management workloads are driving the largest share of enterprise IT infrastructure spending in the first half of 2023 (1H23), according to the International Data Corporation (IDC) Worldwide Semiannual Enterprise Infrastructure Tracker: Workloads. Organizations spent $6.4 billion on compute and storage hardware infrastructure to support this workload category in 1H23, which represents 8.5% of the market total.
Despite the high level of spending, Structured Databases/Data Management wasn't among the fastest-growing workloads, with just 1.1% annual growth. Industry Specific Business Applications saw growth of 33.3% in value compared to 1H22. HR/Human Capital Management (HCM), Business Intelligence/Data Analytics, and Development Tools and Applications workloads experienced double-digit year-over-year growth in hardware infrastructure demand, with spending growing at 28.5%, 10.4%, and 10.3% respectively. However, only Business Intelligence/Data Analytics ranks in the top 5 workloads by hardware spending, while the other two workloads (HR/Human Capital Management (HCM) and Development Tools and Applications) rank 15th and 10th in spending.
Workload spending profiles vary among product categories. For ODM Direct, the highest spending in 1H23 was concentrated on Digital Services, with $2.6 billion representing 11.7% of ODM spending. In comparison, OEM Servers and OEM Storage spending is led by Structured Databases/Data Management.
Workload priorities vary across regions as well, with Asia/Pacific's spending for AI Lifecycle in 1H23 at $2.0 billion, just behind Structured Databases/Data Management. Infrastructure spending was the second-largest workload category in Europe, the Middle East and Africa (EMEA) during 1H23, at $0.9 billion. In the Americas, the largest workload in 1H23 was Digital Services at $2.8 billion, with ODMs representing 46% of total infrastructure spending in the region for the first half of the year.
As enterprise workloads continue to move into public cloud, investments in shared infrastructure (a hardware base for delivering public cloud services) will be increasing faster than investments in dedicated infrastructure across all workloads. Spending for workloads in cloud and shared infrastructure environments will grow at a compound annual growth rate (CAGR) of 11.6% over the next five years with Digital Services and AI Lifecycle spending leading the way. IDC predicts spending for Digital Services will reach $13.4 billion in 2027 and AI Lifecycle $8.1 billion with five-year CAGRs of 15%. Infrastructure spending for cloud and dedicated environments will grow at a 10.7% CAGR over the next five years, with AI Lifecycle being the fastest growing workload with a five-year CAGR of 16.3%. By 2027, IDC expects AI Lifecycle to be the second largest workloads category in terms of spending at $3.9 billion.
Over the next five years, IDC forecasts growth in compute and storage systems spending for cloud-native workloads to be almost twice as high as that of infrastructure supporting traditional workloads (12.2% vs 6.2% CAGR) although traditional workloads will continue to account for the majority of spending during the forecast period (71% in 2027).
Spending for workloads in non-cloud infrastructure environments will grow at a 1.7% CAGR over the next five years with Text and Media Analytics and AI Lifecycle as the fastest growing workloads with five-year CAGRs of 9% and 6.1% respectively. Structured Database/Data Management, Content Applications, and Business Intelligence/Data Analytics workloads combined will account for 24% of spending in 2027 while Text and Media Analytics and AI Lifecycle combined will only account for 11.3% of spending in the same year.
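As a quick illustration of how the compound annual growth rates (CAGRs) quoted above translate into dollar figures, the short sketch below projects a value forward at a fixed rate and recovers the implied rate from two endpoints. The $6.7 billion starting figure is hypothetical, chosen only so that five years of 15% growth lands near IDC's $13.4 billion Digital Services projection; it is not a number from the Tracker.

```python
# Generic CAGR helpers; the starting value is hypothetical and used only to show
# how a 15% five-year CAGR compounds to roughly $13.4 billion.
def project(start: float, rate: float, years: int) -> float:
    """Project a value forward at a constant compound annual growth rate."""
    return start * (1 + rate) ** years

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start value, end value, and horizon."""
    return (end / start) ** (1 / years) - 1

print(round(project(6.7, 0.15, 5), 1))   # ~13.5 (billions of dollars)
print(round(cagr(6.7, 13.4, 5), 3))      # ~0.149, i.e. roughly 15%
```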
Taxonomy Notes
IDC estimates spending on compute and storage systems across 19 mutually exclusive workloads, defined as applications and their datasets. The full taxonomy including definitions of the workloads can be found in IDC's Worldwide Semiannual Enterprise Infrastructure Tracker: Workloads Taxonomy, 2023 (IDC #US51045423). The majority of workloads map to secondary or functional software markets while several, including Content Delivery and Digital Services, have no equivalent in the software market structure. Workloads are further consolidated into seven workload categories, which include: Application Development & Testing, Business Applications, Data Management, Digital Services, Email/Collaborative & Content Applications, Infrastructure, and Technical Applications.
IDC's Worldwide Semiannual Enterprise Infrastructure Tracker: Workloads provides insight into how enterprise workloads are deployed and consumed in different areas of the enterprise infrastructure hardware market and what the projections are for future deployments. Workload trends are presented by region and infrastructure platform and shared for the enterprise infrastructure hardware market with a five-year forecast. This Tracker is part of the Worldwide Quarterly Enterprise Infrastructure Tracker, which provides a holistic total addressable market view of the four key enabling infrastructure technologies for the datacenter (servers, external enterprise storage systems, and purpose-built appliances: HCI and PBBA).
For more information about IDC's Semiannual Enterprise Infrastructure Tracker: Workloads, please contact Lidice Fernandez at lfernandez@idc.com.
About IDC Trackers
IDC Tracker products provide accurate and timely market size, vendor share, and forecasts for hundreds of technology markets from more than 100 countries around the globe. Using proprietary tools and research processes, IDC's Trackers are updated on a semiannual, quarterly, and monthly basis. Tracker results are delivered to clients in user-friendly Excel deliverables and on-line query tools.
Click here to learn about IDC's full suite of data products and how you can leverage them to grow your business.
About IDC
International Data Corporation (IDC) is the premier global provider of market intelligence, advisory services, and events for the information technology, telecommunications, and consumer technology markets. With more than 1,300 analysts worldwide, IDC offers global, regional, and local expertise on technology, IT benchmarking and sourcing, and industry opportunities and trends in over 110 countries. IDC's analysis and insight helps IT professionals, business executives, and the investment community to make fact-based technology decisions and to achieve their key business objectives. Founded in 1964, IDC is a wholly owned subsidiary of International Data Group (IDG), the world's leading tech media, data, and marketing services company. To learn more about IDC, please visit http://www.idc.com. Follow IDC on Twitter at @IDC and LinkedIn. Subscribe to the IDC Blog for industry news and insights.
All product and company names may be trademarks or registered trademarks of their respective holders.
"); tb_show("Share the image", "#TB_inline?height=200&width=600&inlineId=embedDialog", null); } function calculateContainerHeight(attachmentId) { var img = $("img[src*=" + attachmentId + "]"); if (img === undefined) { return 600; } else { img = img[0]; } var iframeHeight; if (img.naturalWidth < 600) { iframeHeight = img.naturalHeight + 100; } else { iframeHeight = (img.naturalHeight / (img.naturalWidth / 600)) + 100; } return Math.ceil(iframeHeight) + 10; } function copyEmbedCode() { $("#embedCodeArea").select(); document.execCommand('copy'); $("#copyButton").val("Copied"); setTimeout(function() { $("#copyButton").val("Copy"); }, 2000); } $(".icn-wrapper a.embed-image-button").click(function(e) { e.preventDefault(); });
More here:
Data Management Workloads Drive Spending on Compute and ... - IDC
App server market size expected to reach $52B by 2030 – App Developer Magazine
The platform called an application server is typically used for cloud-based apps for mobile devices like tablets and smartphones. While allowing the right to access and the functionality of the corporate application, the application server serves as the host for the user's business logic.
The banking, financial services, and insurance (BFSI) industry is always dealing with new regulations, cybersecurity concerns, an increase in the volume of data being generated, and an increase in the number of transactions. Secure, scalable solutions that allow for effective service delivery are constantly needed in the BFSI sector. Additionally, to keep up with technological advancements and have up-to-date security features that guard against continually growing cyber threats, these apps must be updated at fairly regular intervals. Application servers provide tools for building online applications, middleware services for security and upkeep, and access to network data.
Get a Free Sample Copy of This Report at https://straitsresearch.com/report/application-server-market
Rapid advancements in wireless networks and mobile device technology have created a wealth of possibilities for applying application servers in previously untapped fields like information dissemination and the digitization of public services. Additionally, e-commerce and mobile commerce are becoming more and more popular, which is encouraging for the market's expansion over the projection period. As workload and accompanying complexity have increased, government agencies have been utilizing technology more and more. Government agencies must also continually assess the functionality and effectiveness of their current applications and, where necessary or in order to keep up with the difficulties posed by technology, transition to new, better-suited platforms.
With a revenue share of more than 40%, North America dominated the application server market in 2020. North America is renowned for its early adoption of cutting-edge technologies and for having a high mobile device penetration rate. When it comes to early adoption of the newest and most innovative technology, the United States and Canada are regarded as having extremely mature marketplaces. As a result, North American-based companies use application servers at a significant pace.
The fastest-growing regional market is anticipated to be in the Asia Pacific. The Asia Pacific is renowned for the expansion of high-speed wireless internet networks, the proliferation of smartphones, and the exponential growth of the manufacturing and e-commerce sectors. Some of the additional important aspects that are anticipated to support the expansion of the regional market are the rising number of software and IT service providers, the rise in service-based startups, and the aggressive adoption of cutting-edge technologies, notably in China and India.
Microsoft Corporation, International Business Machines Corporation, Oracle Corporation, Red Hat, Inc., and TIBCO Software Inc. are some of the major companies active in the sector. To gain a larger portion of the market, industry companies are investing heavily in R&D projects and streamlining their internal operations. To enhance and broaden their current product and service portfolios, they are also constantly engaged in new product development. To develop technologically innovative products and acquire a competitive edge in the market, they are also placing a major emphasis on mergers and acquisitions and strategic partnerships. Microsoft Corporation, International Business Machines Corp., Oracle Corporation, Red Hat, Inc., TIBCO Software Inc., The Apache Software Foundation, FUJITSU, VMware, Inc., NEC Corporation, and SAP SE are a few of the well-known companies active in the worldwide application server industry.
See the original post:
App server market size expected to reach $52B by 2030 - App Developer Magazine
From groundfrost to cloud, Cohesity puts SmartFiles on Snowflake – ComputerWeekly.com
Data management matters, whether the processes involved in locking it down are executed in the permafrost (on-premises) or in the frozen mists of the troposphere (cloud) as they circle our data universe.
Given the (some would say meteoric, or at least healthy) rise of Snowflake as the self-styled Data Cloud company (a product term that it has officially branded), we have seen an increasing number of vendors align their own products, protocols, tools and trinkets to the Snowflake platform.
Artificial Intelligence (AI)-powered data management company Cohesity is the latest vendor to get some extra Snowflake proximity.
The company has this month come forward with Cohesity SmartFiles on the Snowflake Data Cloud.
As TechTarget's Johnny Yu explained at the time of its launch in 2019: "By migrating files from NAS systems, Cohesity SmartFiles can free space on the files' original servers. The files are deduplicated on the Cohesity DataPlatform, where users can run global searches. Applications from Cohesity Marketplace for tasks such as antivirus and anomalous access detection run directly on the platform."
This new Snowflake integration is intended to help organisations to draw down analytics from their on-premises and cloud data while maintaining data sovereignty and compliance requirements.
"Snowflake recognises the critical importance of providing customers with advanced data security and management while mining their data for strategic insights," said Kit Beall, chief revenue officer, Cohesity. "As a leader in AI-powered enterprise data security and management, we seek partners equally dedicated to the secure storage and management of customer data. That is why we are delighted to partner with Snowflake to continue delivering innovative and secure solutions that our customers can confidently rely on."
By using the Snowflake Data Cloud, Cohesity (somewhat grandly) says it is joining Snowflake in mobilising the world's data to help organisations reap the benefits of their analytics capabilities without having to move their data to the cloud for analysis.
With Cohesity SmartFiles, joint customers can store their data locally in SmartFiles and use Snowflake's analytics capabilities with the flexibility to keep data either on-premises or in the cloud.
This integration provides users with broader access and choice while allowing them to adhere to strict internal policies.
Cohesity SmartFiles augments customers' cloud-native Snowflake Data Cloud to include on-premises repositories and extends secure access to sensitive local data records. Cohesity SmartFiles also provides a secure platform for consolidating application data that is designed to improve storage efficiency and reduce the overall cost of ownership for local Snowflake repositories.
See more here:
From groundfrost to cloud, Cohesity puts SmartFiles on Snowflake - ComputerWeekly.com
Hewlett Packard’s (HPE) GreenLake to Replace iSKi’s Server – Yahoo Finance
Hewlett Packard Enterprise HPE has announced that its new on-demand cloud service, HPE GreenLake for Private Cloud Business Edition, has been adopted by the Istanbul Water and Sewerage Administration (iSKi) to replace iSKi's existing server and storage systems.
The decision to install HPE GreenLake came at a time when iSKi's previous storage infrastructure was nearing its limit and required an upgrade to prevent unexpected disruptions. HPE GreenLake will support wastewater management and water distribution spanning 20,000 kilometers.
Additionally, iSKi is in the process of developing a new mobile application, enhancing its billing system, establishing a disaster recovery site and leveraging a range of Microsoft solutions, such as Active Directory and Exchange. These initiatives will benefit from the increased computational capabilities provided by HPE GreenLake.
Hewlett Packard Enterprise Company Price and Consensus
Hewlett Packard Enterprise Company price-consensus-chart | Hewlett Packard Enterprise Company Quote
Collaboratively, the two entities have undertaken the task of centralizing and streamlining the storage infrastructure, unifying a total of 33 file servers and storage systems located at remote sites. Furthermore, iSKi has enhanced its application performance with the advanced artificial intelligence capabilities of HPE InfoSight in the realm of IT operations, enabling the prediction and prevention of potential issues. As a result, iSKi is set to establish a robust and continuous operating system, offering an impressive 99.9999% availability guarantee.
HPE GreenLake for Private Cloud Business Edition also comes with disaggregated hyperconverged infrastructure (dHCI) that doesn't require a system-wide upgrade when one of its components reaches its limits. The dHCI addresses shortcomings by scaling the specific components as needed and decoupling the compute, storage and networking resources. With these advanced features, iSKi will be able to ensure uncompromising levels of security and availability of water supply that is critical.
The iSKi is among the extensive list of customers who have adopted HPE GreenLake to optimize their operations. This year has also witnessed the adoption of GreenLake by notable organizations, such as Dubai Islamic Bank, Wihuri Group, Inedys, Fastweb (an Italian telecom operator), Ashok Leyland, DSolution (a Swedish service provider) and Toppan Forms.
HPE is making strides in its Cloud segment with HPE GreenLake, which is strengthening the company's other service pivots. In its third-quarter fiscal 2023 earnings report, the company highlighted that GreenLake saw a remarkable 122% year-over-year increase in orders. Hewlett Packard is forming a Hybrid Cloud Business unit, merging HPE GreenLake with HPE Storage, GreenLake Cloud Services and the CTO team. This will fast-track its hybrid cloud strategy, offering a unified portfolio of storage, software, data and cloud services through HPE GreenLake.
Currently, HPE carries a Zacks Rank #2 (Buy). Shares of the company have returned 7.5% year to date.
Some other top-ranked stocks from the broader technology sector are NVIDIA NVDA, NetEase NTES and Dell Technologies DELL, each sporting a Zacks Rank #1 (Strong Buy) at present. You can see the complete list of today's Zacks #1 Rank stocks here.
The Zacks Consensus Estimate for NVDA's third-quarter fiscal 2024 earnings has been revised by 2 cents northward to $3.34 per share in the past 30 days. For fiscal 2024, earnings estimates have increased by 7 cents to $10.74 in the past 30 days.
NVIDIA's earnings beat the Zacks Consensus Estimate in three of the trailing four quarters, while missing the same on one occasion, the average surprise being 9.8%. Shares of NVDA have rallied 175.9% year to date.
The Zacks Consensus Estimate for NetEase's third-quarter 2024 earnings has been revised downward by a penny to $1.56 per share in the past 30 days. For fiscal 2024, earnings estimates have increased by 35 cents to $6.54 per share in the past 90 days.
NTES' earnings beat the Zacks Consensus Estimate in three of the trailing four quarters, while missing the same on one occasion, the average surprise being 24.54%. Shares of NTES have surged 44.7% year to date.
The Zacks Consensus Estimate for DELL's third-quarter 2024 earnings has been revised upward by 11 cents to $1.47 per share in the past 60 days. For fiscal 2024, earnings estimates have increased by 3 cents to $6.33 per share in the past 30 days.
Dell's earnings beat the Zacks Consensus Estimate in each of the preceding four quarters, the average surprise being 39.52%. Shares of DELL have climbed 60.4% year to date.
Zacks Investment Research
Read the original:
Hewlett Packard's (HPE) GreenLake to Replace iSKi's Server - Yahoo Finance
Nexthink Commits to reach a Net-Zero Standard by 2050 – Yahoo Finance
The company will work with the Science Based Targets initiative to reduce emissions by 90% by 2050
BOSTON & LAUSANNE, Switzerland, October 30, 2023--(BUSINESS WIRE)--Nexthink today announced it has signed the commitment letter to the Science Based Targets initiative (SBTi) to achieve a Net-Zero Standard by 2050. The company is aiming to reduce greenhouse gas emissions by at least 42% by 2030, and 90% by 2050. Nexthink's targets are designed to meet those of the Paris Agreement - limiting global warming to 1.5°C above pre-industrial levels - and the company will work with the SBTi to validate this trajectory.
"End-user computing accounts for 60% of all IT emissions in the enterprise and we believe our commitment to Net-Zero should lead to bigger efforts from companies in the space to improve sustainability. Our customers, partners, shareholders, and employees are asking for it, and there is no more important mission than a cleaner and better future for the planet."
Nexthink's Net-Zero commitment is the next step in a continued investment in environmental sustainability. Since 2021, the company has tracked its carbon footprint over its entire operations and has taken measures to reduce its emissions and energy consumption internally. As part of this initiative, Nexthink will:
Reduce the carbon impact per hour of cloud servers used by 50% by 2030 by selecting energy efficient instances and optimizing their usage
Work on its IT assets to improve device lifetime, and reduce the unitary impact of IT devices by 20% by 2030
Implement a company-wide energy efficiency plan that includes improving insulation of the offices and drastically reducing the footprint of air-conditioning gas leaks
Change the global travel policy to manage the emissions linked to professional travels and commuting
Additionally, earlier this year Nexthink announced enhancements to its Sustainable IT Solution, which is aimed at providing vital insights to help customers embed sustainability into the core of their IT strategy. The company was also awarded, for the second year in a row, an EcoVadis Silver Medal and was among the top 8% of companies assessed in the software industry.
The SBTi is a partnership between CDP, the United Nations Global Compact, World Resources Institute (WRI) and the World Wide Fund for Nature (WWF). The SBTi call to action is one of the We Mean Business Coalition commitments and is aimed at driving ambitious climate action in the private sector by enabling organizations to set science-based emissions reduction targets.
About Nexthink
Nexthink is the leader in digital employee experience management software. The company provides IT leaders with unprecedented insight allowing them to see, diagnose and fix at scale issues impacting employees anywhere, with any application or network, before employees notice the issue. As the first solution to allow IT to progress from reactive problem solving to proactive optimization, Nexthink enables its more than 1,200 customers to provide better digital experiences to more than 15 million employees. Dual headquartered in Lausanne, Switzerland and Boston, Massachusetts, Nexthink has 9 offices worldwide.
View source version on businesswire.com: https://www.businesswire.com/news/home/20231030003480/en/
Contacts
For further information, please contact: press@nexthink.com
Go here to see the original:
Nexthink Commits to reach a Net-Zero Standard by 2050 - Yahoo Finance