Category Archives: Cloud Servers

AWS and Azure own lion’s share of $120B cloud infrastructure market – The Register

The global market for cloud infrastructure services grew by 30 percent last year, exceeding $100 billion in value for the first time, and two lions account for nearly two-thirds of that entire spend.

These latest figures come courtesy of number cruncher Gartner and cover just the infrastructure-as-a-service (IaaS) side of the public cloud comprising servers, storage and network resources that organizations are paying to access.

According to Gartner, the worldwide infrastructure services market grew by 29.7 percent to hit a total of $120.3 billion during 2022, up from $92.8 billion the previous year.

This is despite the growth in cloud services showing signs of slowing during the second half of the year as businesses started to rein in spending, and compares with upwards of 40 percent growth seen during 2021.

The figures confirm that the IaaS market is locked up by Amazon Web Services and Microsoft's Azure cloud platform, which account for 40 percent and 21.5 percent of the entire global market respectively. That translates to over $48 billion in revenue for AWS, and nearly $26 billion for Azure.

Compared to those giants, China's Alibaba cloud came in a distant third at $9.28 billion, just beating Google Cloud into fourth place, which had revenue of just over $9 billion. Huawei snuck into the top five global IaaS providers with $5.25 billion in revenue and 4.4 percent market share.

Altogether, those top five providers made up over 80 percent of the infrastructure services market, with all the other players making up the remaining 18.9 percent, some $22.7 billion in revenue.
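
As a quick sanity check, the shares and revenue figures quoted above are mutually consistent; the short Python sketch below reproduces them (the Alibaba and Google shares are derived here from their stated revenues, since the article quotes those as dollar figures):

```python
# Cross-check of the Gartner figures quoted above (dollar values in billions).
TOTAL_MARKET = 120.3  # worldwide IaaS revenue, 2022

shares = {
    "AWS": 0.400,                       # stated share
    "Azure": 0.215,                     # stated share
    "Alibaba": 9.28 / TOTAL_MARKET,     # derived from stated revenue
    "Google Cloud": 9.0 / TOTAL_MARKET, # derived from stated revenue
    "Huawei": 0.044,                    # stated share
}

for vendor, share in shares.items():
    print(f"{vendor:>12}: {share:6.1%} = ${share * TOTAL_MARKET:5.1f}B")

top_five = sum(shares.values())
print(f"Top five: {top_five:.1%}; everyone else: "
      f"{1 - top_five:.1%} = ${(1 - top_five) * TOTAL_MARKET:.1f}B")
```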

Gartner VP analyst Sid Nag said that IaaS growth during 2022 had been stronger than expected despite the slowdown later in the year "as customers focused on using their previously committed capacity to its fullest potential."

"This is expected to continue until mid-2023 and is a natural outcome of the market's maturity. We expect an acceleration in 2024, as there is still room for plenty of additional future growth," Nag claimed.

Nag said Alibaba continued to rule the IaaS market in China, but limited potential for expansion outside China has slowed its growth, leading to the company's recent decision to spin off the Alibaba Cloud business into a separate entity.

Google achieved the highest growth rate of the top five IaaS vendors, growing 41 percent in 2022, attributed to its investment in sovereign cloud efforts and expanded partner programs that helped to broaden its customer base.

Nag also predicted that generative AI would drive further cloud market expansion.

"As enterprises integrate generative AI into their technology portfolio, new markets and opportunities for cloud hyperscalers will emerge related to sovereignty, ethics, privacy and sustainability," Nag said.

Recent data from Synergy Research Group indicates that hyperscalers accounted for 37 percent of the worldwide capacity of all data centers, with non-hyperscale colocation facilities amounting to 23 percent and on-premise data centers some 40 percent.

"This is in stark contrast to five years ago, when almost 60% of data center capacity was in on-premise facilities," the group said.

The total capacity of all datacenters will continue to rise steadily, the company predicts, driven primarily by hyperscale capacity almost doubling over the next five years. While the on-premises share of the total will drop by about 2 percent per year, the actual capacity of on-premises datacenters will decrease only marginally.

The spending picture shows the difference even more starkly: 10 years ago, enterprises were spending over $80 billion per year on IT hardware and software for their own datacenters, with less than $10 billion going on cloud infrastructure services.

Fast forward to the present day, Synergy says, and spending on datacenter hardware and software has grown by an average of just 2 percent per year, while spending on cloud services has ballooned, growing by an average 42 percent per year.

Read the original:
AWS and Azure own lion's share of $120B cloud infrastructure market - The Register

Anviz Reveals IntelliSight, a Cloud-Based Distributed Video … – PR Newswire

UNION CITY, Calif., July 17, 2023 /PRNewswire/ -- Anviz, a leading provider of professional and converged intelligent security solutions, recently announced the launch of IntelliSight, its latest video surveillance offering that harnesses the power of distributed cloud and 4G technology to create an all-in-one security solution that delivers unmatched versatility, security, and data analytical capabilities. Users can now enjoy one year of free cloud storage (7-day event-based video retention).

The Anviz IntelliSight cloud video surveillance management solution combines Anviz's proprietary cloud-based distributed video surveillance management platform with its iCam series artificial intelligence (AI) cameras to provide customers with a comprehensive and flexible surveillance performance. Equipped with best-in-class video analytics and classification, the solution is an ideal option for small and medium-sized companies across various industries, including logistics, education, healthcare, and retail.

"The birth of the IntelliSight solution marks a milestone in our years of effort to provide reliable and versatile security systems for global users," said Mike, Product Manager of IntelliSight. "We believe the solution, whose development is built on our industry-leading security solutions that have proven successful in the global market, will fill the market gap where customers are seeking an all-in-one cloud-based video surveillance system that can safeguard their properties without adding unnecessary costs to their budgets."

Built for Simple Deployment and Scalability

The IntelliSight solution greatly benefits small and medium-sized business installations by eliminating the redundant, complex on-site hardware traditionally used to set up a CCTV system, streamlining deployment steps, and keeping costs to a minimum for users. Customers can simply connect the cameras directly to the internet for effortless, fail-safe surveillance, and the solution also allows them to easily scale their surveillance system without the extra work of installing and debugging storage and network devices.

Immediate Access from Mobile Devices

IntelliSight's cloud-based architecture means that users have the freedom to easily access the surveillance system from anywhere, at any time. Via the Internet and the efficient P2P transmission protocol developed by Anviz, users can view real-time video and manage devices from home or the office without restriction. A purpose-built mobile app allows for effortless remote access and on-the-go control, keeping users connected 24/7 and giving them peace of mind with greater convenience and easier operability.

Expanded Storage with Cloud Data Backup

IntelliSight enables users to store important event footage securely in cloud servers that provide expandable and flexible storage options, removing the need for extra hardware installation for media data. Additionally, IntelliSight's cloud-based storage reduces the risk of data loss in case of local device failures, with features such as data redundancy and disaster recovery offering an added guarantee for data security.

Advanced Video Analytics Powered by AI

Leveraging the state-of-the-art AI capabilities of Anviz surveillance cameras, the IntelliSight system can deliver advanced data analysis functionality to greatly enhance the efficiency and accuracy of security systems. Its smart features can identify suspicious activity, categorize objects, and provide critical, timely information that enables users to quickly identify and respond to potential risks, streamlining their security operations while providing all-around protection for their assets.

"One of the biggest differentiators that sets Anvizapart from its competitors is its product's technological and architectural advantages, which also allows us to pioneer the development of a new generation of security systems powered by AIoT and cloud technology. Having been encouraged by the early adopters of the IntelliSight solution who have said it has exceeded their expectations in terms of costs, quality, and simplicity, we hope this solution will pave the way for our entry into the North American market, which is another springboard towards the global $30 billion surveillance system market,"Mike added.

Contact:
Title: Marketing Specialist
Name: Nic Wang
Phone: +86 15541141093
Email: [emailprotected]com

SOURCE Anviz Global

See the original post:
Anviz Reveals IntelliSight, a Cloud-Based Distributed Video ... - PR Newswire

5 benefits cloud BI has over on-premises options – TechTarget

Cloud BI offers far more capacity, capabilities and features than its on-premises counterpart of the past. Organizations can use cloud BI to give a wider range of users self-service BI features that help them achieve insights and make decisions faster.

BI software remains a critical part of a modern organization's IT portfolio, even as the use of artificial intelligence and other analytics capabilities takes off. Cloud computing is influencing the sophistication and power of BI tools that vendors offer.

"You're pushing the power to the end user without having IT do the work and manage it," said Kevin Martelli, principal of technology at professional services firm KPMG.

The term cloud BI is a marketing label according to some analysts, but they and other experts agree that BI delivered as SaaS has significant advantages over on-premises BI software.

With cloud BI, the cloud provider takes over the tasks of owning and maintaining the physical servers that run the BI software, work that previously fell to in-house IT teams. The move to cloud BI also means that IT teams can offload implementing, maintaining and updating the software. This leaves them more time to focus on IT activities that provide critical value and help differentiate the organization from competitors.

Organizations can scale cloud BI up and down as needed without having to build -- and pay for -- the infrastructure required for peak capacity.

"If there's a huge demand for a BI tool at a particular time, the underlying infrastructure that has to be accordingly expanded isn't something the organization has to worry about. That's a performance or technical benefit," said Vinay Manne, partner at Guidehouse, which provides consulting services for management, technology and risk, including cloud and digital services.

This provides better value because the cost of cloud BI scales based on consumption. That can mean lower costs than running BI software on premises, where IT must maintain enough infrastructure to meet peak demand, even if that's infrequent.
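
A toy calculation makes that consumption-versus-peak argument concrete. The rates and demand profile below are invented purely for illustration and are not drawn from any vendor's pricing:

```python
# Hypothetical numbers: on-premises capacity must be sized for peak demand,
# while cloud BI is billed on actual consumption.
HOURS_PER_MONTH = 730

peak_units = 100      # capacity units needed at the busiest hour
avg_units = 20        # average units actually consumed
on_prem_rate = 0.10   # assumed amortized $/unit-hour for owned hardware
cloud_rate = 0.15     # assumed (higher) $/unit-hour consumption price

on_prem = peak_units * HOURS_PER_MONTH * on_prem_rate
cloud = avg_units * HOURS_PER_MONTH * cloud_rate

print(f"On-premises, sized for peak: ${on_prem:,.0f}/month")
print(f"Cloud, pay-per-use:          ${cloud:,.0f}/month")
# The cloud rate per unit is higher, but idle peak headroom is never billed.
```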

Organizations often have multiple on-premises BI tools because different teams within the business prefer different products. That perpetuates the problem of data silos.

Cloud BI, on the other hand, can integrate with other systems and modern cloud data warehouses, so it's able to quickly ingest and process the different kinds of data. This removes the data silos and enables teams to fetch data from all the different systems the organization uses, said Vishal Gupta, vice president at the research firm Everest Group.

Early generations of on-premises BI tools required users to have technical skills, so technologists were typically the ones to run and build reports when needed by business units. But vendors that offer BI software as a service include more self-service features. This enables business users to analyze data, create reports and visualize information through dashboards on their own.

"With on-premises BI, there were limitations around data and data literacy, and you needed IT to step in and create new models for people to build new dashboards," Martelli said. "And although [second-generation] BI tools had more capabilities, they still sometimes required IT to help."

Cloud-based BI software continues to evolve with the goal of "getting the most amount of data to the most people in the quickest, easiest way without having to deal with IT for changes and integrations," Martelli said. "These BI platforms have become very decentralized and low-need for IT."

Organizations that adopt cloud BI can put more analytics capabilities into the hands of the business users who need to understand the data and generate insights to make decisions. Giving those capabilities to the right people accelerates decision-making, potentially in real time.

Thanks to the nature of the cloud, SaaS-based BI delivers more capabilities and features, which makes it more user-friendly than the prior generation of on-premises BI tools.

Cloud BI tools allow for customized dashboard and data visualization; have interactive visualizations and storyboarding capabilities; feature easier integration with data sources; and offer traditional BI capabilities as well as an increasing amount of advanced analytics options.

Some organizations still use on-premises BI software, and experts note that they tend to do so because they're early in their overall cloud journeys. These organizations have yet to move their core transactional systems to the cloud, and those systems hold the data required for analysis on premises.

Others have adopted cloud BI, but are not making the most of what they have.

"Many companies do not fully appreciate the capabilities provided by the BI vendors," Martelli said.

Enterprise leaders can't assume their workers will make the most of BI software simply because they now have access to more capabilities and those capabilities are more user-friendly. They should continue to build a data-literate culture that includes good data governance practices so insights from BI tools are accurate and trustworthy.

"At the end of the day, the objective is to use BI and analytics solutions to provide decision-support from an agile standpoint," Manne said.

Visit link:
5 benefits cloud BI has over on-premises options - TechTarget

Nvidia is showing shades of Apple and its stock could hit $625, says … – Morningstar

By Emily Bary

The 'obvious flagship AI company' can keep racking up stock gains even with its roaring run to start the year, analysts say

Nvidia Corp. continues to rack up bullish endorsements as analysts contemplate the vast opportunities ahead for the leader in artificial-intelligence chips.

Nvidia (NVDA) "is the obvious flagship AI company, whose decisions over the last two decades have positioned it for long-term benefits," wrote Melius Research analyst Ben Reitzes, who initiated coverage of the stock late Monday with a buy rating and $625 price target.

The company is reminding him of Apple Inc. (AAPL) "with a full-stack approach that in our experience tends to deliver an outsized profit share in the industry for longer than expected once the ball starts rolling downhill due to developer support and becoming an industry standard."

See also: Nvidia's stock could have a pathway to $600, Citi says

While Nvidia shares have soared more than 200% so far this year as the now-trillion-dollar company has solidified its AI positioning in Wall Street's mind, Reitzes suggests would-be investors haven't missed the boat.

"On the rare occasion a company like this comes along, we caution investors not to get caught up in [an] arbitrary market-cap milestone and [to instead] focus on sustainable long-term earnings power," he wrote. "Apple has also shown us ... sometimes the consensus can keep being right -- and keep working when the business model is right."

Read: Nvidia could follow 'blowout' outlook with more massive upside, analyst says

Meanwhile, BofA Securities analyst Vivek Arya became more upbeat about Nvidia's stock Tuesday, boosting his price objective on the stock to $550 from $500 while keeping a buy rating.

The upcoming earnings season could lead chip stocks to diverge between the "haves" and "have-nots" of AI, and Arya likes Nvidia's stock setup even more in that scenario.

"The next few quarters could see greater dispersion among chip stocks based on real vs. overstated/perceived AI benefits," Arya wrote Tuesday, calling Nvidia a player that "can hold its dominance." He expects the company to keep a 75% to 80% share of the market for accelerators.

More from MarketWatch: Amazon, Microsoft and Google cloud services bet heavily on AI, but do their customers even want it?

Arya noted that Nvidia's accelerators for servers are critical to driving intensive AI workloads and that the market opportunity is still ripe, since only around 10% of cloud servers are currently "accelerated," adding that "we are in just the early stages of AI investment."

He also highlighted that Nvidia houses other AI products including networking and software offerings that can help customers position themselves for the world of AI.

Read: AMD may suffer from elevated expectations -- but Wells Fargo is warming to Intel's setup

-Emily Bary

This content was created by MarketWatch, which is operated by Dow Jones & Co. MarketWatch is published independently from Dow Jones Newswires and The Wall Street Journal.

(END) Dow Jones Newswires

07-18-23 1107ET

See the original post here:
Nvidia is showing shades of Apple and its stock could hit $625, says ... - Morningstar

Why We Need Software Monitoring – Forbes

Featured image: a CCTV camera observes a woman walking in the Embankment area of central London (Leon Neal/AFP via Getty Images).

Clouds are natural. Clearly, the vapor-like billowing mists that make up our planet's cloud formations and systems are part of the natural phenomena that make our world so special. Computing clouds are obviously less natural, i.e. we fabricate them out of virtualized compute instances that we define via Software-as-a-Service (SaaS) processes and tools that enable us to provision them for specific tasks and functions.

But as digital and ordered as they are, computing clouds often work themselves into a state of tension, almost quite naturally. To be fair to the cloud, it is us, the users (and the machines that we also empower to connect with the cloud), who knock cloud instances out of kilter as we overload them, misconfigure them, and integrate them with non-native services that they don't dovetail or balance well with.

What these realities bring us to is a point where cloud monitoring has become a subset specialist discipline in and of itself. We have cloud observability specialists, we have Application Performance Management (APM) specialists and we have cloud-native security specialists that devote a large proportion of their efforts to cloud controls - and then we have monitoring purists.

Styling itself as a dedicated monitoring vendor, eG Innovations is a company known for its cloud-based application performance and IT-infrastructure monitoring solutions. The company has monitoring tools that work on both operational clouds and on software application development environments and virtual workspaces used by its software engineers.

Technical product specialist at eG Innovations Rachel Berry says that, like many organizations, the company has evolved to have multiple on-site development teams in multiple countries. It also has a substantial number of employees who work from home, work remotely or operate on hybrid work schedules. This dispersed diversity means that eG Innovations has to keep its developers productive and content by making sure they have software tools and applications available 24/7, i.e. if someone can't check in a 'code fix', it ultimately also affects the ability to service customer support tickets.

"Developers need to be able to properly test their work, so they don't get swamped with support tickets when code goes into production or is released to customers. Collaboration tools and mechanisms are used to collect data so different teams aren't finger-pointing or blaming each other," said Berry. "We use a mixture of real user monitoring and synthetic monitoring (robot users, simulating access 24/7) to detect issues proactively and to resolve them. Virtual Desktop Infrastructure (VDI) is extremely useful for standardizing development environments and ensuring our IT teams only support a limited known configuration. VDI also helps us avoid problematic technologies such as VPNs."
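
To illustrate the synthetic-monitoring idea Berry describes, here is a minimal "robot user" probe sketch in Python. The endpoint URL, thresholds, and print-based alerting are hypothetical stand-ins, not eG Innovations' implementation:

```python
# A minimal synthetic monitoring probe: periodically request a critical
# endpoint, record latency, and flag failures before real users hit them.
import time
import urllib.request

TARGET = "https://example.internal/healthcheck"  # hypothetical endpoint
SLOW_SECONDS = 2.0

def probe(url: str) -> None:
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            elapsed = time.monotonic() - start
            if resp.status != 200:
                print(f"ALERT: {url} returned HTTP {resp.status}")
            elif elapsed > SLOW_SECONDS:
                print(f"WARN: {url} slow ({elapsed:.2f}s)")
            else:
                print(f"OK: {url} in {elapsed:.2f}s")
    except Exception as exc:
        print(f"ALERT: {url} unreachable: {exc}")

if __name__ == "__main__":
    while True:          # a real probe would run from a scheduler
        probe(TARGET)
        time.sleep(60)   # one check per minute, around the clock
```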

Thinking about the working environment that she and the team oversee, Berry explains that many of the company's developers access its VDI systems remotely from laptops. Often when they encounter user experience problems, the root cause is something associated with the physical endpoint or with the worker's home location (poor ISP connection, Wi-Fi router issues, other household members gaming or streaming and hogging bandwidth, etc.) - and this means having tools in place to troubleshoot home and remote workers' hardware and home networking.

"We have many shared resources that our developers leverage, particularly databases. If these have a problem, the effects can impact multiple teams and block progress," clarified Berry. "Having database monitoring in place that our developers have visibility on is extremely important for our business continuity. Similar services that should be monitored are systems responsible for building and delivering customer patches and responses. Uptime and performance of infrastructure services including file servers, Active Directory, hypervisors and even storage devices are important to ensure that developers remain productive."

The eG Innovations operations team works to continually provide developers with performance monitoring data (both live and historical) that allows them to assess the impact of change in IT services. Being able to automatically detect changes in an application or cloud service's baseline performance, and then correlate that change to newly deployed versions and code releases of that same application or service as they are implemented, makes development processes faster and raises quality.
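
One minimal way to implement that kind of baseline-change detection is a simple z-score over response times, sketched below with invented data; production APM tools use considerably more sophisticated baselining:

```python
# Compare post-release response times against the pre-release baseline.
# Data points and the threshold are invented for illustration.
from statistics import mean, stdev

baseline_ms = [212, 198, 205, 220, 201, 210, 207, 215]   # before release
current_ms = [251, 263, 247, 259, 255]                   # after release

mu, sigma = mean(baseline_ms), stdev(baseline_ms)
z = (mean(current_ms) - mu) / sigma

if abs(z) > 3:   # a common rule of thumb for "out of baseline"
    print(f"Regression suspected: mean {mean(current_ms):.0f}ms vs "
          f"baseline {mu:.0f}ms (z = {z:.1f}); correlate with the release")
else:
    print("Within normal variation")
```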

"This is particularly important to us for finding issues early; having Application Performance Monitoring (APM) in place alongside stress and load testing allows our developers to identify bottlenecks and their causes, even down to a single line of code in a Java or .NET application. This helps us avoid many bugs or performance issues reaching customers or even our own QA team," clarified Berry, with an appropriate nod to the use of APM, which remains a key discipline in this context.

Beyond the usual suspects - configuration management, testing and code repository review developer tools such as GitHub, Jenkins, Ansible and so on - the team also continually monitors applications and tools such as O365, Zoom and Microsoft Teams. It is important to have visibility into the root cause of issues, particularly if tools are delivered as SaaS or are cloud hosted: is it an Azure problem vs. a bandwidth issue?

The company's monitoring infrastructure is also integrated with an IT Service Management (ITSM) function and the ticketing tools it offers, such as JIRA and ServiceNow. This helps ensure the team can track and review problems and set targets for issue resolution. Treating the company's software application developers and their issues with the same diligence and urgency as its customers' is the mantra and ethos being used here.

Many development teams now deploy applications on cloud infrastructure, including public clouds such as Azure, Amazon AWS or Google GCP, for agility. Often there is a lack of coordination between the IT teams provisioning cloud resources and the development teams that need those resources. "An important decision that has to be taken when provisioning resources is the type of cloud instance to use. Development teams often describe their requirements in terms of CPU and memory needed (e.g. 4 vCPUs, 16 GB RAM), while IT teams have to provision VMs by choosing an instance type," noted Berry, in specific detail.

For example, she says, if the team uses a burstable instance type because it is cheaper (one that guarantees only a baseline of CPU and "bursts" above it using a credit balance), it may not match the resource usage needs of the development team, who may think they are getting a virtual machine (VM) with dedicated capacity. When stress testing the application, the VM may run out of CPU credits and performance may be poor, leading to developer frustration.
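
That burstable-instance pitfall can be sketched with a toy credit model. The numbers below are loosely modeled on typical burstable VM schemes and are assumptions, not any specific vendor's parameters:

```python
# Credits accrue at the baseline rate and drain whenever usage exceeds it.
baseline_pct = 20          # sustained CPU % the instance type guarantees
credits = 60.0             # starting CPU-credit balance (vCPU-minutes)
earn_per_min = baseline_pct / 100.0

# A stress test pinning the CPU at 100% for two hours:
for minute in range(120):
    credits += earn_per_min - 1.0   # earn baseline, spend 100%
    if credits <= 0:
        print(f"Credits exhausted after {minute + 1} min; "
              f"CPU throttled to the {baseline_pct}% baseline")
        break
else:
    print("Stress test completed without throttling")
```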

No amount of debugging code will reveal the cause of some issues; having the right oversight and monitoring for cloud environments is key for application success in the cloud. By monitoring and tracking the availability and performance of developers' tools and applications, we can set internal Service Level Agreements (SLAs) and Key Performance Indicators (KPIs) to quantify whether our developers - and of course our users - are getting what they need.

The only question now is, who's monitoring the monitoring team, right?

I am a technology journalist with over two decades of press experience. Primarily I work as a news analysis writer dedicated to a software application development beat; but, in a fluid media world, I am also an analyst, technology evangelist and content consultant. As the previously narrow discipline of programming now extends across a wider transept of the enterprise IT landscape, my own editorial purview has also broadened. I have spent much of the last ten years also focusing on open source, data analytics and intelligence, cloud computing, mobile devices and data management. I have an extensive background in communications starting in print media, newspapers and also television. If anything, this gives me enough man-hours of cynical world-weary experience to separate the spin from the substance, even when the products are shiny and new.

Read the rest here:
Why We Need Software Monitoring - Forbes

Chinese Hackers Breached Government Email Accounts, Microsoft … – The New York Times

Chinese hackers intent on collecting intelligence on the United States gained access to government email accounts, Microsoft disclosed on Tuesday night.

The attack was targeted, according to a person briefed on the intrusion into the government networks, with the hackers going after specific accounts rather than carrying out a broad-brush intrusion that would suck up enormous amounts of data. Adam Hodge, a spokesman for the White House's National Security Council, said no classified networks had been affected. An assessment of how much information was taken is continuing.

Microsoft said that in all, about 25 organizations, including government agencies, had been compromised by the hacking group, which used forged authentication tokens to get access to individual email accounts. Hackers had access to at least some of the accounts for a month before the breach was detected, Microsoft said. It did not identify the organizations and agencies affected.

The sophistication of the attack and its targeted nature suggest that the Chinese hacking group was either part of Beijing's intelligence service or working for it. "We assess this adversary is focused on espionage, such as gaining access to email systems for intelligence collection," Charlie Bell, a Microsoft executive vice president, wrote in a blog post on Tuesday night.

Although the breach appeared to be far smaller in scale than some recent intrusions like the SolarWinds hack by Russia in 2019 and 2020, it could provide information useful to the Chinese government and its intelligence services, and it threatened to further strain relations between the United States and China.

The vulnerability the hackers exploited appeared to be in Microsoft's cloud security and was first detected by the U.S. government, which immediately notified the company, Mr. Hodge said.

Inside the government, the attack showed a significant cybersecurity gap in Microsoft's defenses and raised serious questions about the security of cloud computing, the person briefed on the intrusion said. The government has been moving data to the cloud, which promises better access to information and improved security, because pushing out patches to vulnerabilities is faster. The U.S. also operates classified cloud servers, but they have more security protocols in place.

The person briefed on the intrusion said that government security requirements should have prevented the breach, and that Microsoft has been asked to provide additional information about the vulnerability.

"We continue to hold the procurement providers of the U.S. government to a high security threshold," Mr. Hodge said.

The hack comes at a delicate point in U.S.-China relations, as the Biden administration seeks to cool tensions that have been aggravated in recent months by several incidents including the transit of a Chinese spy balloon across the United States. It could increase criticism that the Biden administration is not doing enough to deter Chinese espionage.

Cliff Sims, a former spokesman for the director of national intelligence in the Trump administration, said China had been emboldened because President Biden had not confronted Beijing over its attempts to influence recent elections.

"We need to have some serious conversations about how much hacking we'll tolerate before taking action," Mr. Sims said.

Mr. Bell, in the blog post, said that people affected by the hack had been notified and that the company had completed efforts to mitigate the attack. But government officials are continuing to ask the company to provide more details of the vulnerability and how it occurred, according to the person briefed on the intrusion.

Microsoft said it was told of the intrusion and compromise on June 16. The company's blog post said the Chinese hacking group first gained access to email accounts a month earlier, on May 15.

Microsoft did not say how many accounts it believes might have been compromised by the Chinese hackers.

China has one of the most aggressive and most capable intelligence hacking operations in the world.

Beijing has, over the years, carried out a series of hacks that have succeeded in stealing huge amounts of government data. In 2015, a data breach apparently carried out by hackers affiliated with China's foreign spy service stole huge numbers of records from the Office of Personnel Management.

In the SolarWinds hack, which took place during the Trump administration, Russian intelligence agencies used a software vulnerability to gain access to thousands of computer systems, including many government agencies. The hack was named after the network management software the Russian agencies had exploited to get into computers around the world.

Read more:
Chinese Hackers Breached Government Email Accounts, Microsoft ... - The New York Times

MoonQube Launches New Website and Data Hosting Platform – AccessWire

ATLANTA, GA / ACCESSWIRE / July 17, 2023 / MoonQube, an all-new cloud hosting service, is excited to announce its launch. By taking customer-focused service to the next level, MoonQube's revolutionary approach offers businesses unique opportunities to ensure seamless operations, data access and responsive availability for their clients.

In today's competitive market, the ability for a company to have its operations, data and website on fast and reliable servers is crucial to success. More and more businesses are migrating their operating systems, applications, and data storage to the cloud. This shift allows employees and customers to have 24/7 access to everything they do while minimizing overhead costs and environmental footprint.

MoonQube fills that sought-after niche in the marketplace by providing affordable, advanced and state-of-the-art services around the world. MoonQube's core values of stability, affordable pricing, security, customer service and transparency ensure that customers remain at the center of what they do.

What does MoonQube offer?

Domains and Web Hosting: For many companies, their website is the primary interface with new and existing customers; it's a vital part of doing business, and so too is reliable website hosting. MoonQube helps with domain features and website migrations with no interruption in availability or responsiveness.

Object and Block Storage: Large image, audio and video files can quickly clog up onsite data storage options; with MoonQube those files can be stored in a secure, redundant and instantly-available space that customers organize and configure themselves.

Virtual Machines (Qubes) and Kubernetes: Whether your focus is security, ease of use, scalability or automation, we offer containerized and isolated hosting options that meet a variety of needs.

What sets MoonQube apart is its approach to personalization, allowing clients to customize what works best for their company's unique needs. MoonQube provides technology support that is scalable, simple, affordable and accessible for businesses of all sizes. This means that data and applications are not only available lightning-fast anywhere in the world but are backed up constantly, providing the needed redundancy to prevent loss of invaluable tools and work progress.

MoonQube also is focused on ensuring that its offerings are easy to interface with and affordable for every budget. Behind everything they do, however, is a dedication to 24/7 live customer support, accessible whenever it is needed.

For more information, to receive updates or to set up an account, visit http://www.moonqube.com.

Media Inquiries:

Morgan GeorgeForum Communications678-629-7797[emailprotected]

SOURCE: MoonQube

See the original post:
MoonQube Launches New Website and Data Hosting Platform - AccessWire

iOS 16.6 RC is now available to developers with bug fixes and … – BGR

A week after releasing beta 5, Apple is now seeding iOS 16.6 RC to registered developers. Even though the Cupertino firm is nearing the release of this upcoming iOS 16 update for iPhone users, it's unclear what new features the operating system will bring besides important bug fixes and security updates.

After a small iOS 16.5 update, Apple is still sharpening its software as the company prepares iOS 17, iPadOS 17, tvOS 17, watchOS 10, and macOS Sonoma for a fall release.

An early iOS 16.6 beta build added references for Contact Key Verification, according to iOS developer Steve Moser.

iMessage Contact Key Verification is a feature announced by the company at the end of 2022. While Apple has already rolled out Security Keys for Apple ID and Advanced Data Protection for iCloud during the iOS 16 cycle, the firm still had to release that other function.

For those unaware, iMessage Contact Key Verification lets users verify if they are communicating only with whom they intend. This is especially helpful for journalists, human rights activists, and government members.

Apple says that conversations between users who have enabled iMessage Contact Key Verification receive automatic alerts if an exceptionally advanced adversary, such as a state-sponsored attacker, were ever to succeed in breaching cloud servers and inserting their own device to eavesdrop on these encrypted communications. And for even higher security, iMessage Contact Key Verification users can compare a Contact Verification Code in person, on FaceTime, or through another secure call.

As of now, it's unclear if iOS 16.6 will bring this feature, but Apple has already started testing it. In addition, Steve Moser says Apple continues to work on sports-related features in the TV app, although he couldn't disclose what's new.

Besides iOS 16.6, Apple is also seeding the RC versions of iPadOS 16.6, watchOS 9.6, macOS 13.5, and tvOS 16.6. BGR will let you know what's new when we learn more about these operating system updates.

See the article here:
iOS 16.6 RC is now available to developers with bug fixes and ... - BGR

The Energy Crunch: AI Data Centers and the Battle for Power – Digital Information World

The demands on energy in a cloud-powered world are astonishing, and the emergence of generative AI is just exacerbating the situation. Tom Keane, a Microsoft data center veteran, has warned about AI models' energy consumption and the issues existing data centers confront.

AI model training in data centers may consume up to three times the energy of regular cloud workloads, putting a strain on infrastructure. The present generation of data centers is unprepared to meet the increased demand for AI-related activities. Last year, Data Center Alley in Northern Virginia nearly faced a power outage, foreshadowing the looming energy issues.

Access to power is becoming increasingly important as big businesses like Amazon, Microsoft, and Google race to satisfy the need for generative AI. The current data center infrastructure cannot accommodate this next wave of technologies. Data center power usage is expected to exceed 35 gigawatts (GW) per year by 2030, more than tripling the amount consumed last year.

AI model training, in particular, is highly energy-intensive, requiring large amounts of power for graphics processing units (GPUs). AI servers with many GPUs may require up to 2 kilowatts of power, as opposed to 300 to 500 watts for a standard cloud server. This growth in energy demand presents data center operators with new hurdles.
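
Some back-of-the-envelope arithmetic on those figures shows why operators are worried; the fleet size here is hypothetical:

```python
# Extra demand from replacing standard cloud servers with multi-GPU AI servers,
# using the per-server figures quoted above.
standard_watts = 400      # midpoint of the 300-500 W range quoted
ai_watts = 2_000          # multi-GPU AI server, per the article

servers = 10_000          # hypothetical fleet size
delta_mw = servers * (ai_watts - standard_watts) / 1e6
print(f"Replacing {servers:,} standard servers with AI servers adds "
      f"{delta_mw:.0f} MW of demand ({ai_watts / standard_watts:.0f}x per server)")
```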

To overcome these issues, businesses like DigitalBridge are investing billions of dollars in constructing and renovating data centers mainly intended for generative AI workloads. Smaller data centers are intentionally located in suburban locations, away from big markets, to connect to existing electrical networks without overburdening them. These sites provide higher power density, quicker connectivity, and reduced expenses.

The next wave of data centers will not cluster in established hubs like Virginia or Santa Clara; it will instead emerge in low-cost places where electricity supply is not a constraint. To satisfy the expectations of AI growth, data center operators must adapt and adopt novel solutions.

As AI data centers struggle to meet the increasing demand for generative AI, the competition for power heats up. Companies are trying every possible tactic to secure success, laying the groundwork for a transformational energy future.

The struggle for power becomes increasingly important as the fight for AI dominance proceeds. The present infrastructure is not equipped to meet the energy requirements of AI data centers. The sector can only secure a sustainable and efficient future by adopting innovative techniques and exploring new locations.

Fasten your seat belts and buckle up for the insane yet amazing technological ride ahead, as the world of AI and data centers embarks on a trip to secure the energy supplies that will define the future of technology.

Read next: Mobile App Economy Surges: Global In-App Expenditure Skyrockets to Record $67.5 Billion In H1 2023

The rest is here:
The Energy Crunch: AI Data Centers and the Battle for Power - Digital Information World

Machine Learning and Data Engineering Applications in Agriculture – TechNative

"In God we trust; all others bring data." - William Edwards Deming

This article presents the main problems that farmers face on an everyday basis, such as climate change, manual time-consuming tasks, and changes in consumers' everyday diets, along with possible solutions that Machine Learning and Data Engineering can offer through tools such as IoT, data collection, computer vision, blockchain, and Augmented Reality.

Impact of cultural transformations on Productivity in Agriculture

Over the past centuries, human beings have gone through transformations that led to agricultural revolutions. Each revolution played a key role in increasing production and optimizing the production process. The long road from the first revolution to the fourth traces a path from hunting and gathering to digitalization and artificial intelligence. Let's dive into the details; there are four agricultural revolutions:

First agricultural revolution (1700 onwards). This revolution is characterized by stationary farming, with core principles mainly based on manual labor, horsepower, and simple tools, which means that productivity remained relatively low during that period.

Second agricultural revolution (1914-1980s). This period saw a shift from natural nitrogen supplementation to synthetic fertilizers. The introduction of crop rotation and drainage dramatically increased crop and livestock yields, improved soil fertility, and reduced fallow. An increase in production and a decrease in labor demand led to migration and urban expansion.

Third agricultural revolution (1920s-present). This revolution is all about combustion engines and rural electrification, the introduction of biotechnology and genetic engineering, alongside computerized programs. The results of this revolution can be observed in today's markets.

Fourth agricultural revolution (1970s onwards). This revolution marks a transition from industrial production to a digital model that optimizes production processes, reduces time and cost, and enhances customer value. At this point, the agricultural industry started to operate with key technologies such as the Internet of Things, Big Data, Artificial Intelligence, cloud computing, remote sensing, and the ingestion and processing of big data into Data Lakes as a foundation for decision-making.

The first and second agricultural revolutions show a transition from manual labor to mechanized production, while the third and fourth show the importance of computerization and the collection of data. Nonetheless, there are still enormous problems in the agricultural sector of the economy that seek solutions in Machine Learning.

Problems of traditional agriculture and the Role of Machine Learning

Demand for agricultural production has increased sharply over the last year and is likely to be one of the major causes of inflation all over the world in the near future. In the meantime, intensive growth in agricultural yields is limited by a number of external factors:

A limited global land surface is suitable for cultivation, based on climate conditions, good soil, and urban development. According to current statistics, approximately 40% of the land is covered by jungles, deserts, urban areas, or other natural land states such as forests, so very little land is left for agricultural expansion.

Constantly changing consumer dietary habits and patterns push farmers to shift from one type of production to another. For example, demand for meat products is rising rapidly in societies due to inequality among the population.

Climate change and natural disasters, which have increased over the last century, are likely to result in more extreme weather patterns, with average temperatures increasing, leading to fluctuating yields and production shortfalls.

The application of Machine Learning in the agricultural sector can ease the above-mentioned problems. In order to better understand the interplay between these two fields, let's look at an example. Imagine a farmer who relies mainly on calculations of input efforts and output yields, forecasting his or her profit based on scientific calculations. At the same time, this farmer operates with data from sensors on the machinery, such as crop and GPS data, whilst other data is retrieved from a drone and correlated against GIS information. The farmer can also observe pricing data, livestock position, and demand for the product via third-party cloud servers (diagram 1). All this together creates a picture of the potential crop value and demand. Weather forecasts come from open sources and clearly predict conditions for the upcoming week, while sensors detect moisture in the ground and check the health status of plants and animals. All this data is gathered and stored for future analysis.
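
A minimal sketch of the kind of data fusion this hypothetical farmer performs appears below, with sensor, weather, and pricing feeds reduced to simple decision rules; all field names and thresholds are invented for illustration:

```python
# A toy fusion of field-sensor, weather, and market feeds into decisions.
from dataclasses import dataclass

@dataclass
class FieldSnapshot:
    soil_moisture_pct: float   # from in-ground moisture sensors
    rain_forecast_mm: float    # from an open weather feed
    crop_price_per_ton: float  # from a third-party pricing service

def irrigation_advice(snap: FieldSnapshot) -> str:
    """Combine sensor and forecast data into a simple action."""
    if snap.soil_moisture_pct < 25 and snap.rain_forecast_mm < 5:
        return "irrigate today"
    if snap.soil_moisture_pct < 25:
        return "hold off: rain is expected"
    return "no irrigation needed"

def sell_signal(snap: FieldSnapshot, target_price: float = 230.0) -> bool:
    """Flag when the market price feed crosses the farmer's target."""
    return snap.crop_price_per_ton >= target_price

snap = FieldSnapshot(soil_moisture_pct=18.0, rain_forecast_mm=1.2,
                     crop_price_per_ton=240.0)
print(irrigation_advice(snap))         # -> irrigate today
print("sell now:", sell_signal(snap))  # -> sell now: True
```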

Tracking of every product entity is simple and can be clearly observed on dashboards by the farmer and the final customer. Combining all this information could save a lot of effort, help organizations work more efficiently, and solve the main problems that farmers face nowadays.

Integration of Machine Learning and Data Engineering into Agriculture

The COVID-19 pandemic had a huge impact on agriculture and, at the same time, on the development of Machine Learning and Data Engineering. The main goal of integrating Machine Learning into agriculture is to increase final crop yield, save effort and resources, and help control every step of plants' and animals' growth. There are a number of techniques in Machine Learning that help collect agricultural data and ease the process for farmers. The following applications could be of great use in developing the link between data collection and actual production.

1) A unified protocol

It is useful to have a single, unified protocol for cross-manufacturer compatibility of electric and electronic components. All mechanical and automotive devices have to be combined, like LEGO pieces, into one big machine, and all parts of the final construction should communicate with the others via protocols. Such a unified protocol, based on the International Standard ISO 11783, has been applied all over the world since 2008.

2) Internet of Things (IoT)

Different devices in a system have to be connected to the Internet and able to interact with one another in real time (diagram 2). The number of sensors and their applications is growing every year and will probably reach around 250 billion in the next 5 years. Given this incredible number of sensors, collecting and storing their information in a single place becomes a non-trivial task for software products; a sketch of a simple ingestion path follows.
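
Below is a minimal sketch of that ingestion task: parsing sensor messages and collecting them in one store. The JSON payload format and the in-memory store (standing in for a time-series database) are assumptions:

```python
# Minimal ingestion of heterogeneous sensor messages into one store.
import json
import time
from collections import defaultdict

store = defaultdict(list)   # sensor_id -> list of (timestamp, value)

def ingest(raw: bytes) -> None:
    """Parse one sensor message and append it to the central store."""
    msg = json.loads(raw)
    store[msg["sensor_id"]].append((msg.get("ts", time.time()), msg["value"]))

# Messages as they might arrive from field devices:
ingest(b'{"sensor_id": "soil-07", "value": 23.4}')
ingest(b'{"sensor_id": "soil-07", "value": 22.9}')
ingest(b'{"sensor_id": "tractor-gps-02", "value": [49.84, 24.03]}')

for sensor_id, readings in store.items():
    print(sensor_id, "->", len(readings), "reading(s)")
```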

3) Drones and remote sensing

Developments in information technology and agricultural science have made it possible to merge drones and remote sensing, leading to the rise of precision farming. Such a scheme brings maximum profit and production with minimum input and optimal use of resources. One interesting application is based on global positioning system (GPS) and GIS technologies that help calculate optimal paths for tractors: with machine learning algorithms, a challenging university trigonometry task is transformed into a simple solution - and real money not spent on extra fuel.
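
A toy version of that path-optimization idea: covering a rectangular field with a serpentine (boustrophedon) route instead of returning to the headland after every pass. The field dimensions below are invented:

```python
# Compare a naive out-and-back coverage plan with a serpentine route.
import math

field_width_m = 400       # hypothetical field dimensions
field_length_m = 800
implement_width_m = 8     # working width of the implement

rows = field_width_m // implement_width_m   # 50 passes to cover the field

# Naive plan: drive each row, return empty along the same row.
naive_m = rows * 2 * field_length_m
# Serpentine plan: each row once, half-circle turn into the adjacent row.
turn_m = math.pi * implement_width_m / 2
serpentine_m = rows * field_length_m + (rows - 1) * turn_m

print(f"naive: {naive_m / 1000:.1f} km, serpentine: {serpentine_m / 1000:.1f} km, "
      f"saving {(naive_m - serpentine_m) / 1000:.1f} km of driving (and fuel)")
```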

4) Data collection and social network communication

The key here is the creation of an efficient chain linking local food production systems and livestock systems. This approach will create a greater understanding of the efficiency of the entire food supply chain, and the integration of these two systems will generate long-term positive environmental impact and deliver greater food security.

5) Computer vision

Image analysis and detection are among the most intensively growing fields in informatics research. All automated machines start with sensing, most often using cameras to obtain data that provides information about the crop and the location of the harvesting system. Typically, this is an RGB camera, depth camera, or lidar system. Images are passed to machine learning pipelines based on effective classification approaches, including support vector machines (SVMs), neural networks, k-means, principal component analysis (PCA), feature extraction, etc. Different teams have developed a variety of applications on top of such pipelines; a sketch of the basic pattern follows.
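
As a sketch of that basic pattern, the following uses scikit-learn's bundled digits dataset as a stand-in for crop or weed image patches (an assumption for illustration), feeding PCA-reduced features into an SVM classifier:

```python
# PCA for feature reduction feeding an SVM classifier, as listed above.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)          # 8x8 images, flattened to 64 features
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

pipeline = make_pipeline(
    StandardScaler(),        # normalize pixel features
    PCA(n_components=30),    # compress 64 pixels to 30 components
    SVC(kernel="rbf"),       # the SVM classifier itself
)
pipeline.fit(X_train, y_train)
print(f"held-out accuracy: {pipeline.score(X_test, y_test):.2%}")
```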

6) Data transparency and blockchain

A blockchain is a method of storing encrypted data that records every single transformation applied to a target entity, such as storing, linking, and recovering. The modern agricultural industry has accelerated and now uses blockchain in the agriculture value chain because it is seen as a mechanism for addressing different issues, such as transparency, cost-effectiveness, traceability, quality supply systems, etc. For example, the French food retailer Carrefour has been using blockchain solutions for the traceability of its products since 2018. The aim is to provide a QR code that consumers can scan to retrieve data about the product on their mobile phones. The information available through the code includes the place and date of production, the product's composition, the method of cultivation, etc.
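
The core idea behind such traceability can be sketched in a few lines: each supply-chain event is chained to the previous one by a hash, so later tampering is detectable. The event fields below are hypothetical, not Carrefour's schema:

```python
# A minimal hash chain over supply-chain events.
import hashlib
import json

def add_block(chain: list, event: dict) -> None:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(event, sort_keys=True)
    block_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    chain.append({"event": event, "prev": prev_hash, "hash": block_hash})

def verify(chain: list) -> bool:
    """Recompute every link; any edited event breaks the chain."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        payload = json.dumps(block["event"], sort_keys=True)
        expected = hashlib.sha256((prev + payload).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
    return True

chain: list = []
add_block(chain, {"step": "harvested", "farm": "FR-0421", "date": "2023-06-02"})
add_block(chain, {"step": "packaged", "plant": "LY-77", "date": "2023-06-04"})
add_block(chain, {"step": "shelved", "store": "Paris-12", "date": "2023-06-06"})
print("chain valid:", verify(chain))   # what a QR-code lookup would confirm
```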

7) Augmented Reality

This field is only at the development stage but has already demonstrated high potential where 3D image resolution is required: visualizing animals, their diseases, and crop damage in order to assess conditions and carry out treatment. Augmented Reality promises a lot in the near future, especially when combined with Artificial Intelligence (AI).

The importance of Machine Learning and Data Engineering in agriculture should not be underestimated. Implementation of these new techniques could benefit farmers by increasing revenue and benefit customers by saving them time.

Conclusion

The evolution of transformations in agriculture outlines the main changes the agricultural sector has historically gone through and, at the same time, points out the problems farmers face nowadays. In the era of computers and digital communication, farmers seek ways to increase profit, while consumers demand quality service in a short amount of time. In order to satisfy these needs, the Sigma Software group has researched and developed several approaches, such as data communication, computer vision, blockchain, and augmented reality. These approaches, along with other Machine Learning and Data Engineering tools, create a core outline for the agricultural sector. As a result, in order to increase revenues and create a more efficient supply chain, farmers need to rely more on such machinery.

About the Author

Ihor Oleinich is a Software Developer at Sigma Software Group. Sigma Software provides top-quality software development solutions and IT consulting to more than 170 customers all over the globe. Volvo, SAS, Oath Inc., Fortum, IGT (previously GTECH), Checkmarx, Formpipe Software, JLOOP, Vergence Entertainment, Collective, Genera Networks, Viaplay, and others trust the company to develop their products. Clients choose Sigma Software for timely and efficient communication, flexibility, and a strong desire and ability to reach their business goals.

Featured image: slonme

Link:
Machine Learning and Data Engineering Applications in Agriculture - TechNative