
Bust latency with monitoring practices and tools – TechTarget

Latency originates from two separate sources: a data center's network or its storage system. To reduce latency in your data center, consider its potential causes, then evaluate the various ways to troubleshoot it.

You can implement a variety of tools to help manage latency. Consider using latency monitoring software -- such as EdgeX Foundry or traceroute -- to pinpoint bottlenecks or keep tabs on network speeds, or adopt more latency-resistant hardware, including NVMe drives, persistent memory and SD-WANs.

Latency is the primary way to judge overall performance when it comes to storage systems. Low latency leads to faster transactions, which in turn leads to reduced storage costs for your business.

Storage latency comes from four main sources: storage controllers, storage software stacks, internal interconnects and external interconnects. You can reduce latency in each of these sources by selecting a fast CPU for your storage controller server, adopting storage software that prioritizes efficiency and CPU offload, implementing remote direct memory access networking and utilizing NVMe drives.

Persistent memory can also optimize storage and cut down on storage latency. It connects directly to the memory bus and offers two separate operating modes -- one that presents it as volatile main memory, and the other that uses it as a high-performance storage tier.

Network latency is the time between a request for data and the delivery of that data, and it affects an entire infrastructure. High network latency can increase load times and even render certain applications unusable. Network latency usually comes from sources such as poor cabling, routing or switching errors, storage inefficiencies or certain security systems.

To improve network latency, start by measuring packet delay. Know how long it takes for your network to fulfill a request. Tools such as Ping, Traceroute and MTR can help you with this. Next, identify potential bottlenecks in your network. Depending on the source of your network latency, you can take steps such as improving routers or implementing network speed amplifiers. Finally, introducing nearby edge servers can also reduce networking strain and improve latency. Such edge servers can shorten the distance that a request packet must travel, thereby improving your system's response time.
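
As a concrete illustration of that first step, here is a minimal sketch of the packet-delay measurement that tools like Ping automate. It assumes a Unix-like host with the standard ping utility on the PATH; the target hosts are placeholders, not recommendations.

```python
import re
import subprocess

def average_rtt_ms(host: str, count: int = 4):
    """Ping `host` and return the average round-trip time in milliseconds."""
    result = subprocess.run(
        ["ping", "-c", str(count), host],
        capture_output=True, text=True,
    )
    # Both GNU and BSD ping print a summary line such as:
    # rtt min/avg/max/mdev = 9.1/10.4/12.7/0.9 ms
    match = re.search(r"= [\d.]+/([\d.]+)/", result.stdout)
    return float(match.group(1)) if match else None

for target in ("8.8.8.8", "example.com"):  # placeholder targets
    rtt = average_rtt_ms(target)
    print(f"{target}: {rtt} ms" if rtt is not None else f"{target}: unreachable")
```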

Cloud latency can create significant issues for both organizations and end users. Distance often causes the majority of cloud latency, but equipment such as WANs can also create cloud latency problems.

Implementing SD-WAN instead of traditional WAN networking can reduce cloud latency. Most SD-WAN offerings feature increased reliability, end-to-end security, extensibility and management automation. SD-WAN can also improve remote connections, though it requires virtual endpoint appliances.

Edge computing moves data and calculations out of the data center to edge locations. To minimize decision-to-action latency, some cloud providers have even moved their cloud environments to the edge. This process cuts out public commercial internet traffic to enable faster and more efficient delivery of services to customers.

However, due to its remote nature, the edge can present its own problems with latency. Software that monitors edge devices should measure latency in real time. Edge device monitoring services such as AWS IoT services, EdgeX Foundry and FNT Command all possess latency monitoring tools or features.

When monitoring latency in large, complex systems, first ensure that monitoring latency won't increase latency. Synthetic monitoring and log monitoring tools can often do more harm than good when it comes to latency issues. Metrics- and event-based monitoring tools cause less strain in comparison but can also still increase latency.

You can ensure your latency monitoring tools don't negatively affect latency by evaluating and altering the sequence of scripts your monitoring tools run. This enables you to scale back on the frequency of latency testing and prevents your latency monitoring tools from creating issues.
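As an illustration of that idea, the sketch below runs lightweight probes sequentially on a spaced-out, jittered schedule so the monitoring itself never hammers the network. The probe functions and intervals are assumptions for illustration, not any particular tool's API.

```python
import random
import time

def run_probe_schedule(probes, interval_s=60.0, jitter_s=5.0, rounds=3):
    """Run latency probes sequentially on a spaced-out, jittered schedule.

    Probes run one at a time so they never contend with each other, and
    the random jitter keeps probes on many hosts from synchronizing and
    spiking the network at the same instant.
    """
    for _ in range(rounds):
        for probe in probes:
            probe()
        time.sleep(interval_s + random.uniform(0.0, jitter_s))

if __name__ == "__main__":
    # A trivial placeholder probe; in practice this would wrap ping,
    # MTR, or an HTTP timing check.
    run_probe_schedule([lambda: print(time.strftime("%H:%M:%S"), "probe ran")],
                       interval_s=10.0, rounds=3)
```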


Mercy Ships Sends Out Clear Signal of Hope With Intellian Technology – The Maritime Executive

Image courtesy Mercy Ships / Intellian

Published Oct 24, 2021 6:08 PM by The Maritime Executive

Two years ago, Intellian struck up a close working relationship with the international surgical care charity Mercy Ships. With its International Support Center (ISC) located in Garden Valley, Texas, Mercy Ships has branches in the UK, West Africa, Canada, Australia, New Zealand, South Africa, Sweden, Norway, Denmark, Korea, France, Switzerland, Spain, the Netherlands and Belgium. Through its floating hospitals, the organization's work encompasses a range of procedures, including orthopedic, maxillofacial, ocular and reconstructive surgeries; mental health services; healthcare training projects; and palliative care.

Since its inception in 1978, Mercy Ships has provided free medical aid plus physical and moral support for people in countries ranked in the lower third of the World Health Organization's Human Development Index. The organization has shown unwavering determination in its quest to eradicate diseases of poverty.

For the most part, such diseases should be relatively straightforward to prevent or treat, but the World Health Organization has estimated that over 50 percent of the population in the poorest parts of Africa and Asia lack regular access to essential medicines and healthcare. With only around half of sub-Saharan African children vaccinated against childhood diseases, many die from preventable diseases such as diarrhea, measles, malaria and malnutrition-related causes. The work carried out by Mercy Ships could not be more vital.

The pandemic has been a devastating setback -- a global catastrophe that has only served to make extreme situations even more disastrous. However, last year's challenges have only strengthened Mercy Ships' resolve, and Intellian Technologies is with them every step of the way.

Beginning in 2019, Mercy Ships successfully utilized an Intellian v240C C-band VSAT solution on board its hospital ship Africa Mercy. "This has now been updated to an Intellian v240M 2 dual-band, multi-orbit VSAT system using a connectivity solution from SES, which has been our provider of choice in scoping out the antennas for Global Mercy, the new vessel coming out of China," says Jonathan Dyson, Director of Enterprise Infrastructure at Mercy Ships.

"We wanted to future-proof our installation. Historically, we had the antenna as a lease from another provider, so now that we're moving forward and purpose-building our ship, we decided that we wanted to own our antennas and switched providers," he said. "If we need to upgrade, downgrade, switch from GEO to MEO or even LEO constellations, we can do that with SES because of the antennas that Intellian has provided."

Seamless connectivity

Global Mercy, Mercy Ships' new vessel, is the largest civilian hospital ship in the world. It can accommodate up to 950 people in port, including 641 volunteer crew members, and its six operating theaters and hospital wards are designed to manage 200 patients at a time. The company estimates that it will more than double Mercy Ships' current surgical and training capacity. By the time the vessel has seen out its anticipated 50-year working lifespan, more than 150,000 lives will have been improved and saved by the surgeries carried out on its purpose-built hospital decks.

Global Mercy completed its deep-water sea trials in May 2021, during which exhaustive tests were carried out to ensure that engine performance, manoeuvrability and fuel consumption are all on target. Safety evaluations were also conducted, as were a series of tests to confirm that all onboard equipment and systems are fully operational, in advance of the vessel entering service in sub-Saharan Africa next year.

Global Mercy is fitted with dual 2.4m Intellian v240MT 2 antennas, integrated and controlled by Intellian's Intelligent Mediators, which guarantee seamless connectivity by switching between antennas should the vessel's superstructure ever block a satellite signal. Installation of the antennas was completed just a few weeks ago, and it was a concerted team effort that overcame several logistical and circumstantial difficulties, as Dyson recalls.

The 2.4m Intellian v240MT 2 antenna

"Intellian and SES collaborated to get all the hardware installed and take it out on the last sea trial to make sure that it was working. It was good to have all these teams working together positively and get that accomplished," he said.

Mercy Ships' increased investment in Intellian hardware -- upgrading the system on Africa Mercy and installing a top-of-the-range, orbit-agnostic v240MT 2 dual-antenna solution on Global Mercy -- directly reflects the company's operational growth and the changing landscape of connectivity, according to Dave Shwadlenak, VP of IT at Mercy Ships.

"As Mercy Ships has matured more and more, we've had an increased need not to have on-premise servers and applications, and that's driving a lot of what we do technology-wise. Our technology stack has become so broad, and so much is in the cloud these days, that having data centers on board the ships is problematic for us," said Shwadlenak. "So, one of the reasons for the upgrade and moving towards more bandwidth is the ability to communicate dynamically and immediately with our ships. We transfer a lot of data back and forth: many of our applications are in the cloud, so it's essential to be able to access this. In most countries, we can get shore connections, but the satellite is our lifeblood. It is our mainstay, the one thing upon which we constantly rely."

"The work we do in the hospital, and for that matter everything that we do aboard, is centralized here at the ISC. Most of our servers are here or in the cloud, so this communication piece has become critical for us: and it's only because we've been able to upgrade that bandwidth that we're able to move to that technology," he added. "It was just a couple of years ago, when I joined, that we had a 3MHz pipe with the satellite. It was impossible to do some of the things we're doing today."

Essential communications

Global Mercy is due to deploy to sub-Saharan Africa -- Madagascar, Cameroon, Guinea, Senegal, and Benin -- in 2022. In addition to the vital medical support and reinforcement it will bring to these areas, it will also host mentoring and training programs for local health professionals.

Satellite connectivity is at the heart of Mercy Ships' work. It enables the smooth running of the day-to-day business, including essential communications with the sponsors and donors whose input is key to Mercy Ships' ongoing ambitions. It also allows teleconferencing applications for capacity building and remote training; image sharing; diagnostic discussions with specialists located worldwide; and all-important crew welfare connectivity for social media access and web browsing.

Dave Shwadlenak is under no illusions about the potentially disastrous effects on the company's capabilities were its connectivity to go down. "It could be catastrophic, in all honesty. We could still operate, but only for a short time. For example, staff turnover is all centralized back here at the ISC, so we would be forced to go back to some paper environment where we're trying to keep up with room registrations and so forth. Our finance department would have difficulty, while the ability for trainees to remotely watch surgeries and learn from that would be unavailable as well. So much of our work would be greatly diminished or cease altogether."

Both Dave Shwadlenak and Jonathan Dyson are unstinting in their praise for Intellian. "I've been very, very happy with them," Shwadlenak emphasizes. "Frankly, they've been marvelous in terms of working with us and helping us through many of the pitfalls we had. Especially in China, it was tremendous what they did there."

"We basically needed letters of permission from the Chinese Ministry of Foreign Affairs," Dyson explains, "but travel restrictions meant that it was suddenly going to be a major problem to get people onsite for the installations. Intellian and SES worked really well together to come up with a plan. It was very much outside of process, but from a customer service perspective, it was great to just be able to lean on them and have a team from Shanghai who came down to build the system, to put everything together and check it, and make sure that all the hardware was there when we couldn't get anybody there. That was just fantastic."

"I've worked with many different service providers," Dyson adds, "to figure out costs of bandwidth per MB per month, because we need to be financially responsible towards our donors. So I've spoken to a broad range of vendors, and all of them recommended Intellian hardware. All of them. That speaks volumes about the quality of the equipment, the hardware, the products that Intellian provides. Through a fiscally responsible investment in Intellian products, this partnership allows us to fulfill our mission and provide quality access to safe surgeries."

"I've appreciated everybody that I've worked with from Intellian; they're all very professional. The partnership between SES and Intellian appears from our standpoint to be very strong, so if we reach out to SES, we know it will be handled by Intellian, and it's transparent to us as the customer, as it should be. Intellian's customer service has been seamless from our perspective," he concluded.

This message is sponsored by Intellian.

The opinions expressed herein are the author's and not necessarily those of The Maritime Executive.


GM Joey Antonio still the king of online chess tourney – PhilBoxing.com

By Marlon Bernardino, PhilBoxing.com, Mon, 25 Oct 2021

MANILA---Grandmaster Rogelio "Joey" Antonio Jr. showed why he is the Filipino Ironman of online chess as he topped the Coach Robert Racasa birthday chess tournament, held virtually on the Lichess.org platform on Sunday night, October 24, 2021.

The 59-year-old Antonio finished clear first with the highest score of 101 arena points to rule the 155-player bullet event.

National Master Eric Labog Jr. was second with 88 points, followed by National Master Joey Florendo in third place with 77 points. National Master Carlos Edgardo Garma and Francis Talaboc took fourth and fifth places with 70 and 68 points, respectively.

Adjudged category winners were National Master Jonathan Tan (Top Senior) and National Master Almario Marlon Quiroz Bernardino Jr. (Top Media). -Marlon Bernardino-


Alibaba launches new server chip to boost its cloud business in challenge to Amazon and Microsoft – CNBC

Signage at the Alibaba Group Holdings Ltd. headquarters in Hangzhou, China, on Wednesday, March 24, 2021.

Qilai Shen | Bloomberg | Getty Images

GUANGZHOU, China -- Chinese e-commerce giant Alibaba launched a new server chip on Tuesday, as it looks to boost its cloud computing business and compete against U.S. rivals like Amazon.

The processor, called Yitian 710, will go into new servers called Panjiu.

The chip and servers will not be sold directly to customers. Instead, Alibaba's cloud computing clients will buy services based on this latest technology. The servers are designed for artificial intelligence applications and storage.

The company did not say when the services based on the latest chip and server will be available for customers.

Alibaba will not be manufacturing the semiconductor but will be designing it instead.

That's a trend among Chinese companies: Huawei designed its own smartphone chips, and Baidu raised money this year for a standalone semiconductor business. U.S. cloud computing rivals, including Google and Amazon, have done the same.


19 Cloud Computing Statistics That Will Keep You Awake at Night – Hashed Out by The SSL Store

Cloud adoption rates are increasing by the day as more businesses take advantage of everything the cloud has to offer. But along with the benefits the cloud offers, it also brings increasing security risks. Here's a list of 19 cloud adoption and security statistics you should know in 2021 and beyond.

As kids, it's fun to lie outside with family or friends and stare up at the sky, looking for fun and exciting shapes and characters in the clouds. But as adults, particularly those working in IT security for enterprises, we look to clouds in a different way -- both as a source of scalable storage and as a means of delivering services in our digital world. This is particularly true when it comes to increasing remote work requirements since Covid-19 first reared its ugly head nearly two years ago.

However, as you know, the cloud has its ups and downs when it comes to security and usage. That's why we've put together a list of 19 attention-grabbing cloud computing statistics covering cloud adoption, cloud security, and other informative data points.

Lets hash it out.

When it comes to cloud adoption, you must decide whether it's better for your business to use a single cloud provider or work with multiple cloud providers. Part of this decision entails choosing whether to use in-house resources (on-prem), cloud, or perhaps a hybrid approach.

There's no right or wrong answer -- it's a decision you must make based on factors that include your budget, needs, and in-house resources (including personnel).

Data from Flexera's 2021 State of the Cloud Report shows that nine in 10 enterprises are taking a multi-cloud strategy approach. Within that group, more than 80% of enterprises report having a hybrid cloud strategy in place.

Organizations that have multi-cloud strategies are less likely to be dependent upon individual vendors. Gartner's 4 Trends Impacting Cloud Adoption in 2020 report shows that by 2024, nearly two in three organizations will use more than one vendor.

Furthermore, data from a separate Gartner survey also shows that 81% of public cloud users currently use two or more cloud providers. However, their research also predicts that 50% of the public cloud market will be controlled by the 10 biggest providers by 2023.

Every day, the cloud is playing a larger role in organizations' operations and services, and businesses are increasingly turning to PKI to provide identity and security. In their 2021 State of Machine Identity Management report, Keyfactor and the Ponemon Institute report that more than half of organizations cite cloud-based services as the impetus for growth in public key infrastructure deployment and usage. This is followed by:

Cynet's 2021 Survey of CISOs with Small Cyber Security Teams shows that companies with smaller security teams are looking primarily to the cloud (57%) as a means of implementing security technologies. This top-ranking priority is followed by on-prem (21%) and hybrid (13%).

By outsourcing resource-intensive processes like managing in-house IT infrastructure, companies can free up their limited personnel to focus on other priorities.

Whether it's the cost of running servers or having the personnel in place to manage them, everything costs money -- and prices are increasing. With this in mind, let's explore some of the cloud computing-related costs for businesses.

Data from Proofpoint and the Ponemon Institute's report The Cost of Cloud Compromise and Shadow IT shows that security incidents cost businesses a pretty penny. Their survey of 662 U.S. IT and IT security professionals shows that cloud account compromises cost an average of more than $6.2 million over 12 months, and 86% of the surveyed organizations said cloud account compromises cost them at least $500,000.

To put this into context, the cost of cloud account compromises averages out to be 3.5% of their total revenues within the same period.

Cloud migration can cost a pretty penny, but not all organizations properly plan for the expenses. While organizations often plan for the direct costs associated with this process, planning for indirect costs can fall through the cracks. As such, Gartner's cloud migration cost research predicts that through 2024, almost two in three infrastructure and operations (I&O) leaders will experience public cloud migration cost overruns that take a toll on their on-prem budgets.

Flexera reports in their 2021 State of the Cloud Report that 30% of the organizations they surveyed have annual public cloud spends ranging between $2.4 million and $12 million. Another 31% of their survey respondents indicate they spent more than $12 million a year!

Based on the estimate of $12 million a year, that means these large organizations are spending an average of $32,876.71 a day on public cloud.
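
For anyone who wants to reproduce that daily figure, it is a straight 365-day average:

```python
annual_spend = 12_000_000                      # $12 million per year
print(f"${annual_spend / 365:,.2f} per day")   # -> $32,876.71
```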

$50 million -- that's more money than some business owners will ever see in their entire lifetimes. But according to research from IDC and the cloud infrastructure security company Ermetic, that number is just the annual cloud infrastructure expense for some companies: their research shows that 71% of businesses invest up to $50 million a year to expand their cloud infrastructures.

Cloud security is integral to your organization's overall security health. Let's explore some relevant cloud security statistics and data to give you a better view of what this sector of the industry looks like.

Would you believe me if I told you that nearly all organizations have experienced a cloud data breach within the previous year and a half? Data from IDC and Ermetic shows that almost all the organizations (98%) they surveyed experienced a minimum of one cloud data breach in that period. Their study involved 200 CISOs and other security decision-makers from U.S. companies.

The report indicates that this rose nearly 20% from their 2020 survey in which only 79% of respondents indicated the same.

In early 2021, Keyfactor hired the Ponemon Institute to conduct a survey of 100 North American IT security execs to better understand their zero-trust strategy priorities and where public key infrastructure fits in:

Verizon's 2021 Data Breach Investigations Report (DBIR) shows that servers lead the way as the top assets targeted in data breaches. Web application servers took the lead as the primary breach target, encompassing more than 50% of breaches; mail servers came in second, with ~25% of data breaches targeting them.

"Compromised external cloud assets were more common than on-premises assets in both incidents and breaches. Conversely, there was a decline of user devices (desktops and laptops) being compromised. This makes sense when we consider that breaches are moving toward Social and Web application vectors, such as gathering credentials and using them against cloud-based email systems." -- Verizon 2021 DBIR Executive Brief

More than 50% of the web app attack-based data breaches Verizon analyzed in their 2021 Data Breach Investigations Report (DBIR) involve mail server compromises. Of those compromised, the overwhelming majority (96%) were cloud-based mail servers.

Data from IBM and the Ponemon Institute's 2021 Cost of a Data Breach report shows that the average cost of a data breach for a hybrid cloud environment is $3.61 million. This average is lower than those of other cloud environments (i.e., on-prem, private, and public clouds). The average total cost of a data breach for businesses globally was $4.24 million.

IBM's 2021 Cost of a Data Breach report shows that cloud misconfigurations were the third most common initial attack vector in the breaches it analyzed. These types of data breaches take an average of 186 days to identify and another 65 days to contain. The price tag accompanying this attack vector? A cool $3.86 million in total costs.

Needless to say, these types of cloud security statistics underscore the importance of having strong security measures, processes, and policies in place. Having these resources in place can help you avoid "oops" situations that put your company, data, and reputation at risk.

It's no secret that security misconfigurations are a big issue for businesses. Palo Alto's Unit 42 2H 2020 research indicates that two in three security-related incidents are due to misconfigurations. That's still lower than Gartner's estimate that 99% of cloud security failures through 2025 will be the customer's fault -- pointing to the idea that misconfigurations are an even bigger issue.

Palo Alto's Unit 42 Cloud Threat Report 1H 2021 data shows that nearly one in three organizations fail to implement proper security measures in the cloud. These controls are what help protect cloud environments against attacks and mitigate vulnerabilities.

Access security measures are the locks on your doors and the walls around your castle. But data from IDC and Ermetic shows that access-related vulnerabilities cause four in five cloud data breaches. The top-ranking industries? Healthcare, Utilities, and Media.

Overall, their report also shows that larger companies are more at risk of experiencing these security issues than businesses with fewer employees. (60% of businesses with 10,000 or more employees cite access as the leading factor causing their cloud breaches.) This makes sense, considering that they have much larger attack surfaces than their smaller business counterparts.

Before we jump into the data, ask yourself: what types of data are you storing in the cloud? And what steps are you taking to ensure that data remains secure?

When it comes to cloud storage, organizations store everything from employees' and customers' sensitive information to intellectual property and trade secrets. Let's explore some key cloud computing and usage statistics relating to stored data.

Frankly, storing sensitive data in the cloud is always risky -- this is why companies that choose to do so need to take extra precautions in terms of how they secure that data. According to the SANS 2021 Cloud Security Survey data, from a presentation sponsored by Blue Hexagon, the top three types of sensitive or regulated data that companies store include:

Flexera's 2021 State of the Cloud Report data shows that almost half of enterprises' workloads (47%) and data (44%) are currently stored in public cloud environments. These organizations aim to add another 8% and 7%, respectively, to the cloud within the following 12 months.

Compare this to SMB respondents, who say they have 64% of workloads and 59% of data in the public cloud now, and that they plan to add another 5% of workloads and 8% of data within 12 months.

It's obvious you're in a rush if you've skipped right to this section. As such, we've put together a brief highlights list of the top five cloud computing statistics to note from the list above:

As always, feel free to leave a comment and share your most notable cloud security statistics and cloud computing statistics below.


The Army’s Enterprise Cloud Management Office looks to deliver computing resources around the world – Federal News Network

As cloud migration choices are examined and strategies perfected in the greater National Capital Region, one of the many federal government offices where important decisions are being executed is that of the Army CIO.

The principal adviser assisting the Army CIO in the development of strategy, and in the use and optimization of cloud resources, is Paul Puckett, who was appointed to the Senior Executive Service in 2019 and assumed the role of director of the Enterprise Cloud Management Office.

"I'm able to influence policy and governance, but then I'm also able to strategize and then implement capabilities and deliver them for the United States Army," Puckett said on Federal Monthly Insights -- Cloud Migration Strategy and Cloud FinOps.

When Puckett arrived at the office of the Army CIO, he was greeted by a multi-cloud Army. He found that it was not as coherent as it could be to serve the greater needs of the Army.

"So what we did is we delivered a capability that we call cArmy, which is really the foundational security services, if you will, that allow us to adopt commercial cloud services, meeting all DoD policy and governance in security," Puckett said on Federal Drive with Tom Temin.

"But more importantly, what it means is we've already prepared an environment and ecosystem so system owners, as they come to the front door, aren't worried about what their security posture might look like," Puckett said. "We've already addressed that foundation. What we're now trying to account for is: how is your application or system designed? Who is your customer? What data sets are you either consuming or creating? And how can we start to expose your services and data to the rest of the Army and start to deliver this greater enterprise ecosystem?"

In 2022, Puckett's vision is to extend cArmy into the OCONUS (Outside the Continental United States) regions, from both an infrastructure (compute, storage, networking) perspective and a common services perspective.

"So we're adopting a very similar regionally distributed model that any hyperscaler would, because we want to be mindful of physics, we want to be mindful of our customers and our data sets," Puckett said. "You have to be mindful of latency. So it's not just the availability of the connectivity, but also the speed of light is still a bound thing. And so we want to make sure that we're being thoughtful of the customer experience as we deliver capabilities globally."


Profitable Investment: Top Cloud Computing Stocks You Must Know This October – Analytics Insight

Cloud computing stocks have been in high demand since the outbreak of the COVID-19 pandemic accelerated digitalization, and investors have started keeping an eye on them alongside tech stocks and the cryptocurrency market. The global cloud computing market is expected to hit US$791.48 billion in 2028, growing at a CAGR of 17.9%. Multiple tech companies are leveraging cloud computing, making it a profitable investment for the tech-driven future. Let's explore some of the top cloud stocks suitable for investment this October.

Market cap: US$38.80 billion

The Trade Desk, one of the top cloud computing stocks for investors in October, is a tech company that operates across the world. It runs a self-service cloud-based platform that allows buyers to create and optimize data-driven digital advertising campaigns across multiple ad formats and channels, along with data and other value-added services for advertising agencies and other service providers for advertisers.

Market cap: US$263.25 billion

Oracle Corporation is highly popular in the tech industry as a company that provides Oracle Cloud software-as-a-service offerings. These include cloud applications such as Oracle Fusion Cloud enterprise resource planning, enterprise performance management, supply chain and manufacturing management, human capital management, and advertising and customer experience, as well as the NetSuite applications suite. There are also cloud-based solutions tailored to different industries.

Market cap: US$2.33 trillion

Microsoft Corporation is one of the best-known tech companies in the world, with top-notch cloud computing services. It serves its global customer base through multiple segments: productivity and business processes, intelligent cloud, and more personal computing. Microsoft Azure is one of the top cloud platforms for all kinds of businesses in 2021, alongside Windows cloud services. Microsoft sells its cloud products through OEMs, distributors, online stores, and many more channels.

Market cap: US$49.99 billion

Veeva Systems is a popular tech company that provides cloud-based software for the life sciences industry in different parts of the world. Its wide range of cloud products and services makes it one of the top cloud computing stocks to buy in October: the Veeva Commercial Cloud includes Veeva Data Cloud, Veeva CRM Engage, Veeva CLM, and many more. The company also offers professional and support services, technical consulting services, and ongoing managed services.

Market cap: US$41.56 billion

Unity Software is a well-known tech company that works with cloud computing and is one of the top cloud computing stocks for investors in October. It operates a real-time 3D development platform that provides software solutions for smart devices such as mobile phones, PCs, and consoles. It offers its products and services directly through its online store and through field sales operations in different parts of the world, including the UK, Finland, Denmark, Japan, China, and Germany.

Market cap: US$10.24 billion

DigitalOcean Holdings, Inc. is one of the popular cloud computing stocks and operates a cloud computing platform across the world. The company provides on-demand infrastructure and platform tools for clients such as start-ups and small and medium businesses. The platform also offers infrastructure solutions while extending the native capabilities of the cloud with fully managed applications, and it serves multiple industry verticals such as web and mobile applications, website hosting, and personal web projects.


Cloud Computing in Automotive Market 2021 Growing Demand and Precise Outlook- Amazon Web Services, Microsoft Azure, and Google Cloud Platform Puck77…

The Cloud Computing in Automotive Market research report is a new statistical data source added by Adroit Market Research. The Cloud Computing in Automotive market is growing at a high CAGR during the forecast period 2021-2027, and increasing interest in this industry is the major reason for the market's expansion.

The latest report on the global Cloud Computing in Automotive market suggests a positive growth rate in the coming years. Analysts have studied historical data and compared it with the current market scenario to determine the trajectory this market will take. The investigative approach is aimed at giving readers a holistic view of the global Cloud Computing in Automotive market, and the report includes an executive summary along with the definition and scope of the market.

The report also provides regional market analysis with production, sales, trade, and regional forecasts, as well as a market investment plan covering product features, price trend analysis, channel features, purchasing features, regional and industry investment opportunities, cost and revenue calculations, economic performance evaluation, and more.

This report strategically examines the micro-markets and sheds light on the impact of technology upgrades on the performance of the Cloud Computing in Automotive market. It presents a broad assessment of the market and contains careful insights, historical data, and statistically supported, industry-validated market data. It offers market projections based on appropriate assumptions and methodologies, with information organized by market segments such as geographies, products, technologies, applications, and industries.

Top Leading Key Players are: Amazon Web Services, Microsoft Azure, and Google Cloud Platform

Researchers also carry out a comprehensive analysis of the recent regulatory changes and their impact on the competitive landscape of the industry. The research assesses the recent progress in the competitive landscape including collaborations, joint ventures, product launches, acquisitions, and mergers, as well as investments in the sector for research and development.

The report includes research studies of current trends in different sectors, scoped accordingly. Its analysts focus on the static and dynamic pillars of the industries to provide a basic understanding of the strategies at work. In addition, the report identifies drivers of and opportunities for business development, and it examines restraints to analyze the issues with existing business strategies. It covers aspects such as application areas, platforms, and the leading players operating across the globe.

The Cloud Computing in Automotive analysis also includes precise market shares, with overall percentage shares and breakdowns. Primary and secondary sources are used to research and analyze the industry. In addition, the study uses SWOT analysis to provide an in-depth view of the market's strengths, weaknesses, opportunities, and threats. A detailed survey of the world's leading manufacturers is also included, focused on the industry's various priorities, including consumer profiles, supply quantity, product definition, critical raw materials, and financial structure. The report is investigated and evaluated after a detailed background check and focuses on market segmentation, regional segmentation, market dynamics, market growth drivers, and a comprehensive analysis of the competitive landscape.

Scope of the study:

The research on the Cloud Computing in Automotive market focuses on mining out valuable data on investment pockets, growth opportunities, and major market vendors to help clients understand their competitors' methodologies. The research also segments the Cloud Computing in Automotive market on the basis of end user, product type, application, and demography for the forecast period 2020-2027. Comprehensive analysis of critical aspects such as impacting factors and the competitive landscape is showcased with the help of vital resources, such as charts, tables, and infographics.

Global Cloud Computing in Automotive market is segmented based by type, application and region.

Based on Type, the market has been segmented into:

Based on application, the market has been segmented into:

The ongoing status of the global Cloud Computing in Automotive market, with current market updates at regional levels

Understanding of global marketplace development

A study of the market's attractiveness in terms of product sales

Competitive analysis is specified for eminent players, price structures, and value of production.

Various stakeholders in this industry, including research and consulting firms, investors for new entrants, and financial analysts, product manufacturers, distributors, and suppliers are listed.

Reasons to Buy:

Save and reduce time carrying out entry-level research by identifying the growth, size, leading players, and segments in the Cloud Computing in Automotive market

Highlight key business priorities to assist companies in realigning their business strategies.

The key findings and recommendations highlight crucial progressive industry trends in the Cloud Computing in Automotive market, thereby allowing players to develop effective long-term strategies.

Develop or modify business expansion plans using the substantial growth on offer in developed and emerging markets.

Scrutinize in-depth global market trends and outlook coupled with the factors driving the market, as well as those hindering it.

Enhance the decision-making process by understanding the strategies that underpin commercial interest with respect to products, segmentation, and industry verticals.

About Us

Adroit Market Research is an India-based business analytics and consulting company incorporated in 2018. Our target audience is a wide range of corporations, manufacturing companies, product/technology development institutions and industry associations that require understanding of a market's size, key trends, participants and the future outlook of an industry. We intend to become our clients' knowledge partner and provide them with valuable market insights to help create opportunities that increase their revenues. We follow a code: Explore, Learn and Transform. At our core, we are curious people who love to identify and understand industry patterns, create an insightful study around our findings and churn out money-making roadmaps.

Contact Us:

Ryan Johnson

Account Manager Global

3131 McKinney Ave Ste 600, Dallas,

TX 75204, U.S.A.

Phone No.: USA: +1 210-667-2421/ +91 9665341414


The PC slowdown shouldn’t hurt Microsoft earnings, and here’s why – MarketWatch

The slowdown in personal computer sales due to supply-chain issues in recent months would have hurt Microsoft Corp. in past years, but the company's pivot to cloud computing and cloud software should insulate it from any earnings fallout.

Microsoft MSFT is scheduled to report its fiscal first-quarter earnings on Tuesday afternoon, as it rolls out its new Windows 11 operating system and PC makers struggle to deliver new machines. While the Microsoft of Bill Gates and Steve Ballmer would have faced a lot of Wall Street pessimism if PC shipments were mangled and a new operating system was not quickly adopted, Satya Nadella's Microsoft should be just fine.

That is because analysts and investors are mostly focused on Azure, Microsoft's cloud-computing answer to Amazon.com Inc.'s Amazon Web Services, as well as cloud-software offerings, decreasing the importance of Microsoft's PC business.

"Sustained digital transformation momentum should offset the impact from mixed PC unit shipment estimates from IDC and Gartner," Morgan Stanley analysts wrote in a preview of the report, later adding, "While our negative growth outlook for Windows OEM pressures our longer term earnings expectation for Microsoft, we also note Windows OEM overall represents a decreasing mix of overall Microsoft revenue and gross profit."

Read: Why Amazon and Microsoft won't have a stranglehold on cloud computing forever

Azure has made sure that Windows' importance to Microsoft has decreased. The fast-growing cloud business is at the top of every analyst note about Microsoft, and analysts expect revenue to grow in the mid-40% range. (Microsoft does not disclose Azure performance except for percentage gain, despite AWS and Google Cloud providing full revenue and operating profits for their competing services.)

Fundamentally, ramping contribution from previously signed long-term Azure deals, continued Cloud migrations post-COVID, Microsofts intensifying focus on Cloud verticalization and strong Microsoft 365 seat growth can sustain durable longer-term Azure growth, the Morgan Stanley analysts wrote.

There are factors that could add to Microsoft's growth as well, especially in the forecast. The $19.7 billion acquisition of the health-care-focused company Nuance is expected to close before the end of the calendar year, and Microsoft recently disclosed that Nuance's cloud-based revenue would flow into the same revenue bucket as Azure.

The return of JEDI: Why the sequel to the military's cloud contract could cost much more than the $10 billion original

While Microsoft did not disclose exactly how much that would mean, UBS analysts said in September that prior Nuance disclosures and a call they had with the company's investor relations team led them to estimate that about 46% of Nuance's revenue would be cloud-based. They estimated that would mean roughly $91 million in additional sales for Microsoft's cloud division in the fiscal second quarter, if the full quarter were to be included.

Another bump could be coming in the future from increased prices for Microsoft's most popular cloud software offering, Office 365. Microsoft is raising prices more than 10% across the board for the product, in what the company described as "the first substantive pricing update since we launched Office 365 a decade ago" -- which also gives analysts confidence that Microsoft can withstand any supply-chain pressures on the PC market.

Earnings: Analysts on average expect Microsoft to report earnings of $2.08 a share, up from $1.82 a share a year ago. Contributors to Estimize -- a crowdsourcing platform that gathers estimates from Wall Street analysts as well as buy-side analysts, fund managers, company executives, academics and others -- predict earnings of $2.22 a share.

Revenue: Analysts on average were modeling sales of $43.93 billion, which would be an improvement from $37.15 billion a year ago, after Microsoft forecast revenue of $43.3 billion to $44.2 billion. Estimize contributors expect $44.88 billion in sales.

Analysts expect $16.52 billion in sales from the Intelligent Cloud segment, after Microsoft guided for $16.4 billion to $16.65 billion; $14.67 billion in sales from the cloud-software-focused Productivity and Business Processes segment, after a forecast of $14.5 billion to $14.75 billion; and $12.72 billion from More Personal Computing, after guidance for sales of $12.4 billion to $12.8 billion.

Stock movement: Microsoft shares have declined in the session following earnings releases in four of the past five quarters, though the last decline was only 0.1%. The stock has increased 8.1% in the past three months and 45.2% in the past year, while the S&P 500 index SPX has grown 4.1% and 31.6% in those periods, respectively.

Analysts are in pretty universal agreement about Microsoft's current position. According to FactSet tracking, 33 out of 36 analysts rate the stock the equivalent of a buy, while the other three rate it as a hold.

"Currently trading at ~27x our CY23 GAAP EPS estimates, Microsoft represents a rare combination of strong secular positioning and reasonable valuation within the software space," wrote the Morgan Stanley analysts, who rate the shares overweight with a price target of $331.

The one concern seems to be the durability of the current growth trajectory, which is why the Nuance acquisition and the increased pricing of Office 365 are seen as key to the stock continuing to rise.

"Comps get progressively tougher throughout FY22, which should be met by Microsoft's durable growth portfolio of Azure/Security/Teams," wrote Jefferies analysts, who have an outperform rating and recently raised their price target to $375 from $345. "Key items to watch are elevated expectations (Azure high 40s reported), integration with Nuance and increased security investments."

In-depth: The tech earnings boom is fizzling out, as Apple and Amazon face the same issues as everyone else

Microsoft has benefitted from the pandemic, as companies have relied on cloud-computing power and software to keep teams connected while working remotely. But Microsoft bull and Wedbush analyst Daniel Ives does not see a return to the office as a sign that the boom will end.

"We believe the Street's view of moderating cloud growth on the other side of this WFH cycle is contrary to the deal activity Microsoft is seeing in the field," Ives, who has an outperform rating and a $375 price target, wrote in a preview of the report. "While we have seen the momentum of this backdrop in the last few years, we believe deal flow looks incrementally strong (Office 365/Azure combo deals in particular) heading into FY22, as we estimate that Microsoft is still only ~35% through penetrating its unparalleled installed base on the cloud transition."

Stifel analysts, with a buy rating and $325 price target, concurred.

"We continue to believe that the pandemic is forcing organizations to accelerate the pace of their cloud migrations and that Microsoft remains a key beneficiary of this modernization spend, especially around large new deal momentum, as its broad stack enables it to capture Tier 1 workloads previously out of reach," they wrote.

The average price target on Microsoft stock as of Friday afternoon was $335.47, roughly 8.5% higher than the going rate.


NASA Turns to the Cloud for Help With Deluge of Data From Next-Generation Earth Missions – SciTechDaily

The state-of-the-art Earth science satellites launching in the near future will be generating unprecedented quantities of data on our planet's vital signs. Cloud computing will help researchers make the most of those troves of information. Credit: NASA Earth Observatory

As satellites collect larger and larger amounts of data, engineers and researchers are implementing solutions to manage these huge increases.

The cutting-edge Earth science satellites launching in the next couple of years will give more detailed views of our planet than ever before. We'll be able to track small-scale ocean features like coastal currents that move nutrients vital to marine food webs, monitor how much fresh water flows through lakes and rivers, and spot movement in Earth's surface of less than half an inch (a centimeter). But these satellites will also produce a deluge of data that has engineers and scientists setting up systems in the cloud capable of processing, storing, and analyzing all of that digital information.

"About five or six years ago, there was a realization that future Earth missions were going to be generating a huge volume of data and that the systems we were using would become inadequate very quickly," said Suresh Vannan, manager of the Physical Oceanography Distributed Active Archive Center, based at NASA's Jet Propulsion Laboratory in Southern California.

Part of the SWOT satellite's science instrument payload sits in a clean room at NASA's Jet Propulsion Laboratory during assembly. By measuring the height of the water in the planet's ocean, lakes, and rivers, researchers can track the volume and location of the finite resource around the world. Credit: NASA/JPL-Caltech

The center is one of several under NASA's Earth Science Data Systems program responsible for processing, archiving, documenting, and distributing data from the agency's Earth-observing satellites and field projects. The program has been working for several years on a solution to the information-volume challenge by moving its data and data-handling systems from local servers to the cloud -- software and computing services that run on the internet instead of locally on someone's machine.

The Sentinel-6 Michael Freilich satellite, part of the U.S.-European Sentinel-6/Jason-CS (Continuity of Service) mission, is the first NASA satellite to utilize this cloud system, although the amount of data the spacecraft sends back isn't as large as the data many future satellites will return.

Part of the NISAR satellite rests in a thermal vacuum chamber at NASA's Jet Propulsion Laboratory in August 2020. The Earth satellite will track subtle changes in the planet's surface as small as 0.4 inches. Credit: NASA/JPL-Caltech

Two of those forthcoming missions, SWOT and NISAR, will together produce roughly 100 terabytes of data a day. One terabyte is about 1,000 gigabytes -- enough digital storage for approximately 250 feature-length movies. SWOT, short for Surface Water and Ocean Topography, will produce about 20 terabytes of science data a day, while the NISAR (NASA-Indian Space Research Organisation Synthetic Aperture Radar) mission will generate roughly 80 terabytes daily. Data from SWOT will be archived with the Physical Oceanography Distributed Active Archive Center, while data from NISAR will be handled by the Alaska Satellite Facility Distributed Active Archive Center. NASA's current Earth science data archive is around 40 petabytes (1 petabyte is 1,000 terabytes), but by 2025 -- a couple of years after SWOT and NISAR are launched -- the archive is expected to hold more than 245 petabytes of data.
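
That growth is easy to sanity-check: at the stated rates, SWOT and NISAR alone add tens of petabytes per year. A rough figure that ignores processing overhead and every other mission feeding the archive:

```python
swot_tb_per_day, nisar_tb_per_day = 20, 80
pb_per_year = (swot_tb_per_day + nisar_tb_per_day) * 365 / 1000
print(f"~{pb_per_year:.1f} PB per year")   # ~36.5 PB from these two missions alone
```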

Both NISAR and SWOT will use radar-based instruments to gather information. Targeting a 2023 launch, NISAR will monitor the planet's surface, collecting data on environmental characteristics including shifts in the land associated with earthquakes and volcanic eruptions, changes to Earth's ice sheets and glaciers, and fluctuations in agricultural activities, wetlands, and the size of forests.

Explore this 3D model of the SWOT satellite by zooming in and out, or clicking and dragging the image around. Credit: NASA/JPL-Caltech

Set for a 2022 launch, SWOT will monitor the height of the planet's surface water, both ocean and freshwater, and will help researchers compile the first survey of the world's fresh water and small-scale ocean currents. SWOT is being jointly developed by NASA and the French space agency, the Centre National d'Études Spatiales.

"This is a new era for Earth observation missions, and the huge amount of data they will generate requires a new era for data handling," said Kevin Murphy, chief science data officer for NASA's Science Mission Directorate. "NASA is not just working across the agency to facilitate efficient access to a common cloud infrastructure, we're also training the science community to access, analyze, and use that data."

Currently, Earth science satellites send data back to ground stations, where engineers turn the raw information from ones and zeroes into measurements that people can use and understand. Processing the raw data increases the file size, but for older missions that send back relatively smaller amounts of information, this isn't a huge problem. The measurements are then sent to a data archive that keeps the information on servers. In general, when a researcher wants to use a dataset, they log on to a website, download the data they want, and then work with it on their machine.

However, with missions like SWOT and NISAR, that won't be feasible for most scientists. If someone wanted to download a day's worth of information from SWOT onto their computer, they'd need 20 laptops, each capable of storing a terabyte of data. If a researcher wanted to download four days' worth of data from NISAR, it would take about a year to perform on an average home internet connection. Working with data stored in the cloud means scientists won't have to buy huge hard drives to download the data or wait months as numerous large files download to their system. "Processing and storing high volumes of data in the cloud will enable a cost-effective, efficient approach to the study of big-data problems," said Lee-Lueng Fu, JPL project scientist for SWOT.
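
Those figures hold up on the back of an envelope. A rough check, assuming an average home connection of about 100 Mbps (the article does not state the speed it used):

```python
nisar_tb_per_day = 80                                # NISAR science data output
days_of_data = 4
bits = nisar_tb_per_day * days_of_data * 1e12 * 8    # terabytes -> bits
link_bps = 100e6                                     # assumed 100 Mbps home connection
download_days = bits / link_bps / 86400
print(f"~{download_days:.0f} days to download")      # ~296 days -- close to a year
```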

Infrastructure limitations won't be as much of a concern, either, since organizations won't have to pay to store mind-boggling amounts of data or maintain the physical space for all those hard drives. "We just don't have the additional physical server space at JPL with enough capacity and flexibility to support both NISAR and SWOT," said Hook Hua, a JPL science data systems architect for both missions.

NASA engineers have already taken advantage of this aspect of cloud computing for a proof-of-concept product using data from Sentinel-1. The satellite is an ESA (European Space Agency) mission that also looks at changes to Earth's surface, although it uses a different type of radar instrument than the ones NISAR will use. Working with Sentinel-1 data in the cloud, engineers produced a colorized map showing the change in Earth's surface from more vegetated areas to deserts. "It took a week of constant computing in the cloud, using the equivalent of thousands of machines," said Paul Rosen, JPL project scientist for NISAR. "If you tried to do this outside the cloud, you'd have had to buy all those thousands of machines."

Cloud computing won't replace all of the ways in which researchers work with science datasets, but at least for Earth science, it's certainly gaining ground, said Alex Gardner, a NISAR science team member at JPL who studies glaciers and sea level rise. He envisions that in the near future most of his analyses will happen elsewhere instead of on his laptop or personal server. "I fully expect in five to 10 years, I won't have much of a hard drive on my computer and I will be exploring the new firehose of data in the cloud," he said.
