
phoenixNAP and MemVerge to Enable Memory Virtualization in Bare Metal Cloud – HPCwire

PHOENIX, Ariz., Dec. 20, 2021: phoenixNAP, a global IT services provider offering security-focused cloud infrastructure, dedicated servers, colocation, and specialized Infrastructure-as-a-Service (IaaS) technology solutions, today announced a collaboration with MemVerge, a pioneer of Big Memory Computing and Big Memory Cloud technology. The two companies are working together to enable simplified deployments of MemVerge Memory Machine on phoenixNAP's Bare Metal Cloud and to provide a robust infrastructure solution for Big Memory workloads.

With big data volumes growing at an unprecedented pace, demand for memory-optimized compute is accelerating. Organizations increasingly face the challenge of deploying efficient memory resources to support real-time analytics and long-running data workloads. DRAM scaling requires significant investment, while server configurations are often limited in their capacity to support such deployments.

As the industry's first software to virtualize memory hardware, MemVerge Memory Machine offers an alternative way to deploy and scale memory technology. With a single Memory Machine virtualization layer, organizations can power their applications with persistent memory that delivers DRAM-like capabilities and performance at an affordable price point. In addition, the software takes application snapshots in DRAM and persistent memory, enabling higher availability and mobility.

Deployed on phoenixNAP's Bare Metal Cloud platform, MemVerge Memory Machine will provide a comprehensive infrastructure solution for Big Memory data processing. Bare Metal Cloud relies on powerful hardware to provide advanced configurations for Big Memory data processing, ensuring consistent performance with cloud-like flexibility. As an automation-driven platform, Bare Metal Cloud can be deployed in minutes and managed easily using its API, CLI, and Infrastructure-as-Code integrations.

"Through our collaboration with MemVerge, we are able to address an emerging need for memory-optimized server solutions," said Ian McClarty, President of phoenixNAP.

"MemVerge Memory Machine is taking an innovative approach to memory technology, providing a simplified solution for efficient memory scaling and management. Coupled with the performance capabilities of our Bare Metal Cloud, their software provides optimized compute for Big Memory workloads. Organizations handling data-hungry applications that need to be processed and analyzed fast can leverage this platform to streamline their projects and applications while simplifying infrastructure management tasks."

"Hardware limitations are a common challenge for advanced memory deployments, and this is the issue that phoenixNAP's Bare Metal Cloud successfully addresses," said Jonathan Jiang, COO at MemVerge. "It offers dozens of configurations that provide a robust foundation for MemVerge Memory Machine implementations to process Big and Fast data. We are excited to work with phoenixNAP and leverage Bare Metal Cloud to demonstrate the potential of our memory virtualization technology, as well as to address emerging use cases for Big Memory optimization."

By providing direct access to CPU and RAM resources, Bare Metal Cloud helps organizations ensure consistent performance even for demanding workloads and applications. At the same time, the platform's DevOps integrations, flexible billing models, and integration features ensure simplified server provisioning, scaling, and management. Automation-focused organizations can leverage it to streamline their CI/CD pipelines, access burst resources easily, and support global deployments.

Bare Metal Cloud comes with 15 TB of free bandwidth (5 TB in Singapore) and flexible bandwidth packages for more advanced needs. The platform also provides easy access to S3-compatible object storage, phoenixNAP's global DDoS-protected network, and strategic global locations.

To learn more about phoenixNAP's API-driven bare metal servers, visit the Bare Metal Cloud page. For customized options, view its dedicated server configurations.


About MemVerge

MemVerge is pioneering Big Memory Computing and Big Memory Cloud technology for the memory-centric and multi-cloud future. MemVerge Memory Machine is the industry's first software to virtualize memory hardware for fine-grained provisioning of capacity, performance, availability, and mobility. On top of the transparent memory service, Memory Machine provides another industry first, ZeroIO in-memory snapshots, which can encapsulate terabytes of application state within seconds and enable data management at the speed of memory. The breakthrough capabilities of Big Memory Computing and Big Memory Cloud technology are opening the door to cloud agility and flexibility for thousands of Big Memory applications. To learn more about MemVerge, visit http://www.memverge.com.

About phoenixNAP

phoenixNAP is a global IT services provider with a focus on cyber security and compliance-readiness, whose progressive Infrastructure-as-a-Service solutions are delivered from strategic edge locations worldwide. Its cloud, dedicated servers, hardware leasing, and colocation options are built to meet ever-evolving IT business requirements. Providing comprehensive disaster recovery solutions, a DDoS-protected global network, and hybrid IT deployments with software- and hardware-based security, phoenixNAP fully supports its clients' business continuity planning. Offering scalable and resilient OpEx solutions with expert staff to assist, phoenixNAP supports growth and innovation in businesses of any size, enabling their digital transformation.

Source: phoenixNAP, MemVerge


How the Cloud Helps With Medical Research and Remote Medicine – Business Insider

The cloud has had a major impact on data-driven medical research, enabling breakthroughs that otherwise would have taken substantially longer to happen. Such is the case with the massive, orchestrated effort that went into the development of COVID-19 vaccines.

Using cloud computing and artificial intelligence (AI), researchers developed the vaccines in less than a year, and the effort required collaboration by various entities in the private and public sectors: pharmaceutical companies, hospitals, non-profit organizations, and government agencies. The monumental undertaking involved sharing large volumes of data as new discoveries occurred.

The development of these vaccines certainly is a major achievement for the pharmaceutical field. But there are several other examples of how cloud computing supports advancements in medicine, such as wearable devices that connect doctors and patients, storage of medical records, and remote surgery.

What makes the cloud so attractive to medical researchers comes down to the same characteristics that make it valuable in other fields: elasticity, scalability, and the capacity to handle massive data volumes.

"One of the incredible powers of the cloud is that ability to scale up quickly," said Adam Glick, senior director of portfolio marketing for APEX Cloud Services at Dell Technologies. "Processing large amounts of drug discovery and trial data more quickly helps get lifesaving medications to people that need them faster. Imagine that you are in phase 2 trials for a new treatment, or you're in a much earlier stage doing drug discovery, and you want to analyze the data you're collecting. The ability to get data analysis in minutes as opposed to days can radically change the speed of drug discovery and approval, which ultimately means saving more lives."

Without access to a cloud infrastructure, Glick added, the time and financial requirements to procure and set up the environment to conduct data-driven research are much higher. And once the project is completed, many of the servers and much of the infrastructure used in the research may sit idle, no longer needed.

But with the cloud, "you can scale up your resources quickly and then you can process the data much faster," Glick said. This translates to faster development of life-saving drugs and treatments.

The cloud also plays a role in connected medical devices. Currently, 10 to 15 connected devices are used at each hospital bed. The global market for connected medical devices is expected to reach $158 billion in 2022, up from $41 billion in 2017.

Remote devices such as blood pressure, glucose, and heart monitors stay connected with clinics and physician offices, maintaining a continuous flow of data that helps enhance patient care. In some cases, timely data transmission can limit damage to a patient and even prevent death. If a device detects a problem with a patient, it can send an alert to dispatch an ambulance. In stroke and heart attack situations, a quick response can help minimize the impact on a patient.

Data transmitted from medical devices increasingly leverages edge networks, which place computing and analytics close to data sources and users to enable real-time decisions. But data that isn't used for real-time responses is stored in the cloud, where it can later be useful for research leading to new treatment methods and the development of therapeutic drugs.

Whether supporting operating rooms, wearable medical devices, or lab workers involved in critical research, the cloud already has proven critical to healthcare.

COVID-19 vaccines illustrate just how important the cloud can be, but as technologies and AI evolve to work together with the cloud, the list of possibilities of what medical researchers can accomplish is growing by the day.

Find out how APEX Cloud Services can help your R&D efforts.

This post was created by Insider Studios with Dell Technologies APEX.


Contributed | The role of the Cloud in digital transformation – DIGIT.FYI

The pandemic has undoubtedly super-charged digital transformation strategies, leading many organisations to accelerate their migration to the cloud or modernise existing cloud-based applications to keep pace with their competition.

However, cloud migration itself should never be the goal; it's essential to identify your business goals, not just your technology goals, and how cloud migration and modernisation will help you to achieve them alongside broader cultural and process changes.

This integration drives greater agility, encourages the adoption of new processes, and fosters innovation. With 2022 approaching at pace, organisations can no longer ignore the cloud; they must treat it as the key to enabling digital transformation.

Skill up or lose out

Making sure your business is fit for the future, from both a personnel and a technical perspective, is crucial to success. The alternative, after two extremely challenging years and the potential of ongoing uncertainty next year, is unnerving.

We are all too aware that the IT skills gap is an ever-growing chasm, widened recently by the buoyant tech job market: cloud-native skills top the hiring lists of nearly half of hiring managers.

Upskilling your existing teams means you can bring staff with a deep understanding of your business on the digital transformation journey with you. Cloud empowers your teams, providing them with a toolkit that allows them to build amazing products quickly, reliably, and cost-effectively.

Digital transformation is also a skills transformation, and ultimately this provides staff with the power to innovate quickly rather than focusing on the undifferentiated heavy lifting of managing racks and servers or database admin that a cloud provider can do.

Don't treat the cloud like another data centre

It's important not to see the cloud as just a new data centre, replicating old processes and silos in your new cloud environment.

In the cloud, everything is programmable: you can get new storage and compute almost instantly, automate your infrastructure and network setup, automate your release processes, and access an amazing toolbox for achieving all this, with products and services built for easy implementation by cloud providers.

You should also consider how your teams can benefit from new ways to build and operate your digital products and accelerate innovation. For example, serverless lets you develop without having to worry about the underlying infrastructure. And as technology now extends into every aspect of business, it is no longer entirely under the purview of the IT team.

This means that old silos between teams need to come down, and a more collaborative approach should be taken.

A great example of this is how Developers, Security, and Operations have started to work much more closely together. Proper adoption of DevOps can really help improve time-to-market, gather customer feedback faster, and ship new features more often and more reliably.

Transform your business perspective

Both of the preceding issues demonstrate that digital transformation with cloud is not just an IT project. Cloud is a new way to do business, so you also need to transform perspectives on IT and on cloud across your business, and this requires communication and a transformation of operating models.

One of the clearest examples of this is the rise of FinOps, a joined-up approach from Cloud Operations and Finance teams to help overcome some of the frictions that can sometimes arise: Finance teams need to get used to the variable cost models of cloud, and developers need to be encouraged to take greater accountability for the spend they incur.

Cloud also transforms how Product teams operate, enabling them to shift from quarterly or monthly release cadences to a more frequent, feature-driven release cadence. Ideas can be tested, feedback can be gathered and improvements made in hours or days, and released instantly, providing value to customers and therefore to the business as a whole.

Cloud can also count towards sustainability initiatives. According to research from 451 Research, migrating business applications to the public cloud could reduce energy consumption by 80% and carbon emissions by 96%, further evidence of the benefits of expediting cloud migration strategies as part of digital transformation programmes, particularly in the drive towards a Net Zero future.

Make the time to modernise

While it may feel that completing a migration programme marks "mission accomplished", it is far from the end of the journey.

Modernising applications once an organisation is in the cloud will keep them fit for purpose. For example, there are cloud-native technologies available, like off-the-shelf machine learning products, which remove the need for organisations to procure scarce and expensive skills and instead let them utilise the skills of cloud providers.

Taking advantage of the cloud as transformational for both technology and the wider business requires a holistic approach to migration. Consider skills, consider the business goals, consider how you will use cloud-native services, and above all consider how you will bring the business along with you on your transformation journey.


Cloud Security Market 2021 Expected to Achieve Considerable Growth to 2027 – mainlander.nz


The cloud security market size is valued at USD 34.5 billion in 2020 and is expected to register a 14.3% CAGR during the forecast period (2021-2027). Cloud security is one of the important aspects taken into consideration by every company that has shifted its business to the cloud. Most organizations are using multiple cloud servers and looking for a unified way to secure them, which will boost the growth of the cloud security market. Many enterprises use CASB software that integrates cloud service users and cloud applications for monitoring activity and enforcing security policies.
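Headline figures like these follow the standard compound-annual-growth-rate formula, value = base × (1 + CAGR)^years. The short sketch below projects the 2020 base forward at the stated 14.3% CAGR; the seven-year horizon to 2027 is our assumption about how the forecast period is counted, not something the report specifies.

```python
def project(base, cagr, years):
    """Compound a base value at a constant annual growth rate (CAGR)."""
    return base * (1 + cagr) ** years

# USD 34.5 billion (2020) grown at a 14.3% CAGR over seven years to 2027
value_2027 = project(34.5, 0.143, 7)
print(f"Projected 2027 market size: USD {value_2027:.1f} billion")
```

Under these assumptions the projection lands near USD 88 billion, which is at least consistent in magnitude with the growth the report describes.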

Request for Report Sample: https://www.marketstatsville.com/Cloud-Security-Market-will-reach-USD-68.5-billion-by-2025

The increasing sophistication of cyber espionage, cybercrimes, and new cyberattacks are the key factors that boost the growth of the cloud security market. Moreover, increasing data leakage and data breaches in enterprises are creating the demand for cloud security and are expected to accelerate the growth of the cloud security market.

Due to trust issues, small and large enterprises alike are hesitant to move all their data to the cloud. This factor is estimated to impede the growth of the cloud security market. On the other hand, governments are increasingly investing in cloud computing technology to support smart city projects, which raises the demand for cloud security and fuels the growth of the global market.

The impact of the COVID-19 pandemic on the cloud security market was relatively mild compared to other industries. As government and regulatory authorities directed both public and private organizations to work remotely and maintain social distancing, digital business increased. At the same time, internet penetration across the globe increased exponentially. This resulted in cloud security growth in the later stages of lockdown, as organizations sought protection from malicious attackers and hackers.

Request for Buy Full Report: https://www.marketstatsville.com/buy-now/Cloud-Security-Market-will-reach-USD-68.5-billion-by-2025

The report outlines the global cloud security market study based on service, security type, application, and region.

Cloud Security Market Regional Outlook

The global cloud security market has been segmented into five geographical regions: North America, Asia Pacific, South America, Europe, and the Middle East and Africa (MEA). North America, followed by Europe and the Asia Pacific, has the largest share in the global cloud security market, owing to the high adoption of IT security services. Further, Asia Pacific is the fastest-growing global cloud security market.

Request for Report Table of contents: https://www.marketstatsville.com/table-of-content/Cloud-Security-Market-will-reach-USD-68.5-billion-by-2025

The global cloud security market is fairly fragmented, with the presence of a large number of small players across the globe.

The cloud security market report thoroughly analyzes macro-economic factors and every segment's market attractiveness. The report will include an in-depth qualitative and quantitative assessment of the segmental/regional outlook with the market players' presence in the respective segment and region/country. The information concluded in the report includes the inputs from primary interviews.

Full Report Analysis: https://www.marketstatsville.com/Cloud-Security-Market-will-reach-USD-68.5-billion-by-2025


How Tripwire Can Be a Partner on Your Zero Trust Journey – tripwire.com

In a previous blog post, I discussed the different applications of integrity for Zero Trust and provided four use cases highlighting integrity in action. The reality is that many organizations can't realize any of this on their own. But they don't need to. They can work with a company like Tripwire as a partner on their Zero Trust journey.

Let's explore how they can do this below.

Security teams can begin their Zero Trust journeys by establishing a baseline of integrity. Infosec personnel need a trusted state of their employers' systems and information to understand the security, compliance, and operational state of their employers' assets over time. Only if they establish a single source of truth can they monitor for low-priority, routine changes as well as events that could signify a security incident. These include the addition of unrecognized binaries and the alteration of access privileges on critical files.

With this continuous monitoring capability, the integrity platform also becomes critical to successful prevention and detection within a Zero Trust environment. In that sense, integrity management doesn't just serve as the foundation for Zero Trust Architecture (ZTA). It also serves as the ultimate backstop should attackers get in, as these threat actors need to make a change to perform their malicious activity sooner or later.

Once they have an integrity-based Zero Trust program in place, organizations can then continuously revalidate the trustworthiness of systems and information using security tools such as those offered by Tripwire. They can turn to four solutions in particular. Those are security configuration assessment, policy compliance, vulnerability assessment, and integrity monitoring.

Security teams need to trust that their employers' information and data is configured to a secure baseline that aligns with policy. This can help to ensure that the Trust Policy Engine makes appropriate risk-based decisions for connection requests to different business assets. Towards that end, Tripwire Enterprise provides a combination of platforms and policies for organizations to determine how their assets are configured. This assessment of security policy is available for integration via APIs and apps connected to Tripwire Enterprise. Simultaneously, Tripwire Configuration Manager provides assessment of cloud infrastructure such as cloud accounts, storage, and SaaS solutions, thereby allowing Zero Trust to extend beyond on-premises assets.

Security teams don't just need to worry about protecting their employers' assets against digital threats. They also need to make sure they fulfill any relevant compliance obligations that cover some or all of their systems and data. Tripwire Enterprise can provide compliance assessment results to inform trust policy decision making, as well as satisfy auditor requirements. Where it can be difficult to assign a static asset scope to a compliance requirement, Zero Trust using compliance results from Tripwire can provide assurance that all entities involved in a particular system are compliant.

An important part of Zero Trust is evaluating risk, such as software vulnerabilities. Indeed, a Zero Trust policy might specify that assets with vulnerabilities providing remote privilege access should not be able to connect to specific data sets, for instance. It might also specify vulnerability score thresholds for access to specific sets of resources.

These functions emphasize the need for infosec personnel to assess their employers' infrastructure for known vulnerabilities. With that said, Tripwire IP360 provides both agent-less and agent-based vulnerability assessment across a variety of asset types including servers, workstations, network devices, containers, and cloud workloads. Those tests yield visibility into vulnerabilities affecting the operating systems and applications on those devices, and they provide results via a robust REST API that applies to both access requesters and ZTA resources such as Network Access Control (NAC) and Privileged Access Management (PAM) platforms.

Finally, security teams need to close any gaps left over from their security configuration assessments, policy compliance initiatives, and vulnerability assessments. Otherwise, an attacker could exploit undetected or unremediated vulnerabilities and abuse them to gain access to an organization's network. That's why it's not enough for security teams to implement these solutions and other solutions once and leave them alone after that. They need to bring in integrity monitoring to spot potential deviations. In the example of security configuration, for instance, that would mean establishing a baseline configuration and then monitoring that configuration for changes. This can help security teams to identify and address risk proactively before the Trust Policy Engine needs to make a decision about access. It can also help to spot changes in the configuration of the Zero Trust policy, the Trust Policy Engine, and any of the other supporting components themselves.
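The baseline-then-monitor loop described here can be sketched in miniature. This is a toy illustration of file-integrity monitoring in general, not Tripwire Enterprise code; the monitored paths, the hash choice, and the reporting format are all assumptions made for the example.

```python
import hashlib
import os

def hash_file(path):
    """Return the SHA-256 digest of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline(paths):
    """Record a trusted state: one digest per monitored file."""
    return {p: hash_file(p) for p in paths}

def detect_changes(baseline):
    """Compare the current state against the baseline and report deviations."""
    changes = []
    for path, digest in baseline.items():
        if not os.path.exists(path):
            changes.append((path, "missing"))
        elif hash_file(path) != digest:
            changes.append((path, "modified"))
    return changes
```

In practice a product layered on this idea also watches permissions, registry keys, and configuration values, and distinguishes routine changes from suspicious ones; the digest comparison above is only the core mechanism.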

Ultimately, there's no Zero Trust without integrity. Security teams need to use this realization to get Zero Trust right the first time and to continue getting it right from there.

To learn more about how Tripwire can help, download this whitepaper: https://www.tripwire.com/misc/a-tripwire-zero-trust-reference-architecture.


Top Cloud Computing Trends Shaping Our IT Landscape in 2022 – CRN – India – CRN.in

By Indrajeet Ghorpade, General Manager, Technical Services at Rahi Systems

The years 2020 and 2021 witnessed a data explosion: there was an exponential rise in the generation and storage of data as work went virtual and businesses adopted digital services. 2022 will see not only quick cloud deployments for specific applications but also a complete overhaul of enterprise systems as businesses embrace cloud migration.

As per a survey conducted by Forbes, 83% of enterprise workloads will be stored on the cloud by the end of 2021. The current shared cloud model offers less control and flexibility, but it provides ease of access and high security. 94% of businesses experienced a significant improvement in security after migrating to the cloud, according to Salesforce.

Combining the best of public and private cloud environments, the hybrid cloud will continue to be a major trend carrying over from 2021. The cloud has emerged as the most efficient data storage and application management solution, enabling organizations to focus on their core services. This article highlights the innovations and ongoing advancements in cloud computing infrastructure identified by Rahi's cloud experts.

Cloud service providers like Amazon (AWS Lambda), IBM Cloud Functions, and Microsoft Azure Functions are offering serverless cloud computing services. A relatively new concept, the serverless cloud is gaining huge traction as it provides a true pay-as-you-go service and users don't need to pay a fixed amount for storage or lease a server.

Serverless technology does not imply that servers are absent; servers are present, but users consume them without getting into the setup and technicalities. Also referred to as Functions-as-a-Service (FaaS), it can be utilized by startups and small businesses to rapidly scale their business, leveraging serverless architectural solutions with minimal capital investment.

Serverless infrastructure is infinitely scalable, and costs depend only on consumption, allowing businesses to pay for exactly the cloud services they utilize. According to Mordor Intelligence, the serverless computing market is expected to grow at a CAGR of over 23.17% during 2021-2026.
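In the FaaS model, the unit of deployment is a single function that the platform invokes once per event, provisioning, scaling, and billing it per invocation. Below is a minimal handler in the AWS Lambda style, runnable locally with no platform; the event shape and the doubling logic are illustrative assumptions, not any provider's required interface.

```python
import json

def handler(event, context=None):
    """Toy FaaS handler: doubles a number from the incoming event and
    returns an HTTP-style response, as an API-fronted function might."""
    n = event.get("number", 0)
    return {
        "statusCode": 200,
        "body": json.dumps({"result": n * 2}),
    }

# Locally we can invoke the function directly; on a serverless platform
# the provider calls it for us on each incoming event.
response = handler({"number": 21})
print(response)
```

The pay-as-you-go economics the article describes follow from this shape: since the provider runs the function only when an event arrives, there is no idle server to pay for between invocations.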

As per IDG, 92% of organizations are hosting their IT infrastructure on the cloud, and 55% are using multiple cloud systems, with 21% of organizations using three or more. This exponential growth in cloud adoption created new openings for bad actors to compromise IT systems. Cybercrime saw a steep increase of 630% in the January-to-April period alone.

The recent breach in which the email addresses of over a million GoDaddy customers were exposed shows that cloud security is critical and requires a proactive approach to fill blind spots in the system. As per AllCloud's Cloud Infrastructure Report, 28% of respondents consider security one of the most important criteria in selecting a cloud service provider. The cloud offers the flexibility and operational efficiency an organization needs to grow and scale with ease, but at the same time it opens a gateway for cybercriminals via multiple entry points.

A major trend to counter cloud-based security issues is the use of Cloud Access Security Brokers (CASBs). CASBs are security enforcement points placed between the cloud user and the cloud service provider. They can be on-premise or cloud-based security policy enforcement points, and they consolidate multiple security enforcement measures. Over 50% of organizations don't have a proper security framework for their cloud applications. Cloud adoption has taken a huge jump; cloud security must come next.
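Conceptually, a CASB is an in-line policy decision point: each request between a cloud user and a cloud service passes through a chain of policy checks before it is forwarded. The toy sketch below shows that enforcement loop; the specific policy rules and request fields are illustrative assumptions, not any vendor's feature set.

```python
def enforce(request, policies):
    """Apply each policy check to the request; block on the first failure."""
    for policy in policies:
        allowed, reason = policy(request)
        if not allowed:
            return False, reason
    return True, "allowed"

# Two example policies of the kind a CASB might consolidate:
def require_managed_device(request):
    """Only devices enrolled in management may reach the cloud service."""
    return (request.get("device_managed", False), "blocked: unmanaged device")

def limit_download_size(request):
    """Flag bulk data transfers that could indicate exfiltration."""
    return (request.get("bytes", 0) < 100_000_000, "blocked: bulk download")

decision = enforce(
    {"device_managed": True, "bytes": 4096},
    [require_managed_device, limit_download_size],
)
```

A real CASB adds identity context, activity logging, and anomaly detection on top of this gate, but the allow-or-block chain is the core idea.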

According to CDP Global, climate change will cost global businesses $1.3 trillion by 2026. The cloud significantly reduces power consumption at the user's end, as the physical IT infrastructure is handled by the colocation facility providing the cloud services. Maximum power consumption occurs due to "always-on" infrastructure, powerful computing engines, massive digital storage requirements, and the like.

Out of nine consideration factors, 80% of consumers list sustainability as the most crucial when selecting a service provider. A wider audience is emerging that wants brand ethics to be in line with the values of service providers. With 44% of CEOs planning for net-zero futures, it is now increasingly important to take advantage of highly efficient cloud operations. Adopting a public cloud reduces carbon dioxide emissions by 59 MT per year. Given this sustainability impact, the cloud will be a very important trend in 2022 and the years to come.

Tech giants around the world will focus on becoming net zero in 2022; Amazon, the largest cloud service provider, is also the biggest buyer of renewable energy for its data centers. Going green is more an environmental necessity than just a trend to follow. Businesses globally are relying on the cloud for most of their operations amid advancing digital transformation. Rahi's cloud experts are always evolving to bring the best of the cloud to your IT infrastructure.

If you have an interesting article / experience / case study to share, please get in touch with us at [emailprotected]


Medelln Campus writes the future of worldwide industrial automation – Intelligent CIO ME

Rockwell Automation has launched an ambitious plan to accompany automation in manufacturing plants, regardless of their level of digital maturity. We look at the benefits it brings.

Industries are not stopping in the race for automation and need strategic partners with enough experience to guide them on that path. At the Rockwell Automation campus in Medellín, Colombia, they work on various fronts regarding the future of industrial automation and the software that supports it.

César Arango, Engineering Manager, Rockwell Automation, said: "When I talk about the future, I refer to the Software-as-a-Service (SaaS) proposal in the cloud, that is, the possibility of hosting all of an industry's software in the cloud and not on local servers."

In addition to having the necessary programs deployed, and benefiting from the associated cost savings, workers can access various aspects of the operation process without being in the plant.

SaaS offers advantages such as multi-location, immediate connection and collaboration between multiple locations. The suite that provides these services is FactoryTalk Hub, focused on software to support an ecosystem of industrial applications. And it is in line with what companies like Microsoft, Accenture, Salesforce and PTC, among others, have done.

"88% of executives expect to optimize technology through services that are in the cloud, not by using their own servers. Services is where we start to play," said Arango.

Rockwell Automation intends that in the next five years, IT and OT (Operational Technology) converge to take advantage of the significant advantages and computing capacity that the cloud offers and, thus, give better tools to the OT world.

This convergence will open many possibilities such as providing access to the software from anywhere, sharing information internally and with clients, enabling on-demand scaling, and as a result, generating cost reduction and performance improvement.

The products they have been developing allow the client to view the status of a plant in real time, or to run analysis based on Big Data to identify deviations in any of the devices.

Arango points out that if a conveyor belt in a factory stops, the entire plant must stop. And this may happen because no one noticed that the vibration level had been changing slightly for a few days. It is something that could have been foreseen. This solution aims to optimize costs and generate fewer failures in the plants.

FactoryTalk in three stages

FactoryTalk Hub has three main chapters: design, operation and maintenance. For its development, Rockwell Automation has made a strategic alliance with Microsoft. In the Design Hub, the tools allow you to migrate from previous versions of drivers to the most up-to-date ones, and to develop emulation and design capabilities through the cloud.

For example, if you are looking to design a cookie production plant, you can simulate the entire process in the cloud through software before creating it in the physical world.

The Medellín team contributes to the Design Hub with file-type conversion tools, Logix controller updates, project analysis and digital engineering, emulation tools, digital twins and many other emerging technologies.

In the Operation Hub, Plex Systems (recently acquired by Rockwell Automation) is a leader in cloud-native smart manufacturing solutions, operating in 2,400 plants across 37 countries and processing around 8.5 billion transactions per day. Through software of the same name, it allows manufacturers to manage and automate production; connecting on-premises or edge devices to the cloud requires IoT technology.

For the Maintenance Hub, Rockwell Automation's subsidiary Fiix provides a cloud-native computerized maintenance management system (CMMS) powered by AI.

Then there is FactoryTalk Vault, launched a year ago at the Automation Fair. This application drives the management and control of industrial automation programs, allowing companies to optimize their backup costs. It combines with the design tools to keep control software versions in the cloud up to date.

About Rockwell Automation

Rockwell Automation is a provider of industrial automation and Digital Transformation solutions. Rockwell Automation's sectors include intelligent devices, software and control, and lifecycle services. The company operates in approximately 100 countries worldwide, including the United States, China, Canada, Italy, Mexico, the United Kingdom, Germany and Australia.


See original here:
Medellín Campus writes the future of worldwide industrial automation - Intelligent CIO ME

Read More..

3 Top Trends to Invest in for 2022 (and Beyond) – Motley Fool

The last two years haven't been easy or predictable for investors, but 2022 will present its own challenges. Uncertainty about interest rate increases and the new omicron coronavirus variant have triggered some volatility in the stock market recently, and it could carry into the new year.

But long-term strategies tend to negate short-term noise, and with the recent dip in some technology stocks, 2022 might be a great time to buy with a multi-year focus. To pick your stocks, it might be a good idea to focus on broad trends in high-growth industries.

Three Motley Fool contributors have identified Microsoft (NASDAQ:MSFT), Snowflake (NYSE:SNOW), and Upstart Holdings (NASDAQ:UPST) because, together, they operate in industries that will represent trillions of dollars of economic growth throughout the current decade.

Image Source: Getty Images

Anthony Di Pizio (Microsoft): Cloud computing, in the simplest of terms, is the business of accessing data and programs online using the internet, rather than having them installed locally on computers or devices. In an era of remote work, and with companies operating in dozens of different countries, the cloud makes conducting everyday operations so much easier because it effectively connects organizations internally -- no matter the location.

The popularity of this technology is evident in the numbers. By 2026, the cloud computing market is estimated to more than double to $947 billion in annual spend. The cloud services industry is dominated by a small handful of tech behemoths, and one of them is Microsoft. The average consumer probably associates the company with its Windows computer operating system, or its Office 365 software -- and why wouldn't they? These products serve billions of people globally.

But of Microsoft's three main business segments, cloud computing is in fact its largest, accounting for over 37% of total revenue in the recent fiscal first quarter of 2022. Cloud is also growing significantly faster than Microsoft's overall revenue.

Metric           Fiscal Q1 2021    Fiscal Q1 2022    Growth
Total revenue    $37.1 billion     $45.3 billion     22%
Cloud revenue    $12.9 billion     $16.9 billion     31%

Data source: Microsoft.

This trend has been apparent for quite some time at Microsoft. From fiscal 2019 to fiscal 2021, cloud revenue grew at a compound annual rate of 24% compared to 15% for overall revenue.
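The year-over-year and compound annual growth figures in this section follow from the standard formulas; here is a quick sketch using the article's own numbers (the only assumption is the formulas themselves, not any additional Microsoft data):

```python
def growth(start, end):
    """Year-over-year growth rate, as a percentage."""
    return (end / start - 1) * 100

def cagr(start, end, years):
    """Compound annual growth rate over `years` periods, as a percentage."""
    return ((end / start) ** (1 / years) - 1) * 100

# Figures from the article (fiscal Q1, in billions of dollars)
print(round(growth(37.1, 45.3)))  # total revenue: 22 (%)
print(round(growth(12.9, 16.9)))  # cloud revenue: 31 (%)
```

The same `cagr` function, applied to fiscal 2019 and fiscal 2021 cloud revenue with `years=2`, is how the 24% compound annual rate cited above would be computed.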

Microsoft's cloud business is driven by its Azure platform, which provides over 200 different products and services, some of which rely on incredibly advanced technologies like artificial intelligence and machine learning. These can be used to analyze speech and images, and even make predictions using data. But Azure also caters to high-demand services like application development, security, and the Internet of Things.

I think Microsoft is one of the best stocks to buy for exposure to the cloud. It's not just because the segment is growing so quickly, but also because investors are buying a suite of other incredible businesses. Aside from the software offerings mentioned earlier, the company owns Xbox and Surface, which are multi-billion dollar hardware brands in their own right.

This diversity could make Microsoft the ultimate play in an uncertain 2022, and beyond.


Jamie Louko (Snowflake): Companies have been producing an increasing amount of data over the past few years: 90% of the world's data has been created in just the past two years, and the amount of data that is being made today is expected to double in another two years. This rapid increase in data will result in companies needing more capabilities to analyze and process their growing amounts of data, and Snowflake is allowing them to do so.

The company offers businesses the ability to freely bring and store their data on Snowflake servers, and the companies only pay when they want to query and access their data. With businesses receiving increasing amounts of data every day, Snowflake is an easy choice because it doesn't charge to store the data. This has resulted in rapid adoption: Snowflake's third-quarter customer count grew 52% year over year to 5,416 customers.

This feature of Snowflake's business is what attracts customers, but the analytics is where Snowflake will thrive. With more data, companies will have to analyze their data more often, leading to increased interaction with Snowflake. The company has already seen success with this business model. The number of customers spending over $1 million with Snowflake increased 128% year over year to 148 customers in Q3. Additionally, customers who spent $100 one year ago are spending on average $173 today.
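The "$100 a year ago, $173 today" figure corresponds to what is usually reported as net revenue retention. The consumption model behind it can be sketched as follows; the per-query rate and query counts are invented for illustration, and Snowflake's real pricing is credit-based and more involved:

```python
# Hypothetical sketch of a consumption-based billing model: loading and
# storing data is free, and charges accrue only when the data is queried.

def monthly_bill(queries_run, price_per_query):
    """The bill is driven entirely by usage, not by how much data is stored."""
    return queries_run * price_per_query

# A customer that stores more data pays nothing extra until it queries it;
# heavier analysis, not heavier storage, is what grows the bill.
print(monthly_bill(queries_run=1_000, price_per_query=0.02))   # 20.0
print(monthly_bill(queries_run=50_000, price_per_query=0.02))  # 1000.0
```

This is why growing data volumes translate into growing Snowflake revenue: more data tends to mean more queries, and only the queries are billed.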

The company sees an addressable market of $90 billion ahead of it today, which is why it is investing heavily back into the business. Snowflake spent over $306 million on sales and marketing and on research and development in Q3, leaving it deeply unprofitable: the company lost $155 million, roughly 46% of Q3 revenue. While that net loss looks bad today, if Snowflake's spending pays off in market share and new products, the company should be able to scale back expenses as growth continues.

Here's the bottom line: Snowflake's business model makes it easy for customers to join the platform, and the fast-growing data analytics market will undoubtedly grow rapidly through 2022. With these two tailwinds pushing Snowflake forward, it is well positioned to flourish in 2022 and beyond.


Trevor Jennewine (Upstart): Artificial intelligence will likely be one of the most transformative technologies ever conceived of by the human race. It has the potential to improve efficiency and productivity across virtually every industry, and it will likely create tremendous wealth in the process. In fact, McKinsey & Company forecasts that AI will add $13 trillion to global economic output by 2030.

On that note, Upstart is a great example of a company using artificial intelligence to solve real-world problems. Specifically, its platform aims to make loans more accessible for consumers and less risky for lenders. Whereas traditional credit models consider between eight and 30 variables to determine who qualifies for credit, Upstart captures over 1,600 data points per borrower, and measures those data points against repayment events. That means Upstart's AI models get smarter each time someone makes or misses a payment.

More importantly, Upstart's decisioning technology considers more data, which theoretically allows it to quantify risk more precisely. In fact, internal studies have shown that Upstart's AI models can reduce a bank's default rate by 75% while holding the approval rate constant. Alternatively, its platform can boost the approval rate by 173% while holding the loss rate constant.

Not surprisingly, Upstart has seen strong demand from banking partners. Over the past year, its client base tripled, and the company made significant headway in the auto lending industry. As a result, transaction volume surged 244% to $3.1 billion in the third quarter, and revenue soared 250% to $228 million. Even more impressive, despite being a young fintech company, Upstart is profitable according to generally accepted accounting principles (GAAP).

Going forward, I think Upstart can maintain that momentum. Over the last 12 months, its technology powered $8.9 billion in loans. But management puts its total addressable market at $753 billion, and that figure could get even bigger if Upstart expands into new industries -- for instance, mortgage loan originations total $4.5 trillion each year.

More importantly, Upstart's AI models appear to give the company a significant advantage. Assuming that holds up in the years ahead, I think shareholders could see 10x returns over the next decade.

This article represents the opinion of the writer, who may disagree with the official recommendation position of a Motley Fool premium advisory service. We're motley! Questioning an investing thesis -- even one of our own -- helps us all think critically about investing and make decisions that help us become smarter, happier, and richer.

Originally posted here:
3 Top Trends to Invest in for 2022 (and Beyond) - Motley Fool

Read More..

How Kubernetes lowers costs and automates IT department work – The Register

Advertorial One of the key factors to consider when evaluating an IT solution is how fast updates are brought to market. Releasing an application is not enough: you need to work on it every day, adding new features and services while keeping it running. Yet you can't just turn the app off, update it, and turn it on again. Your online store should be up and running while the guys wearing shabby knit sweaters are deploying your latest updates.

To make sure that the update process goes unnoticed by users, you need microservices, containers and orchestration infrastructure, and that is exactly what Kubernetes provides. With this solution, manual management of different versions, subversions, and parts of an application becomes unnecessary. This makes the system more powerful, reliable, stable, and expandable. Overall, it becomes easier to use.

How Kubernetes helps businesses:

Businesses profit from using Kubernetes as it helps them automate their work. Using Kubernetes significantly reduces the amount of money spent on hardware and human resources. It allows the project team to focus on their main task of website development rather than website administration. Here's how it works:

In G-Core Labs' offering, Kubernetes runs on a virtual machine combined with a DevOps approach, helping businesses automate routine tasks. The application then launches and behaves at every stage just as it would on the developer's local host.

"The economic, organizational and social consequences of the pandemic will continue stimulating digital innovations and cloud services," believes Henrique Cecci, Senior Research Director at Gartner. The consulting company expects end users to spend more than $480 billion on public cloud services next year.

Public clouds simplify working with Kubernetes significantly, as they use modern infrastructure solutions such as APIs. This synergy makes it possible to distribute the workload within the cloud efficiently, increasing the return on your IT investments.

Here is how it works: imagine that you run a service with users in ten countries, working with two clouds, the main one located in America and the backup in Europe. In the past this was enough, but new legal requirements have been introduced in one of the markets, so you now need to store user data on the territory of the respective country.

In this case, you will most likely turn to a cloud provider such as G-Core Labs. As a result, you will get access to a virtual machine featuring a powerful CDN and other resources, which will allow you to deploy and manage containers efficiently using Kubernetes. Thanks to a content delivery network with over 140 points of presence in 100 cities around the world, the servers holding your users' personal data will be located on the territory required by law.

Usually, businesses connect with a provider because they have to, yet the move brings a positive effect in the end. After migration, the service's clients in the desired region download files 22.5 times faster, while storage and download fees together come to about the same sum that you previously paid for storage alone.

Kubernetes allows businesses to reduce their infrastructure costs and get the most out of their IT investments. Migrating to a public cloud also brings further bonuses. For example, at G-Core Labs, outbound traffic and the configuration of cluster nodes through the cloud control panel or via API are free of charge. You pay only for virtual machines, disks and load balancers.

Therefore, pay due attention to the four main areas that allow you to save money: the cloud, the cluster, the main tools, and the company's culture. At the same time, you will also be able to reduce the amount spent on Kubernetes itself while taking full advantage of the technology.

Automatic scaling provided by Kubernetes results in high availability and maximum application performance, both of which are important for businesses. Previously, when you needed a new container for some service, you contacted the provider and connected the new server to the cluster yourself. Kubernetes automates this process: it uses an API request to order a virtual machine from the cloud provider, connects it to the cluster, and adds the required pod (container) with the required parameters.
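The scale-up flow just described (an API request orders a virtual machine, which joins the cluster and receives pods) can be sketched as a toy reconcile loop. `CloudProvider` here is a hypothetical stand-in, not G-Core Labs' or any real provider's API:

```python
# Toy sketch of a cluster-autoscaler reconcile loop: keep ordering VMs
# from the (hypothetical) provider until all pending pods can be scheduled.

class CloudProvider:
    def __init__(self):
        self.nodes = ["node-1"]

    def order_vm(self):
        node = f"node-{len(self.nodes) + 1}"
        self.nodes.append(node)  # in reality: an API request for a new VM
        return node

def reconcile(pending_pods, pods_per_node, provider):
    """Add nodes until cluster capacity covers the pending pods."""
    while pending_pods > len(provider.nodes) * pods_per_node:
        provider.order_vm()
    return provider.nodes

provider = CloudProvider()
nodes = reconcile(pending_pods=25, pods_per_node=10, provider=provider)
print(len(nodes))  # 3 -- two nodes were ordered to fit 25 pods at 10 per node
```

In a real cluster this loop is run by the cluster autoscaler against the provider's API; the capacity check is a simplification of Kubernetes' actual scheduling constraints.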

This platform turns out to be very helpful in many other cases as well. Let's imagine that you've launched an application in Kubernetes and that its containers are already receiving some traffic. When the CPU load increases, the platform will notice this and will automatically increase the number of machines used in order to distribute the requests properly.

Using special metrics and tests allows the system to quickly identify damaged or unresponsive containers. Failed containers are created anew and get restarted on the same pod. This allows programmers to focus on development instead of doing routine administrative tasks.
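A minimal sketch of that probe-and-restart behaviour follows; the container names and statuses are invented, and in real Kubernetes this is handled declaratively by liveness probes and `restartPolicy` rather than by application code:

```python
# Toy sketch of a liveness-check loop: probe each container and recreate
# any that fail, mirroring how Kubernetes restarts unhealthy containers.

def check_and_restart(containers, is_healthy):
    """Replace failed containers in place and report which were restarted."""
    restarted = []
    for name, status in list(containers.items()):
        if not is_healthy(status):
            containers[name] = "running"  # created anew on the same pod
            restarted.append(name)
    return restarted

containers = {"web": "running", "worker": "crashed", "cache": "unresponsive"}
restarted = check_and_restart(containers, lambda status: status == "running")
print(sorted(restarted))  # ['cache', 'worker']
```

The point of the article's claim is exactly this loop running without human involvement: failures are detected by metrics and tests, and recovery needs no administrator.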

Kubernetes allows developers to create production-like environments for automated testing. General application logs and Kubernetes app logs will help you detect problems and errors even faster.

Imagine that you've decided to completely redesign your esports video-streaming app. The new layouts have already been tested internally by the team and sent to focus groups for trial. Everything seems fine: in your own cloud, everything works well. But what will it look like in production? To answer this question, you can resort to so-called canary testing, which implies a partial release of a certain service. While the overall check is still in progress, small amounts of live traffic are sent to the released application parts. The results are tracked and compared with the ideal, allowing you to make decisions about the app launch.
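Canary routing of this kind can be sketched as weighted random routing; the 5% share and the request counts below are illustrative choices, not figures from the article:

```python
import random

# Sketch of weighted canary routing: a small share of live traffic goes to
# the new release ("canary") while the rest stays on the stable version.

def route(request_id, canary_weight=0.05, rng=random.Random(42)):
    """Send roughly canary_weight of requests to the canary deployment."""
    return "canary" if rng.random() < canary_weight else "stable"

counts = {"stable": 0, "canary": 0}
for request_id in range(10_000):
    counts[route(request_id)] += 1

print(counts["canary"] / 10_000)  # roughly 0.05
```

Error rates and latency for the "canary" bucket are then compared against the "stable" bucket before the rollout proceeds; in practice this weighting is done by the ingress or service mesh, not in application code.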

Such "traffic injections" remain unnoticed by users because the containers are duplicated and users are redirected from one container to another. For orchestration purposes, you can use Kubernetes provided by G-Core Labs. The provider's virtual machines run on high-performance servers with 3rd-generation Intel Xeon Scalable (Ice Lake) processors. In April 2021, G-Core Labs became one of the world's first companies to integrate such processors into its infrastructure.

Businesses should consider migrating to the Kubernetes platform in a number of cases.

Migrating to Kubernetes is necessary for companies that need to maintain their information systems online 24/7. This is exactly why using Kubernetes together with the cloud technologies offered by G-Core Labs is the ideal solution.

Sponsored by G-Core Labs

The rest is here:
How Kubernetes lowers costs and automates IT department work - The Register

Read More..

Top 10 cloud storage, DR and datacentre storage stories of 2021 – ComputerWeekly.com

Here are Computer Weekly's top 10 stories on cloud storage, disaster recovery and the pandemic in 2021.

These are apparently disparate areas, but they are very much connected. The rise of the cloud has seen it emerge as a key site for disaster recovery, even more so during the pandemic, which has pushed cloud adoption in a general sense. But there is also a sense that the cloud solves everything, and that's not the case, as we see here; there are also effects from the pandemic and Brexit on compliance.

Articles here also highlight key ways that disaster recovery (DR) has changed during the pandemic, as well as the devil in the detail of recovering from disaster using cloud DR. We also look at another key cloud workload, namely archiving, but there are drawbacks too, and we look at one organisation that brought backup and archiving back in-house.

There were also articles that drill down into the key considerations for SMEs when it comes to cloud disaster recovery, and into who's best at what in DR among the hyperscalers.

And lastly, we look at customer deployments of storage technologies in-house that are still very strongly represented, despite moves towards the cloud, with flash storage at the Scottish agricultural ministry and an oil and gas firm that ditched storage and servers for hyper-converged infrastructure.

Covid-19 has changed IT. Previously, working remotely was a business continuity measure, but now it is the norm. That means disaster recovery has to adapt to new risks and new ways to respond.

We look at cloud disaster recovery and the potential complexities that can result from partial outages and restores as well as challenges around reconfiguring network and security.

The advantages of cloud archiving are ease of use and being well-suited to potential latency issues, but IT teams need to be aware of costs and issues around moving data from and between clouds.

France Télévisions Publicité couldn't always get to critical data, so it decided to repatriate backup and archiving from the cloud to on-site locations, with help from a managed service.

We look at the key areas of cloud storage compliance that can trip you up, with shared responsibility with cloud providers and data residency among the most important.

We look at key disaster recovery considerations for SMEs, including why backup is not enough, how to create a disaster recovery plan, best-practice DR testing and DR as a service.

We look at cloud disaster recovery from AWS, Microsoft Azure and Google to see which is best for provision of turnkey solutions, breadth of portfolio and modular building blocks.

IG Design Group gains a modern backup regime with data held on disk, in the cloud and long-term on tape in a move that has helped it to slash backup and restore times.

Summit E&P made a strategic move to Nutanix hyper-converged and away from NetApp and VMware and wanted backup that could handle virtual machines and physical servers.

Rural directorate ditched hybrid flash EMC SAN for Pure all-flash storage and cut developer time in half, while beta testing Cloud Block Store and planning container project.

More:
Top 10 cloud storage, DR and datacentre storage stories of 2021 - ComputerWeekly.com

Read More..