
Pets versus cattle: cloud computing in manufacturing: The benefits of cloud-native applications in manufacturing – Process & Control Today

31/01/2022 EU Automation

The cloud-native approach is often conceptualised using the "Pets vs. Cattle" DevOps concept, which likens traditional server systems, each unique and individually cared for, to pets, and cloud systems to cattle: identical, interchangeable, scalable units where the failure of any one system does not affect the whole network. In this piece, Neil Bellinger, head of EMEA at global automation parts supplier EU Automation, delves into cloud-native computing and why it is the future of manufacturing.

Given the value of the cloud, one wonders why the manufacturing industry is resisting the move to cloud-native systems, especially with companies already using various cloud-based applications.

Until recently, manufacturing industries have kept up to date with the latest innovations, from production tracking systems to artificial intelligence. Yet while the financial and retail sectors have followed business giants like Netflix and Spotify into cloud-native computing, many manufacturers still resist the switch. While 83 per cent have an established strategy for cloud implementation, many still struggle to visualise the business value and cost savings of the cloud-native approach. However, forward-looking businesses are taking the first steps in this direction. For example, one company currently implementing cloud-native applications is Volkswagen, which is training more than 200 specialists for its cloud-innovation centre.

Cloud-native is the term used to describe software and services that run in the cloud rather than on a local computer system, allowing businesses to be faster and more agile. Cloud-based applications, by contrast, are not created in the cloud but are transferred from an in-house web application to a cloud server. A cloud-based system offers some of the availability and scalability of cloud-native applications without redesigning current applications. Cloud-enabled is similar to cloud-based, in that a traditional application is migrated to the cloud, but it lacks the easy availability and scalability of cloud-based or cloud-native systems.

Some examples of cloud computing in the manufacturing industry are cloud-based marketing, product development, production and stock tracking, and productivity management.

The responsiveness, innovative features and zero-downtime deployment of the cloud-native model give businesses the foundation they need to succeed in modern times. Successful examples of this implementation are Netflix and Uber. Netflix currently operates using over 600 services, deploying updates around 100 times per day. The structural design of the cloud-native model enables the rapid response, scalability and selective deployment observed in companies such as Uber, Netflix and Spotify.

The cloud-native revolution

Cloud-native is the clear future of all businesses, including manufacturing. As the last two years have shown, the ability to adapt when mass disruption such as a pandemic occurs is extremely important for business survival. With systems accessible from across the globe via the internet, the cloud-native approach is one of the best changes a company can make.

The restriction of large on-premise systems has held back the manufacturing industry for years, but cloud-native applications enable companies to use small, reusable and independently deployable microservices. The approach also allows greater automation of infrastructure, application delivery, recovery and scaling, providing a more resilient and higher-performing system. The cloud-native model can also lessen the strain of managing back-end software and infrastructure.
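The automated scaling described above can be sketched with the proportional rule that common orchestrators apply to stateless microservices. This is a minimal illustration with hypothetical thresholds and replica limits, not any specific platform's API:

```python
import math

def desired_replicas(current, cpu_utilisations, target=0.6, min_r=1, max_r=10):
    """Scale a stateless microservice toward a target average CPU utilisation.

    Proportional rule: desired = ceil(current * observed / target),
    clamped to [min_r, max_r] so the service never scales to zero or
    beyond its resource budget.
    """
    observed = sum(cpu_utilisations) / len(cpu_utilisations)
    return max(min_r, min(max_r, math.ceil(current * observed / target)))
```

With three replicas averaging 85% CPU against a 60% target, the rule asks for five replicas; near-idle replicas are conversely scaled back toward the minimum.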

Cloud-native apps also enable the sharing and analysing of data across the organisation by acting as the central repository for data flowing from sensors, machines, programmable logic controllers (PLCs) and more. At EU Automation, we value keeping up with the times and understanding the cost of downtime. Cloud-native is the next step in lessening the impact of failures in company systems.

If you wish to stay informed about developments in the cloud-native approach and how to unlock success in Industry 4.0, visit EU Automation's Knowledge Hub.



IoT With Cloud and Fog Computing Can Help Industry Recovery, Advancement – Journal of Petroleum Technology

Against the backdrop of multiple challenges faced by the industry in recent years, the industrial internet of things (IoT), combined with hybrid cloud and fog computing, can help analyze the huge amounts of sensitive data from sensors and actuators used to monitor oil rigs and wells closely, thereby better managing global oil production. Fog computing can improve quality of service because it alleviates challenges that a standard, isolated cloud cannot handle.

Recently, the petroleum industry has faced critical challenges including oil price volatility, dramatically increased environmental regulations, the COVID-19 outbreak, and the cybersecurity challenges that come with digital solutions. Oil and gas deals with a huge amount of data requiring an immediate response, with the flow of data generated by the IoT in all domains: upstream, midstream, and downstream. The volume and variety of IoT data create a barrier to holistic evaluation unless machine learning and artificial intelligence are applied. Currently, the cloud addresses that problem by processing data and applying artificial intelligence and machine learning, along with other applications. However, the cloud cannot work efficiently with such enormous data sets and produce actions in a limited time, because the physical distance between IoT devices and a centralized cloud increases latency. In addition, cyberattacks on IoT devices, and the resulting damage to the environment and equipment, can halt operations; these risks can be mitigated by using the fog and the cloud together. These challenges demonstrate that the current architecture cannot handle the large number of endpoint devices and data in a way that effectively protects the system.

The fog model has several advantages over the cloud model, including proximity to endpoints. The fog connects cyberphysical social systems to cloud computing centers and can lower the bandwidth demands and strain on industrial clouds. Cloud computing is a critical paradigm for managing all types of calculations, including those previously considered insignificant. However, when a task must be completed in real time with very low latency, the cloud can become ineffective. As a result, fog computing was created to supplement the cloud, and traditional cloud and fog computing are employed together to increase the performance of industrial IoT-based applications. When the fog is unable to complete an operation independently because of capacity constraints, heavy computations must be offloaded to the cloud.
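The fog-or-cloud placement decision described here can be sketched as a simple latency comparison. The round-trip times and capacities below are illustrative assumptions, not measured figures:

```python
def place_task(cycles, fog_capacity_hz, cloud_capacity_hz, deadline_s,
               fog_rtt_s=0.005, cloud_rtt_s=0.100):
    """Prefer the nearby fog node; offload to the cloud only when the
    fog's limited capacity cannot meet the task's deadline.

    Estimated completion time = network round trip + compute time
    (cycles / capacity in cycles per second).
    """
    fog_time = fog_rtt_s + cycles / fog_capacity_hz
    if fog_time <= deadline_s:
        return "fog", fog_time
    return "cloud", cloud_rtt_s + cycles / cloud_capacity_hz
```

A small monitoring task stays on the fog; a heavy computation whose fog runtime would blow the deadline is shipped to the better-provisioned cloud despite the longer round trip.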

Fog computing is thought to be more cost-effective than cloud computing in time-critical applications such as health care because of its decreased latency, and, in some situations, the spare capacity of locally accessible resources.

Fog computing is a novel computation paradigm that can be represented as the link between the cloud and the network's edge, where it provides computing, communication, control, and storage. This decentralized platform differs from traditional computational architectures because the fog computing environment connects resources, including people, to improve quality of life by running cyberphysical social applications on network-edge processing resources.

Fog Nodes or Microdata Centers (MDCs). With fog computing, applications can gather and analyze data at local microdata centers. Local data processing and analysis are carried out by the MDC to limit the amount of data transferred to a centralized cloud, reduce network latency, and enhance overall performance, especially for time-sensitive services such as connected cars and health monitoring. To reduce network congestion, bandwidth consumption, and delay for user requests, MDCs typically are placed between data sources and the cloud data center. The MDC handles most user requests instead of forwarding them to centralized, remote cloud data centers.
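The bandwidth saving an MDC provides can be illustrated with a toy filter that analyses the full sensor stream locally and forwards only out-of-range readings upstream; the nominal range here is a hypothetical example:

```python
def forward_to_cloud(readings, low, high):
    """An MDC processes every reading locally and forwards only anomalies,
    so the centralized cloud sees a small fraction of the raw stream."""
    return [r for r in readings if not (low <= r <= high)]
```

For a wellhead stream where 60 to 90 is nominal, only the two anomalous samples in [70, 72, 95, 71, 40] would cross the wide-area network.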

Smart Gateway. Because it permits communication between the network layer and the ubiquitous sensor network layer, a smart gateway is an important component of industrial IoT applications. IoT gateways are communication points that link end devices to data centers, connect the many devices in use, and perform a variety of functions to complete the computing purpose. The gateway is used to receive sensor data, incorporate it, and then send it to the cloud for processing.

Virtual Servers. Fog computing has developed as a response to the massive amounts of data being transmitted to cloud servers and the severe latency and bandwidth limits that come with it. As an intermediary computing layer between cloud servers and IoT devices, it distributes numerous heterogeneous fog servers. Fog servers contain fewer resources than cloud servers, but, because they can be accessible over a local area network, they offer better bandwidth and reduced latency for industrial IoT devices.

Management Subsystem. Fog computing optimizes task execution and management by balancing attention between resources and tasks. Load balancing is an important resource-management method that can be used in conjunction with task management to produce a reliable system.

Storage Subsystem. The fundamental objective of the industrial IoT is to acquire correct data in real time and then respond quickly and appropriately to provide desired results. Fog and edge computing have been used to help solve these problems and enhance service quality and user experience by effectively distributing data storage and processing across multiple locations physically close to the data source.

Fog computing architectures are built on fog clusters that combine the processing of several fog devices. Data centers, on the other hand, are the cloud's primary physical components, with high operational costs and energy usage. The fog-computing paradigm consumes less energy and has lower operating expenses. Because the fog is closer to the user, the distance between users and fog devices may be one or a few network hops.

The cloud's communication latency is always higher than the fog's because of distance. The fog relies upon a more-distributed strategy based on geographical orchestration, whereas the cloud represents a more-centralized approach. Because of its high latency, the cloud does not allow for real-time interaction; fog computing can alleviate this problem easily. On the other hand, the fog has a higher failure rate because of its dependence upon wireless connectivity, decentralized management, and vulnerability to power outages. When the software is not managed properly, these devices can fail.

Drawbacks of the cloud-based model therefore include higher communication latency, the inability to support real-time interaction, and the bandwidth strain of transmitting massive volumes of endpoint data to centralized data centers.

The fog's dispersed design safeguards linked systems from the cloud to the device by placing computing, storage, networking, and communications closer to the services and data sources, offering an extra layer of security. Fog nodes protect cloud-based industrial IoT and fog-based services by executing a variety of security tasks on any number of networked devices, even the tiniest and most resource-constrained ones. For managing and upgrading security credentials, malware detection, and timely software patch distribution at scale, the fog provides a trusted distributed platform and execution environment for applications and services, as shown in Fig. 1.

By detecting, verifying, and reporting attacks, the fog provides reliable communication and enhanced security. If a security breach is discovered, the fog can detect and isolate risks quickly by monitoring the security status of surrounding devices. Blockchain deployments to low-cost IoT endpoints are possible using the fog. If multiple power generators are attacked with malware, the fog's node-based root-of-trust capabilities allow operations managers to remotely isolate and shut down affected generators, ensuring that service interruptions are minimized. If hackers attempt to take control of a smart factory by exploiting a vulnerability in assembly-line equipment, the affected domains are protected by fog nodes, which monitor traffic from the internet into the distributed fog network and use machine learning in the local environment to detect a potential attack once it has been recognized.

Based on a literature review of the use of the fog paradigm in health care, smart cities, industrial automation, and smart connected vehicles, great potential for fog computing exists in the petroleum industry. It is an open-architecture methodology that enables the advancement of industrial IoT, 5G, and artificial intelligence. Fog nodes protect cloud-based IoT and fog-based services by executing a variety of security tasks on any number of networked devices. The benefits of using fog computing in the petroleum industry are latency reduction, improved response time, enhanced compliance, increased security, greater data privacy, reduced bandwidth cost, an overall increase in speed and efficiency, less reliance on wide-area-network services, greater uptime of critical systems, and enhanced services for remote locations.

Fig. 1: Securing industrial IoT through fog computing.

This article, written by JPT Technology Editor Chris Carpenter, contains highlights of paper SPE 206067, "The Role of Hybrid IoT With Cloud Computing and Fog Computing in Helping the Oil and Gas Industry Recover From COVID-19 and Face Future Challenges," by Ethar H.K. Alkamil, SPE, University of Basrah; Ammar A. Mutlag, Universiti Teknikal Malaysia Melaka; and Haider W. Alsaffar, SPE, Halliburton, et al. The paper has not been peer reviewed.

Technical Paper Synopses in this Series

Introduction: Preparing Facility Engineers for 2022

Downhole Oil/Water Separation System Effective in Horizontal Wells

Mentoring, Sponsoring, and Networking Create Career Success

IoT With Cloud and Fog Computing Can Help Industry Recovery, Advancement

Fouling-Prediction Model Uses Machine Learning


Sync Computing Launches with Distributed Cloud Infrastructure Tech – ITPro Today

Startup Sync Computing emerged from stealth mode today with $6.1 million in seed funding in a bid to help enable distributed cloud infrastructure for artificial intelligence, machine learning and big data workloads. The company also announced the launch of Sync Autotuner and Sync Orchestrator.

Based in Boston, Sync Computing is a spinout from MIT Lincoln Laboratory, where Suraj Bramhavar and Jeff Chou co-founded the company. The core premise that underlies Sync is that modern cloud infrastructure isn't properly optimized to run distributed workloads across different cloud services.

As part of Sync's launch, the company also launched Sync Autotuner, a service that automatically tunes and optimizes cloud infrastructure for a given AI, ML or big data workload running on the Apache Spark query engine.

In addition, the startup announced Sync Orchestrator, a compute resource orchestration tool for enabling different workloads to run across a distributed cloud infrastructure.

"The larger vision of what we're building is a globally optimized resource allocation and scheduler that is custom-designed for the cloud," Chou, Syncs CEO, told ITPro Today.

Typically, users with a big data job must manually determine what resources are needed on a given cloud platform, which can be a complex process, according to Chou.

Amazon Web Services (AWS) has all manner of virtual instance types, with different configurations offered at various on-demand, reserve and spot pricing. Different resource types can impact performance as well as cost.

Another challenge is the orchestration of jobs. With certain AI, ML and big data workloads there are dependencies, such that one particular operation needs to run after another. Manually scheduling the timing and sequence of events is also a complex task, Chou said.
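The dependency-ordering problem Chou describes is, at its core, a topological sort of a job graph. A minimal sketch (not Sync's implementation) using a job-to-prerequisites mapping:

```python
from collections import deque

def schedule(jobs):
    """Order jobs (mapping job -> set of prerequisite jobs) so that each
    runs only after its prerequisites, via Kahn's algorithm.
    Raises ValueError if the dependencies contain a cycle."""
    indeg = {j: len(deps) for j, deps in jobs.items()}
    dependents = {j: [k for k, deps in jobs.items() if j in deps] for j in jobs}
    ready = deque(sorted(j for j, n in indeg.items() if n == 0))
    order = []
    while ready:
        j = ready.popleft()
        order.append(j)
        for k in dependents[j]:
            indeg[k] -= 1
            if indeg[k] == 0:
                ready.append(k)
    if len(order) != len(jobs):
        raise ValueError("dependency cycle detected")
    return order
```

Python's standard library offers the same service via graphlib.TopologicalSorter; the sketch above just makes the mechanics explicit.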

Sync Computing is aiming to enable its users to define their own business goals for a particular workload, whether the goal is better performance or cost optimization, Chou said. He noted that Sync provides an abstraction on top of compute resources, figuring out the details to help users reach business goals.

Chou said the optimization approach that Sync Computing uses is not based on machine learning. Rather, Sync Autotuner uses what he referred to as a mathematical model.

"There are companies that help tune various applications, but they require running the application for a week to train a model," Chou said. "For us, there is no training. We basically mathematically model Amazon's hardware and cost model, as well as Apache Spark."

As such, Chou claimed that a user could just direct the workload to Sync Computing, which will automatically calculate the right place and time to run a job.

While Sync Computing is starting off by helping optimize Apache Spark workloads across distributed cloud infrastructure, Chou said the plan is to enable various types of distributed computing workloads as the platform and the business continue to grow.

"When we first started, we were fundamentally just research scientists, so we had to find the right product andmarket," he said. "We do want to be a horizontal platform company, so that includes anything that benefits from distributed computing, like TensorFlow, PyTorch and other frameworks."


Nearly One-Third of Developers are Turning to Alternative Cloud Providers, New Research from SlashData Finds – PR Web

We're seeing tremendous growth of multicloud approaches as organizations look for the best provider fit to solve their specific challenges. This trend will only continue to gain momentum as multicloud becomes the prevailing cloud strategy for businesses.

PHILADELPHIA (PRWEB) January 31, 2022

Developers increasingly rely on multiple cloud providers for their infrastructure needs. That's the finding from research conducted by SlashData. The 21st edition of the global Developer Nation survey, which asked nearly 3,500 backend services developers which providers they turn to for their cloud infrastructure, found that more than one in four respondents (27%) uses an alternative cloud provider like Linode, DigitalOcean, or OVHcloud. The survey results also found that usage of alternative cloud providers has nearly doubled over the past four years, a period in which usage of the three largest hyperscalers, AWS, Microsoft Azure, and Google Cloud Platform (GCP), only grew by 18%.

As the cloud market continues to boom (Gartner predicts global cloud revenue to total $474 billion in 2022), more organizations are seeking out cloud providers that cater to their specific needs, beyond the complex and costly offerings of the big three hyperscalers. In fact, SlashData's latest research found that while half (51%) of developers stated that AWS, Microsoft Azure, or GCP served as their primary service provider, the majority (78%) of respondents use more than one provider for cloud services.

"Developers and businesses are craving simpler, more affordable, and more reliable cloud services," said Blair Lyon, vice president of cloud experience at Linode. "We're seeing tremendous growth of multicloud approaches as organizations look for the best provider fit to solve their specific challenges. This trend will only continue to gain momentum as multicloud becomes the prevailing cloud strategy for businesses."

A recent report from 451 Research found that large hyperscale providers are not always ideal for all cloud use cases and that alternative cloud providers are increasingly bridging the gap for use cases that previously went overlooked or underserviced, including for small businesses and independent developers. In addition, a 2021 survey of devops professionals by Techstrong Group (formerly Accelerated Strategies Group) found alternative cloud providers were used by one-third of those survey respondents.

"Since we began tracking this category over the past few years, developers have increasingly turned to alternative cloud providers for their infrastructure needs, said Konstantinos Korakitis, Director of Research at SlashData.

The 22nd edition of the Developer Nation survey is currently live, so if you are a developer anywhere in the world, you can share your input here.

About Linode: Linode accelerates innovation by making cloud computing simple, accessible, and affordable to all. Founded in 2003, Linode helped pioneer the cloud computing industry and is today the largest independent open cloud provider in the world. Linode empowers more than a million developers, startups, and businesses across its global network of 11 data centers.



Partnership will redesign data centre computing to improve security of the cloud – Imperial College London

A new partnership between Imperial and the Technology Innovation Institute (TII) in Abu Dhabi aims to boost the security of cloud computing services.

Cloud computing, in which data and software are accessed via the internet rather than stored locally on personal laptops and smartphones, was until quite recently a novel concept, but is now a pervasive feature of our digital lives. It powers widely used mobile messaging and email apps, and the productivity suites that office workers use to collaborate online. It will play an even greater role when the Internet of Things (IoT) takes off and devices such as kitchen appliances and industrial machinery become internet-connected as standard, reporting sensor data to cloud servers.

But Professor Peter Pietzuch of Imperial's Department of Computing, the partnership's academic lead, says that the devices and operating systems typically used to host cloud services in data centres were not designed with cyber security in mind. "Security has not changed much from traditional operating systems that pre-date the internet and the era of cloud services," he says.

Limitations in the security of these systems have contributed to the regular occurrence of data theft, ransomware attacks, in which cybercriminals hijack an organisation's data and demand a ransom to restore it, and attacks from other kinds of malware. These threats could pose ever greater risks to society as vital infrastructure such as the NHS becomes increasingly reliant on digital services and IoT.

The research partnership between Imperial and TII will work toward mitigating these risks by redesigning the hardware and software that the cloud runs on, from the ground up. "The project aims to rethink what the hardware and software stack in cloud environments should look like when you design them with security in mind," says Professor Pietzuch.

The project will address the key challenge of ensuring data centre servers are isolated from one another so that a malicious cloud tenant cannot access data belonging to another, while also enabling servers to communicate efficiently for legitimate purposes, particularly for data-intensive computation tasks that are sometimes parallelised over thousands of machines.

The project will work toward developing software and hardware architectures, building on existing AArch64 and RISC-V processor architectures, to allow data to be well compartmentalised while enabling efficient sharing.
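The compartmentalisation goal can be illustrated with a toy capability check, in which every memory access is validated against a compartment's capability list. This is a conceptual sketch loosely inspired by capability hardware, not the project's actual AArch64/RISC-V design:

```python
class Compartment:
    """A compartment holds capabilities (base, length, permissions).
    An access outside every capability, or lacking the required
    permission, is rejected, so one cloud tenant can never reach
    another tenant's memory."""

    def __init__(self, capabilities):
        self.capabilities = capabilities  # iterable of (base, length, perms)

    def can_access(self, addr, perm):
        """Return True only if some capability covers addr with perm."""
        return any(base <= addr < base + length and perm in perms
                   for base, length, perms in self.capabilities)
```

A tenant granted read/write over a single 256-byte region can touch nothing else, even an address one byte past its bound.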

Dr Shreekant (Ticky) Thakkar, Chief Researcher at TII's Secure Systems Research Centre (SSRC), said: "The research project with Imperial aims to find solutions based on AArch64 and fits nicely with other research and use cases, as SSRC is doing a lot of work on ARM-based and RISC-V platforms and on operating systems in phones and drones. Easily applicable to today's mobile devices, the project's solutions will simplify the unification of cloud and edge security approaches."

"We are talking about new low-level security mechanisms in modern computer architectures," says Professor Pietzuch. "Malicious attackers can currently exploit unauthorised access to data in a lot of ways, for example, to leak data or install ransomware. Instead of thinking about very specific high-level attacks and coming up with mitigations against them, our approach will deal with a number of security challenges by helping create a hardware and software stack that is secure at every layer."

Professor Pietzuch emphasises that this requires fundamental research. "This is a scientific problem. Companies have been plugging one hole after the other, but there is no end in sight. What we're trying to do is step back and consider how to fundamentally rethink software stacks so we don't face the same repeated problems. We need a fundamental shift so we can move to something less vulnerable, or in the future things could get worse and worse.

"A lot of people at TII are applied industrial researchers. They bring in an industry view where they ask the right types of questions. They talk about usability aspects, and when we devised the project they provided very useful input, steering things so we are aware of where the hard problems lie. It's great to be partnering with them on this important project."

Dr Rebeca Santamaria-Fernandez, Director of Industry Partnerships and Commercialisation for Imperial's Faculty of Engineering, said: "Cyber security is an area in which we can only make real headway by bringing together basic research of the kind carried out at Imperial and an understanding from industry experts of the problems and challenges faced in the real world. I'm delighted about this partnership, which will combine the world-leading expertise of Professor Pietzuch and colleagues and the fantastic resources and expertise of the Technology Innovation Institute."

Imperial facilitates industry partnerships, technology commercialisation, and other activities that help translate research into real-world benefits for industry and society through its Enterprise Division.


Filings buzz in the mining industry: 111% increase in cloud computing mentions in Q3 of 2021 – Mining Technology

Mentions of cloud computing within the filings of companies in the mining industry rose 111% between the second and third quarters of 2021.

In total, the frequency of sentences related to cloud computing between October 2020 and September 2021 was 162% higher than in 2016 when GlobalData, from whom our data for this article is taken, first began to track the key issues referred to in company filings.

When companies in the mining industry publish annual and quarterly reports, ESG reports and other filings, GlobalData analyses the text and identifies individual sentences that relate to disruptive forces facing companies in the coming years. Cloud computing is one of these topics companies that excel and invest in these areas are thought to be better prepared for the future business landscape and better equipped to survive unforeseen challenges.

To assess whether cloud computing is featuring more in the summaries and strategies of companies in the mining industry, two measures were calculated. Firstly, we looked at the percentage of companies that mentioned cloud computing at least once in filings during the past 12 months: this was 31%, compared to 11% in 2016. Secondly, we calculated the percentage of total analysed sentences that referred to cloud computing.
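Both measures can be computed directly from a mapping of companies to their analysed filing sentences; the following is a small sketch with made-up filings, not GlobalData's pipeline:

```python
def mention_metrics(filings, keyword="cloud computing"):
    """Return (percentage of companies mentioning the keyword at least
    once, percentage of all analysed sentences containing the keyword),
    given {company: [sentences]}."""
    hits = {c: sum(keyword in s.lower() for s in sents)
            for c, sents in filings.items()}
    total_sentences = sum(len(sents) for sents in filings.values())
    pct_companies = 100 * sum(1 for n in hits.values() if n) / len(filings)
    pct_sentences = 100 * sum(hits.values()) / total_sentences
    return round(pct_companies, 1), round(pct_sentences, 1)
```

With two companies, one of which mentions the keyword in one of its two sentences, the measures come out at 50% of companies and a third of all sentences.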

Of the 50 biggest employers in the mining industry, Honeywell International Inc was the company that referred to cloud computing the most between October 2020 and September 2021. GlobalData identified 37 cloud-related sentences in the US-based company's filings, 0.4% of all sentences. China Steel Corp mentioned cloud computing the second most: the issue was referred to in 0.15% of sentences in the company's filings. Other top employers with high cloud mentions included Tata Steel Ltd, Metalurgica Gerdau SA, and Nippon Steel Corp.

This analysis provides an approximate indication of which companies are focusing on cloud computing and how important the issue is considered within the mining industry, but it also has limitations and should be interpreted carefully. For example, a company mentioning cloud computing more regularly is not necessarily proof that they are utilising new techniques or prioritising the issue, nor does it indicate whether the company's ventures into cloud computing have been successes or failures.

GlobalData also categorises cloud computing mentions by a series of subthemes. Of these subthemes, the most commonly referred to topic in the third quarter of 2021 was "software as a service", which made up 68% of all cloud subtheme mentions by companies in the mining industry.



AI storage: a new requirement for the shift in computing and analytics – Information Age

AI storage is needed as the boundaries of traditional computing and analytics shift to a new era

Data or AI storage is necessary to solve the emerging data challenges caused by the move away from traditional computing and analytics.

The universe of supercomputing has expanded rapidly to incorporate AI, advanced data analytics and cloud computing. The era of serial data is ending, with parallel data management replacing network file systems (NFS).

This shift has corresponded to the rise of AI, with investments in the technology hitting a new record in 2021. As an example, Microsoft invested $1 billion in an artificial intelligence project, co-founded by Elon Musk.

This shift in the boundaries of traditional computing and analytics has caused several data challenges that need to be resolved:

Data talent: there is a need to source new data science talent and to maintain current, up-to-date skill sets in a rapidly changing software environment.

Data sources: there is a need to ingest high-volume data from broad sources, through a variety of ingest methods, at rates well beyond traditional computing requirements.

Data processing: there is a need for a different type of data processing, implementing large-scale GPU environments to deliver the parallelism needed for training and inference in real time.

Data governance: there is a need to label, track and manage data (forever) and to share it across organisations with the right security policies, with explainable AI at the application level and available data at the platform level.

During The IT Press Tour in San Francisco, James Coomer, senior vice president of products at DDN, explained that data is the source code of AI: data is imperative for AI, and so storage is imperative for AI.

Storage can't be an afterthought: it is key to data ingestion, sourcing, management, labelling and longevity, all of which are critical for AI.


DDN Storage has historically focused on storage for unstructured data and big data in the enterprise, government and academic sectors.

Now, it is redefining the imperatives driving it as a company, putting AI storage at the heart of its growth strategy.

In practice, over the last two years DDN has acted as the core backend storage system for NVIDIA, increasing performance, scale and flexibility to drive innovation.

NVIDIA commands nearly 100% of the market for training AI algorithms and has multiple AI clusters, according to Karl Freund, analyst at Cambrian AI Research.

Following this success, DDN is powering the UK's most powerful supercomputer, Cambridge-1, which went live in 2021 and is focused on transforming AI-based healthcare research.

The AI storage vendor is also working with Recursion, the drug discovery company.

"Our at-scale data needs require fast ingest, optimised processing and reduced application run times," said Kris Howard, Systems Engineer at Recursion.

Working with DDN, the drug discovery company cut costs by up to 20x and opened up new possibilities for accelerating the drug discovery pipeline with new levels of AI capability.

It previously ran on the cloud, but now operates more efficiently, with greater value for money on-premise.

"DDN pioneered accelerated data-at-scale to tackle what ordinary storage cannot. We make data environments for innovators to create the future. We're the largest AI storage provider in the world, proven across an array of industries and customers, from financial services to life sciences," added Coomer.

1. Transforming cancer care with managed services: DDN as a service for precision oncology

2. Simplifying data management for a global financial services and venture firm

3. Transforming research data storage: from management and maintenance to universal resources

See also: Mining the metadata and more: tips for good AI data storage practices

Read the original post:
AI storage: a new requirement for the shift in computing and analytics - Information Age

Read More..

Infrastructure as a Service Market Projected to Reach a Valuation of USD 119 Billion by 2030, Registering Around 24.1% CAGR – GlobeNewswire

New York, US, Jan. 31, 2022 (GLOBE NEWSWIRE) -- Market Overview: According to a comprehensive research report by Market Research Future (MRFR), "Infrastructure as a Service Market Information by Solution, Deployment Type, End User, Vertical and Region, Forecast to 2030", the market size will reach USD 119 billion, growing at a compound annual growth rate (CAGR) of 24.1% by 2030.
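The headline figures follow the standard compound-growth formula, value_end = value_start × (1 + CAGR)^years. As a rough sanity check (a sketch only: the release does not state its base year, so 2021 is an assumption here), the implied starting market size can be back-calculated:

```python
# Back-calculate the implied base-year market size from the report's
# headline figures: USD 119 billion by 2030 at a 24.1% CAGR.
# Assumption (not stated in the release): the base year is 2021.
target_value_bn = 119.0   # projected 2030 market size, USD billion
cagr = 0.241              # compound annual growth rate
years = 2030 - 2021       # 9 years of compounding (assumed base year)

implied_base_bn = target_value_bn / (1 + cagr) ** years
print(f"Implied 2021 market size: ~USD {implied_base_bn:.1f} billion")
```

Under these assumptions the implied 2021 base comes out to roughly USD 17 billion; if the report's actual base year differs, the exponent changes accordingly.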

Market Scope: The infrastructure as a service (IaaS) market is expected to garner significant traction in the next few years. The continually growing demand for data governance, alongside the rising cases of IaaS misconfigurations and inadvertent sensitive data storage exposure that can cause significant data breaches, have created vast market opportunities over the past couple of years.



Market USP Exclusively Encompassed:

Market Drivers: Businesses are increasingly focusing on responding to rising cybersecurity threats by improving data governance to prevent data loss. At the same time, growing demand for hybrid computing, scalability, faster implementation and accessibility of IT systems defines the continually expanding market landscape.

Moreover, the increasing adoption of IaaS, multi-cloud and cloud-based IaaS platforms would escalate market shares. As a result, the market is projected to witness significant growth in the years to come. Growing uptake of cloud computing in the BFSI, IT & telecom, healthcare, and retail & e-commerce industries, driven by heavy use of IT resources and cost-effective cloud computing services among organizations, drives market demand.

Conversely, the lack of technical expertise required to manage IT infrastructure and cloud data security is a major factor impeding market growth. However, the growing adoption of a cloud-based IaaS platform to enhance IT services and data accessibility would support the market growth through the review period. Also, vast implementations of infrastructure services in large enterprises would impact the growth of the IaaS market over the forecast period.

Browse In-depth Market Research Report (100 Pages) on Infrastructure as a Service Market:https://www.marketresearchfuture.com/reports/infrastructure-as-a-service-market-5910

Segmentation of Market Covered in the Research: The market is segmented into solutions, deployment types, end users, verticals and regions. The solution segment is sub-segmented into managed hosting services, storage as a service (network-attached and storage-area-network-based storage), disaster recovery as a service, high-performance computing as a service, network management, and content delivery services.

The deployment type segment is sub-segmented into public, private, and hybrid clouds. The end-user segment is sub-segmented into SMEs and Large Enterprises. The vertical segment is sub-segmented into IT & telecom, BFSI, healthcare, retail & e-commerce, government, defense, and others. The region segment comprises the Asia Pacific, Americas, Europe, and rest-of-the-world.

Regional Analysis: North America dominates the global infrastructure as a service market. The largest market share is attributable to the wide uptake of IaaS and technological upgrades. Besides, increasing investments in cloud-based solutions drive IaaS market growth in the region. With the increasing penetration of hybrid cloud and advanced IT infrastructure, the region is projected to retain its dominance throughout the forecast period.

Europe stands second in the global infrastructure as a service market. The region, with its vast technology upgrades, offers lucrative opportunities. The market growth is driven by rapidly growing enterprises in the region and vast investments in risk disparity factors due to infrastructure service disruptions in extreme weather events. Moreover, the rising awareness of integrating social dimensions into the resilience planning of infrastructure systems substantiates the region's market size.


The Asia Pacific region is also a promising market for infrastructure as a service globally, accounting for a sizable market share. Factors such as spurring industrialization and economic growth across the region are key driving forces for market growth. Countries like China, Japan, and India support the regional market's growth, heading with significant technological advances.

COVID-19 Impact on the Global Infrastructure as a Service Market: The onset of COVID-19 significantly impacted the infrastructure as a service market, halting infrastructure development and the global economy. While the pandemic created rapid disruptions in key infrastructure sectors and industries, it also increased the value of digital connectivity, presenting it as the most cost-efficient and effective way to respond to the global crisis.

The COVID-19 outbreak also had numerous contrasting effects on the digital infrastructure sector, such as increased demand for quality digital connectivity due to falling telecommunications prices in certain countries, and increased network capacity in others. Emerging trends and their implications for policy, corporate and investment strategies to support the development of digital infrastructure in emerging markets would contribute to IaaS market revenues through the pandemic and beyond.


Competitive Landscape: Highly competitive, the IaaS market appears fragmented, with several large and small players forming the competitive landscape. To gain a larger competitive edge, players pursue strategies such as collaborations, mergers & acquisitions, product and technology launches, and expansion.

For instance, on Jan. 20, 2022, 11:11 Systems, a leading provider of managed infrastructure solutions, announced the acquisition of iland, a leading global cloud service provider of secure and compliant hosting for IaaS, DRaaS and BaaS. The company (11:11 Systems) has also recently acquired Green Cloud Defense, a channel-only cloud IaaS provider.

The addition of iland's steady 25% year-on-year momentum would enable 11:11 Systems to expand its national network of MSPs, VARs and IT consultants, creating a hyper-growth pathway. Companies increasingly struggle to manage their hybrid infrastructure effectively and are under pressure to focus scarce resources on other key priorities. As a single trusted vendor, 11:11 would be able to meet the growing market demand and help customers navigate security threats and reduce the complexity of infrastructure management.

In another instance, on Jan. 18, 2022, Sunlight.io, a leading provider of edge infrastructure, announced a partnership with Safozi, a leading IaaS provider in North Africa. Together with Sunlight.io, Safozi launched high-performance private cloud services in Tunisia, with local support across North Africa. Safozi's affordable, high-performance cloud services would now be able to deliver a far higher level of responsiveness to its users, striving for better customer satisfaction.

Related Reports:Composable Infrastructure Market, By Cloud Type (Public, Private, Hybrid), By Type (Hardware, Software), By Organization Size (Large Enterprises, SMEs), By Vertical (IT and Telecommunication, BFSI, Government, Healthcare, Manufacturing and Others), By Region (North America, Europe, Asia-Pacific and the Rest of the world) - Industry Forecast till 2027

IT Infrastructure Services Market Research Report: By Type (Network Management Service, Enterprise System Management, IT Security Management, Virtualization Solutions, Data Center Consolidation Services and others), Service Type (Consulting, Planning Integration & Implementation, Maintenance and Managed), Organization Size (Small & Medium Enterprise, Large Enterprise), Vertical (Automotive, Chemicals, Retail & Consumer Goods, IT & Telecommunication, Healthcare, Government, BFSI, Manufacturing, and others) and Region (North America, Europe, Asia-Pacific, South America, Middle East & Africa) - Forecast till 2027

About Market Research Future:Market Research Future (MRFR) is a global market research company that takes pride in its services, offering a complete and accurate analysis regarding diverse markets and consumers worldwide. Market Research Future has the distinguished objective of providing the optimal quality research and granular research to clients. Our market research studies by products, services, technologies, applications, end users, and market players for global, regional, and country level market segments, enable our clients to see more, know more, and do more, which help answer your most important questions.

Follow Us: LinkedIn | Twitter

Originally posted here:
Infrastructure as a Service Market Projected to Reach a Valuation of USD 119 Billion by 2030, Registering Around 24.1% CAGR - GlobeNewswire

Read More..

Man with confederate flag told to leave by Ottawa truckers: ‘We called him out’ – Washington Examiner

Video from the trucker protest against vaccine mandates in Ottawa shows demonstrators asking a masked man holding a Confederate flag to leave the area.

The video shows the man with the flag walking away from a group of protesters as one of the demonstrators calls out, "Now he's going. Now he's gone. We called him out. He knows. He's gonna hold his head in shame now."

ELON MUSK SUPPORTS CANADIAN TRUCKERS PROTESTING VACCINE MANDATES: 'CANADIAN TRUCKERS RULE'

Another Confederate flag was photographed at the event in downtown Ottawa.

The vast majority of the thousands of protesters occupying the city streets are holding Canadian flags and other signs and banners with messaging about vaccine mandates and criticizing the leadership of Canadian Prime Minister Justin Trudeau, according to video and photos from the scene.

Many social media users reacted to the video, claiming that the masked man was suspicious or possibly a plant to confuse the message of the protest, which was a demonstration against government vaccine mandates.

CLICK HERE TO READ MORE FROM THE WASHINGTON EXAMINER

"Why would he cover his face & walk away when asked questions if he was one of the protestors?!" one Twitter user commented. "Efforts the govt & media trying to put in to paint all the protesters as white supremacists is appalling!"

"And the individual with the flag appears to be suspect," commented the Post Millennial's Angelo Isidorou, responding to Jordan Peterson, who blasted the CBC's reporting of the Confederate flag holder as "pathetic."

Thousands of truckers from across Canada formed a convoy headed for Ottawa last week in protest of government-enforced vaccine mandates that prevent unvaccinated cross-border truckers from entering or leaving the country. The convoy reached Ottawa over the weekend and filled the downtown area with a sea of protesters who say they will occupy the city until the mandates are lifted.

Read the original:
Man with confederate flag told to leave by Ottawa truckers: 'We called him out' - Washington Examiner

Read More..

Joe Rogan: Have Jordan Peterson and Valentine Thomas been guests on the Joe Rogan podcast this week? How can you listen, how long has it been running…

It was acquired by Spotify in December 2020 in a blockbuster deal.

Hosted by Joe Rogan, it has had a wide range of guests on its episodes, from Elon Musk to U.S. presidential candidates.

Here's all you need to know:

The most recent episode of the Joe Rogan Experience features Valentine Thomas.

It was released on January 26.

Earlier in the week, Jordan Peterson was a guest on episode 1,769.

Both episodes carry "most shared" tags on Spotify.

How can you listen to The Joe Rogan Experience?

The podcast is exclusively available from Spotify, after being acquired by the streaming service in December 2020.

Clips are also posted on YouTube.

How long has it been running for?

The podcast was launched in 2009 and a total of 1,770 episodes have been released so far.

A message from the Editor, Mark Waldron

Read more here:
Joe Rogan: Have Jordan Peterson and Valentine Thomas been guests on the Joe Rogan podcast this week? How can you listen, how long has it been running...

Read More..