
The acceleration to cloud is causing a monitoring migraine – www.computing.co.uk

From a technology perspective, one of the biggest changes we've seen over the past year has been a dramatic acceleration in cloud computing initiatives. The pandemic has proven once and for all that cloud computing really does work, even in the most challenging of circumstances, providing greater speed, agility and resilience.

And with this new level of trust and appreciation of cloud computing, huge numbers of businesses have gone from running only a handful of applications in the cloud to wanting to shift significant parts of their IT estate over to a cloud environment, as quickly as they possibly can.

Indeed, as organisations have rushed through digital transformation programs to deliver new digital services to both customers and employees during the pandemic, most have relied heavily on the cloud to enable them to move at the required speed and scale.

The pandemic will certainly come to be seen as a tipping point in the transition to cloud computing, speeding up what was already an inevitable switch by several years. Indeed, Gartner has forecast that worldwide end-user spending on public cloud services will grow by 18.4 per cent in 2021, and that the proportion of IT spending on cloud computing will make up 14.2 per cent of the total global enterprise IT spending market in 2024, up from 9.1 per cent in 2020.

This marked shift towards cloud computing is undoubtedly delivering benefits, enabling the digital transformation initiatives organisations have relied on throughout the pandemic. In many cases, the level and speed of innovation that has been achieved simply wouldn't have been possible using legacy technologies.

However, there is always a sting! The rapid acceleration of cloud initiatives has had a profound impact on the IT department, adding huge complexity and even greater pressure onto technologists.

In our latest Agents of Transformation report, Agents of Transformation 2021: The Rise of Full-Stack Observability, we found that 77 per cent of global technologists are experiencing greater levels of complexity as a result of the acceleration of cloud computing initiatives during the pandemic. And 78 per cent cited technology sprawl and the need to manage a patchwork of legacy and cloud technologies as an additional source of complexity.

On the back of rapid digital transformation over the past year, technologists have rightly put even more focus on monitoring the entire IT estate, from customer-facing applications through to third party services and core infrastructure like network and storage. But whilst to a large degree their established monitoring approaches and tools have provided them greater visibility across traditional, legacy environments, they have been found wanting within new hybrid cloud environments.

The reason for this is that within a software-defined, cloud environment, nothing is fixed; everything is constantly changing in real-time. And that makes monitoring far more difficult.

Traditional approaches to monitoring were based on physical IT infrastructure - technologists knew they were operating five servers and 10 network wires - they were dealing with constants. This then allowed for fixed dashboards for each layer of the IT stack. But the nature of cloud computing is that organisations are continually scaling their use of IT up and down, according to business need. For instance, a company might be using two servers to support a customer-facing application, but then suddenly increase that to 25 servers to meet a surge in demand in real-time, before dropping back down to five a few hours later while adapting its network and storage infrastructure along the way.
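
To make that dynamic concrete, here is a minimal sketch of the kind of proportional scaling rule cloud platforms apply. It is illustrative only: the function name, thresholds and fleet sizes are stand-ins, not any vendor's actual autoscaling API.

    # Illustrative proportional autoscaling rule (hypothetical, not a real API).
    def desired_instances(current, utilisation, target=0.6, min_n=2, max_n=25):
        """Move the fleet size so average utilisation heads toward the target."""
        if utilisation <= 0:
            return min_n
        wanted = round(current * (utilisation / target))
        return max(min_n, min(max_n, wanted))

    # Sustained demand walks the fleet up from 2 servers; a lull shrinks it again.
    fleet = 2
    for utilisation in (0.95, 0.95, 0.95, 0.95, 0.20):
        fleet = desired_instances(fleet, utilisation)
        print(fleet)  # prints 3, 5, 8, 13, 4

A rule like this changes the monitored population from one evaluation to the next, which is exactly what breaks fixed, per-server dashboards built for the five-servers-and-ten-wires world.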

Traditional monitoring solutions simply aren't designed for this dynamic use of infrastructure as code, and that means most technologists can no longer get visibility of their full IT stack health in a single pane of glass. In fact, three-quarters of technologists now report they are being held back because they have multiple, disconnected monitoring solutions, and worryingly, more than two-thirds admit they now waste a lot of time as they can't easily isolate where performance issues are actually happening. The acceleration of cloud computing initiatives is undoubtedly the major driver of this issue.

Looking ahead, technologists are under no illusions: the transition to the cloud is only going to gather pace, as organisations continue to prioritise digital transformation to get through the pandemic and exploit new opportunities in a turbulent marketplace.

Technologists are also fully aware that unless they find a way to gain greater visibility and insight into all IT environments, they will be unable to drive the rapid, sustainable digital transformation their organisations need. Indeed, 79 per cent of technologists state that they need to adopt more comprehensive observability tools to achieve their organisations' innovation goals.

Without genuine full-stack observability, technologists simply don't stand a chance of being able to quickly identify and fix technology issues before they impact end users and the business.

IT and business leaders need to recognise that unless they address this issue now, they are jeopardising all of their efforts and investment in digital transformation. Organisations can develop the most innovative, cloud-based applications for their customers and staff, but unless their technologists have the right level of visibility and tools to optimise IT performance in real-time, then they will never be able to deliver faultless digital experiences.

Technologists need to be able to monitor all technical areas across their IT stack, including within cloud environments, and to directly link technology performance to end user experience and business outcomes, so they can prioritise actions and focus on what really matters to the business. Get this right, and then organisations really can start to take full advantage of the cloud.

James Harvey is EMEAR CTO at Cisco AppDynamics

Visit link:
The acceleration to cloud is causing a monitoring migraine - http://www.computing.co.uk


A Chance to Tap Cloud & 5G Through the Upcoming iShares’ ETF – Zacks.com

Cloud computing and 5G have been hot investing areas lately thanks to higher demand and stupendous stock market gains of the industry players. No wonder, iShares Trust is on its way to launch an ETF on the dual concepts. The name of the proposed fund is the iShares Cloud 5G and Tech ETF (IDAT).

The iShares Cloud 5G and Tech ETF looks to track the investment results of an index composed of companies from developed and emerging markets that could benefit from providing products, services, and technologies related to cloud computing and 5G. The fund would charge 47 bps in fees (read: 5 Most-Crowded Trades & Their Winning ETFs).

Cloud computing is a process in which data or software is stored outside of a computer, but can be easily accessed anywhere, at any time via the Internet. This idea is effective as it helps firms to lower IT costs by eliminating the need for servers and related maintenance costs.

In the wake of the pandemic, cloud technology adoption is projected to witness robust growth in sectors where work-from-home initiatives are sustaining business functions. Globally, end-user spending on public cloud services is forecast to grow 23% in 2021 to a total $332.3 billion, according to Gartner (read: A Comprehensive Guide to Cloud Computing ETFs).

On the other hand, 5G, the next era of smarter, faster and more efficient wireless technology, has lately picked up pace. The initial round of rollouts has been gathering steam globally. It is operational in many major cities in the United States, as well as places in China, South Korea and the United Kingdom, among other countries.

Carriers are busy building foundations. Phone makers have also started launching 5G-enabled handsets. Investors should note that apart from the faster usage of mobile networks, 5G is going to strengthen the mechanism of the growing Internet of Things (IoT) so that a human-to-object interaction can be set up smoothly (read: 5G Gaining Immense Traction: ETFs to Bet On).

Defiance Next Gen Connectivity ETF (FIVG), Pacer Benchmark Data & Infrastructure Real Estate Sector ETF (SRVR), ALPS Disruptive Technologies ETF (DTEC) and First Trust Indxx NextG ETF (NXTG) are some of the ETFs that have exposure to 5G-enabled companies.

On the other hand, First Trust Cloud Computing ETF (SKYY), Global X Cloud Computing ETF (CLOU), WisdomTree Cloud Computing ETF (WCLD) and Wedbush ETFMG Global Cloud Technology ETF (IVES) are the ETFs that thrive on cloud computing.

Hence, there is tough competition in the space, though two concepts in one fund could be a winning proposition for IDAT, if it is ever approved.


Read more here:
A Chance to Tap Cloud & 5G Through the Upcoming iShares' ETF - Zacks.com


Cloud migration and the catch-22 conundrum – ITWeb

Cloud computing is changing the way we do business. It offers increased scalability, flexibility and the opportunity to easily collaborate with fellow workers, customers and other stakeholders.

Cloud computing also enables software homogenisation across the business, giving every staff member access to the same current and updated software.

Today, increasing numbers of businesses are using cloud computing services in various forms. According to Gartner research, by 2022, almost 90% of all businesses will operate in the cloud to a greater or lesser extent.

While the remote-working trend has largely driven many organisations to adopt a cloud-based philosophy, it has also given them access to the big business benefits of increased processing power and improved data storage capacities.

However, as important and worthy as investments in cloud technologies are, there are challenges associated with cloud-migration moves.

For example, in their haste to adopt cloud-based solutions, some companies are putting their remaining traditional on-premises infrastructures in jeopardy by placing scheduled network updates on the back-burner.

Conversely, other organisations minimise their cloud migration efforts in a bid to extend the life of existing traditional network assets in order to leverage the capital investments made in them.

The nett result is an increase in obsolete and unpatched devices containing software vulnerabilities, leaving a number of networks exposed to information security threats.


As acclaimed research scientist and author Daniel Hein says: "If your business isn't prepared to deal with the challenges of cloud migration, then it could be costly and dangerous for you and your data."

In short, outdated hardware and applications create vulnerabilities, making them easy targets for hackers whose goal is to infiltrate these increasingly flawed networks.

There are other circumstances that can impact cloud migration and help prolong the life of outdated infrastructures.

For instance, some organisations take the view that cloud migration, with the addition of new cloud assets and networked devices, will introduce a significant degree of complexity into their IT operations.

They then plan for any cloud migration activities to take effect only after existing IT staff have reached the skill levels required to manage, integrate and maintain the processes.

Moreover, the upskilling of these employees is often seen as an addition to their current responsibilities for on-premises IT management and maintenance, which may suffer as a result.

Research findings reveal that more than 50% of companies find cloud migration more difficult than expected, with projects exceeding budgets and missing completion deadlines. This is particularly true for organisations burdened with older on-premises implementations.

This creates a catch-22 situation with businesses holding on to aging, underperforming IT platforms, hoping to postpone the evil day when a move to cloud computing becomes imperative.

However, as many network managers will confirm, the older the technology, the more costly it becomes to effect an update or repair. Therefore, a reliance on outdated solutions will negatively impact business agility and limit an organisation's ability to adapt quickly to market changes such as the work-from-home movement.

Similarly, such an imprudent strategy will also impair an organisation's capability to respond rapidly to changing customer demands.

Of course, there are isolated instances where cloud migration may be delayed by special circumstances, such as a reliance on proprietary technology which, for legal reasons, may be unable to be deployed to the cloud.

Against this backdrop, making the move away from a traditional IT environment to cloud computing must be seen as a major step, with decision-making certain to impact the company in many areas, including but not limited to bottom-line profitability and medium- to long-term growth.

As Ron Lopez, executive vice-president of NTT, notes in a published statement: "The network is the platform for business digital transformation. It needs to be ubiquitous, flexible, robust and secure to adapt easily to business change, while increasing the maturity of the operational support environment."

In this light, businesses are best advised to take the earliest opportunity to appraise strategies related to their network and security architectures, and review plans for operating and support models. The objective should be to better manage operational risk and achieve a degree of maturity in operational support structures.

In most cases, the advice of specialists is necessary to assist organisations in the planning phase ahead of their cloud journey, which needs to be a smooth and seamless experience.

As eminent IT industry luminary Josh LeSov says: "The biggest challenge companies face when migrating to the cloud is their preparedness. You need to work with a seasoned implementation team that has strong project management skills, system experience and industry expertise.

"Additionally, this team needs to be able to stick around after the implementation is done, since ongoing support is always required."

Full visibility into an organisation's IT infrastructure before, during and after cloud migration is imperative, as is the adoption of modern technologies and techniques in order to eliminate potential pitfalls in the process which might otherwise compromise data, applications and day-to-day business activities.

Importantly, with budgets under ever-increasing pressure, costs must be accurately predicted and expertly managed. It is vital for in-house and consulting teams focusing on IT, security and operations to be on the same page in order to create a successful cloud migration blueprint.

View original post here:
Cloud migration and the catch-22 conundrum - ITWeb


The future of asset health is in the cloud – Canadian Mining Journal

Emerging technologies in predictive maintenance demand a cloud infrastructure for their unique capabilities: remote data storage and aggregation, machine learning, and IIoT-based automation. Credit: Wenco

Asset health technologies have transformed the reliability of mining equipment over the past generation. By tapping into the equipment's onboard sensors, maintenance teams can observe and record hundreds of parameters that indicate equipment health. Understanding this data and its effects has empowered mines to expand mean time before failure (MTBF), uptime, and other maintenance KPIs more than any tools in recent memory.

Yet, these technologies have their limitations. When installed exclusively on premises, asset health systems miss the advantages available with the power of cloud computing. In 2021, many innovations in predictive maintenance demand a cloud infrastructure and its unique capabilities to deliver optimal value. Remote data storage and aggregation, access to machine learning algorithms, and IIoT automation all rely on cloud technologies that are increasingly necessary elements in a forward-thinking mine maintenance program.

Fortunately, advances in data processing and communications technologies are making cloud solutions more viable for the mining industry. While traditionally resistant to cloud implementations, mines are now leveraging the capabilities of cloud computing, and their maintenance departments are seeing the benefits. New solutions are empowering maintenance teams to do their jobs better in ways that were impossible a few years ago: predicting component fatigue from early warning signs at the edge, observing changes in equipment performance on a continuous basis, and even collaborating with OEMs on proactive asset management that leverages integrated digital platforms.

Real-time analytics, now at the edge

Edge devices installed on mobile and plant equipment are the point of entry for much of the data in any asset health infrastructure. Traditionally, these low-powered hardware units provided simple data processing near the source of operation, streaming that information to a cloud server for aggregation with other datasets and cross-platform analysis.

While this configuration can work well, the wealth of sensors and data now available to mines and their maintenance teams often proves too voluminous and costly to manage in this way. Bandwidth restrictions and communication costs mean that traditional cloud infrastructures struggle to handle the requirements of emerging IIoT systems. Instead, new solutions see more and more calculations happening at the edge itself.

Long-established vendors like Emerson, as well as startups like FogHorn, are bringing advanced capabilities like analytics and AI to lightweight devices near the source of a data stream. Today's edge devices are able to take raw sensor data (temperature, pressure, vibration, events, and more) and perform complex computations independent of a powerful cloud server. Data ingestion, processing, and reporting can now happen near the source, providing real-time, cost-effective insights to maintenance personnel. After that time-sensitive information has been communicated, the systems can publish compressed data to their cloud counterparts for richer analysis and long-term storage.
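
As a rough illustration of that division of labour, an edge device might summarise raw samples locally, raise time-sensitive alerts immediately, and ship only compressed batches upstream. The topic names and publish() call below are hypothetical stand-ins for an MQTT or cloud IoT client, not any vendor's product.

    import json
    import statistics
    import zlib

    def summarise(readings, vibration_limit=4.0):
        """Reduce a window of raw vibration samples to a small edge-side summary."""
        peak = max(readings)
        return {"mean": statistics.fmean(readings), "peak": peak,
                "alert": peak > vibration_limit}

    def publish(topic, payload):
        # Stand-in for a real message-broker client; just shows what would be sent.
        print(f"{topic}: {len(payload)} bytes")

    window = [3.1, 3.4, 5.2, 3.0, 2.9]           # raw samples stay at the edge
    summary = summarise(window)
    if summary["alert"]:                          # time-sensitive: send at once
        publish("site/truck-17/alerts", json.dumps(summary).encode())
    # Richer analysis and long-term storage happen in the cloud, so the full
    # window goes up compressed, on a slower schedule.
    publish("site/truck-17/history", zlib.compress(json.dumps(window).encode()))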

"It's a two-way street," says Vien Dang, asset health specialist for Wenco International Mining Systems. "Edge and cloud solutions work together. You train edge devices using a cloud-hosted model of what a healthy equipment unit looks like, then set it loose to respond to real-world applications.

"Reliability teams get clean, accurate reporting quickly so they can respond quickly. Then, that data feeds up to the cloud, improving the model they started with. Over time, the whole process gets faster, more accurate, and more responsive with very little latency or bandwidth issues."

Digital twins deliver precise, specific asset health modelling

Today's inexpensive sensors and edge devices can easily produce vast streams of data, but making sense of it is another challenge. Often, maintenance teams have access to volumes of data, but lack useful information to diagnose emerging problems and intervene to prevent failures.

Rithmik Solutions is changing that. The Montreal-based company's Asset Health Analyzer (AHA) uses machine learning and a rapid analytics infrastructure to create accurate, site-specific equipment health baselines that enable early detection and diagnosis of maintenance issues.

Other asset health technology may claim to enable early issue detection, but AHA analytics go beyond manual error thresholds and standard AI models. In effect, AHA uses a multi-tiered AI approach with digital twins, which act as virtual companions for the entire equipment fleet. This approach fundamentally transforms a mine's preventive maintenance program, letting technicians follow component health on an ongoing basis and examine the exact condition of monitored parts before pulling equipment down for maintenance.

"There are a lot of advantages to embedding digital twins within a multi-layered AI approach," says Amanda Truscott, co-founder and CEO of Rithmik Solutions. "Earlier alarms without any threshold setting, insight about what's going wrong, what's about to go wrong, and what went wrong in the past, the ability to prioritize maintenance based on actual equipment health."

AHA uses machine learning to quickly build a contextualized baseline for the best-performing equipment at the mine. It then monitors equipment for any difference from that tuned-in normal state, providing deep and early insights into equipment issues so mines can prevent small problems from escalating. By maintaining models of standard equipment in this way, AHA also allows for cross-asset comparison, highlighting how like assets are similar and how they vary.
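
AHA's models are proprietary, but the underlying idea of a learned baseline plus deviation monitoring can be sketched in a few lines. The readings and the three-sigma cut-off below are invented for illustration and are not Rithmik's actual method.

    import statistics

    def fit_baseline(healthy_readings):
        """Learn the 'tuned-in normal state' from the best-performing equipment."""
        return statistics.fmean(healthy_readings), statistics.stdev(healthy_readings)

    def deviation(value, baseline):
        """Distance from normal, in standard deviations, with no hand-set
        alarm threshold on the raw values themselves."""
        mean, std = baseline
        return abs(value - mean) / std

    baseline = fit_baseline([81.0, 82.5, 80.7, 83.1, 81.9])   # coolant temp, degC
    for reading in (82.0, 86.0, 91.5):
        score = deviation(reading, baseline)
        flag = " <- early warning" if score > 3 else ""
        print(f"{reading}: {score:.1f} sigma{flag}")

The same fitted baseline also supports the cross-asset comparison the article mentions: scoring one unit's readings against a sibling unit's baseline highlights how like assets differ.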

Trials of AHA have already shown strong results, providing alarms hours or even days ahead of OEM alerts. In one case, rod-bearing failures on Cat 793Ds were costing a site in Canada $4 million a year due to a late OEM warning coming only a few minutes before the failure occurred. AHA was able to find indicators of those failures 10 hours earlier, a relative lifetime for maintenance to intervene.

"In another recent trial in collaboration with our partner Wenco's digital platform, our Asset Health Analyzer rapidly uncovered a customer's fleet-wide inefficiency that had gone undetected for multiple years by both the equipment dealer and the mine maintenance team," said Kevin Urbanski, co-founder and CTO of Rithmik Solutions.

"What had happened was that temperature regulators failed on 76% of the mine's haul truck fleet. Fixing the issue is going to both extend the life of the engines and result in significant fuel savings."

Urbanski says AHA also pulled out previously unknown failure mode indicators on two separate chronic machine issues, which Rithmik and its customer are now using to generate earlier alerts of the failure modes. These insights are also providing a deeper understanding of the total impact of these failure modes on the machines themselves.

Cloud platforms create an ecosystem of partners in mine asset health

Cloud-based platforms are another emerging development in asset health. While digital portals are already common in medicine, entertainment, and enterprise business systems, they are new for mine maintenance.

The concept mirrors existing asset health systems: Sensor data streams to a server, which processes and reports real-time or historical information that maintenance technicians use to understand equipment condition. However, transferring this data to a secure cloud platform instead of an on-premises server opens up many opportunities for mining companies, including access to IIoT and AI-based analysis and stronger collaboration with OEM dealers.

Wenco and Hitachi Construction Machinery (HCM) are currently developing such a cloud-based solution, known as ConSite Mine. Operating on a digital IIoT platform, ConSite Mine remotely aggregates and processes the large volume of data associated with asset health for every installed unit at a mine site, displaying it on a customized dashboard for each customer.

ConSite Mine dashboard. Advanced digital technology helps extend equipment life and improve productivity and safety by providing the information to predict issues, such as visualizing signs of structural cracks. Credit: Hitachi Construction Machinery

Existing asset health systems may also allow customers to monitor equipment health in real time and anticipate issues before they occur, but a cloud solution like ConSite Mine enables the participation of partners outside the walls of the maintenance facility. With ConSite Mine, HCM dealers are able to remotely monitor equipment health in conjunction with their customers, leveraging their expertise and forging a partnership in keeping units running. Dealer technicians supporting their customers can proactively analyze asset health information through the online dashboard, then pre-order parts and schedule planned maintenance, avoiding the costs and delays of unplanned downtime from failed equipment.

"There are so many opportunities with a digital solution like ConSite Mine," says Dang. "For example, the system can detect signs of a pending failure of an excavator's hydraulic pump, then let the customer and HCM dealer know well ahead of time. The dealer can check their parts inventory, order a replacement, and schedule the install from their office, taking the pressure off the mine's maintenance team.

"That one preventive intervention could save the mine $1 million, easy."

Maintenance and operations data can feed into these emerging cloud platforms, enabling mine personnel, dealers, and consultants to investigate root causes, perform failure modes and effects analysis, and contribute to improved policies and structural designs of future equipment. Taking it further, cloud platforms like ConSite Mine are able to integrate services from other OEMs and third parties, creating an ecosystem of partners all working in support of the mines business objectives.

"By bringing in OEMs and third parties, maintenance teams aren't going it alone anymore," says Dang. "They have specialists who are the most knowledgeable people in the world working with them 24/7 to extend their MTBF and reduce downtime.

"And, really, it's only feasible with the cloud."

Devon Wells is the corporate marketing manager for Wenco International Mining Systems, a Hitachi Construction Machinery group company. To learn more, visit http://www.wencomine.com

Original post:
The future of asset health is in the cloud - Canadian Mining Journal


India's public cloud spending on a roll – ComputerWeekly.com

India's spending on public cloud services reached $3.6bn in 2020, as more businesses in the subcontinent turn to cloud computing to ride out the ongoing pandemic, according to IDC.

Much of the growth came in the second half of the year, where revenue from cloud-based infrastructure, platform and applications totalled $1.9bn. The overall Indian public cloud services market is expected to reach $9.5bn by 2025, representing a compound annual growth rate of 21.5%.
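
As a quick sanity check on those figures (the arithmetic is ours, not IDC's): growing $3.6bn to $9.5bn over the five years from 2020 to 2025 implies a compound annual growth rate of (9.5 / 3.6)^(1/5) - 1 ≈ 0.214, or about 21.4% a year, which matches the quoted 21.5% once rounding in the dollar figures is taken into account.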

Rishu Sharma, principal analyst for cloud and artificial intelligence at IDC India, noted the critical role that public cloud services played for organisations in 2020 as enterprises looked to build digital resiliency.

"Cloud will become crucial as organisations expedite the development process and deployment of business applications to meet the changing work and business environment," she added.

Cloud-based applications made up the lion's share of overall public cloud spending, followed by cloud-based infrastructure and platforms. According to IDC, the top two service providers had 49% of the Indian public cloud services market in 2020.

"Even though enterprises in the country have been discussing cloud adoption for the past few years, the Covid-19 pandemic forced enterprises to expedite their cloud strategy. This accelerated cloud adoption in the country by several years," said Harish Krishnakumar, senior market analyst at IDC India.

"Businesses started adopting cloud to host a wide array of applications, ranging from e-mail servers to many complex systems like data warehousing and advanced analytics. There was also an increased migration of enterprise applications to the cloud," he added.

Indian organisations are already using public cloud services in a big way. Tata Capital, for example, is using virtual assistants such as Amazon Alexa to deliver services, while the National Commodity and Derivatives Exchange has migrated 50 applications to AWS after a fire in 2018.

India's growing demand for public cloud services has drawn major cloud suppliers to shore up their investments in the country.

In November 2020, Amazon Web Services said it was investing $2.8bn in a second cloud region in India, while Microsoft has teamed up with Indian telecoms giant Jio to deliver cloud infrastructure services through two new datacentres being built in Gujarat and Maharashtra.

Not to be outdone is Google, which will open a Delhi cloud region by 2021, its second one in India since it launched its Mumbai cloud region in 2017.

Google said the new region will enable Indian organisations to take advantage of its big data and infrastructure services onshore while complying with Indias data laws and regulations.

Excerpt from:
India's public cloud spending on a roll - ComputerWeekly.com


Cognitive Cloud Computing Market Next Big Thing: Major Giants Google, 3M, Microsoft – The Shotcaller

A latest intelligence report published by AMA Research, titled Cognitive Cloud Computing Market Outlook to 2026, is a detailed study accumulated to offer the latest insights about acute features of the global Cognitive Cloud Computing market. This report provides a detailed overview of key factors in the Cognitive Cloud Computing market, such as drivers, restraints, past and current trends, regulatory scenarios and technology development. A thorough analysis of these factors, including the economic slowdown, local and global reforms and the impact of COVID-19, has been conducted to determine future growth prospects in the global market.

Definition: Cognitive computing is the use of computerized models to simulate the human thought process in complex situations where the answers may be ambiguous and uncertain. The phrase is closely associated with IBM's cognitive computer system, Watson. Cognitive computing overlaps with AI and involves many of the same underlying technologies to power cognitive applications, including expert systems, neural networks, robotics and virtual reality (VR).

Major players in this report include: 3M (United States), Google LLC (United States), Hewlett Packard Enterprise Development LP (United States), International Business Machines Corporation (United States), Microsoft Corporation (United States), Nuance Communications Inc. (United States), Oracle Corporation (United States), SAP SE (United States), SAS Institute Inc. (United States), Tibco Software Inc. (United States).

Free Sample Report + All Related Graphs & Charts @ https://www.advancemarketanalytics.com/sample-report/161546-global-cognitive-cloud-computing-market

Market Trends:

Market Drivers:

Market Opportunities:

The Cognitive Cloud Computing Market segments and Market Data Break Down are illuminated below: by Deployment Type (On-Premise, Cloud), Organization Size (SMEs, Large enterprises), Technology (Natural Language Processing, Machine Learning, Automated Reasoning, Others), Industry Vertical (Healthcare, BFSI, Retail, Government & Defense, IT & Telecom, Energy & Power, Others)

The manufacturing cost structure analysis of the Cognitive Cloud Computing market is based on the core chain structure, engineering process, raw materials and suppliers. The manufacturing plant has been developed for market needs and new technology development. In addition, Cognitive Cloud Computing market attractiveness according to country, end-user, and other measures is also provided, permitting the reader to gauge the most useful or commercial areas for investments. The study also provides a special qualitative chapter designed to highlight issues faced by industry players in their production cycle and supply chain. The overall estimates and sizing, and the various tables and graphs presented in the study, give an impression of how big the impact of COVID has been.

Enquire for customization in Report @: https://www.advancemarketanalytics.com/enquiry-before-buy/161546-global-cognitive-cloud-computing-market

Geographically, the world Cognitive Cloud Computing market can be classified into North America, Europe, Asia Pacific (APAC), the Middle East and Africa, and Latin America. North America has gained a leading position in the global market and is expected to remain in place for years to come. The growing demand for cognitive cloud computing will drive growth in the North American market over the next few years.

In the last section of the report, the companies responsible for increasing the sales in the Cognitive Cloud Computing Market have been presented. These companies have been analyzed in terms of their manufacturing base, basic information, and competitors. In addition, the application and product type introduced by each of these companies also form a key part of this section of the report. The recent enhancements that took place in the global market and their influence on the future growth of the market have also been presented through this study.

Report Highlights:

Strategic Points Covered in the Table of Contents of the Cognitive Cloud Computing Market:

Chapter 1: Introduction, market driving force, product objective of study and research scope of the Global Cognitive Cloud Computing market

Chapter 2: Exclusive Summary, the basic information of the Global Cognitive Cloud Computing market

Chapter 3: Changing Impact on Market Dynamics: Drivers, Trends and Challenges & Opportunities of the Global Cognitive Cloud Computing market; Post-COVID Analysis

Chapter 4: Presenting the Global Cognitive Cloud Computing Market Factor Analysis, Post-COVID Impact Analysis, Porter's Five Forces, Supply/Value Chain, PESTEL analysis, Market Entropy, Patent/Trademark Analysis.

Chapter 5: Displaying the market by Type, End User and Region/Country, 2015-2020

Chapter 6: Evaluating the leading manufacturers of the Global Cognitive Cloud Computing market which consists of its Competitive Landscape, Peer Group Analysis, BCG Matrix & Company Profile

Chapter 7: To evaluate the market by segments, by countries and by Manufacturers/Company with revenue share and sales by key countries in these various regions (2021-2026)


Buy this research @ https://www.advancemarketanalytics.com/buy-now?format=1&report=161546

Key questions answered

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, such as North America, Middle East, Africa, Europe or LATAM, and Asia.

Contact Us:

Craig Francis (PR & Marketing Manager)
AMA Research & Media LLP
Unit No. 429, Parsonage Road
Edison, NJ, New Jersey, USA 08837
Phone: +1 (206) 317 1218
sales@advancemarketanalytics.com

Connect with us at:
https://www.linkedin.com/company/advance-market-analytics
https://www.facebook.com/AMA-Research-Media-LLP-344722399585916
https://twitter.com/amareport

View post:
Cognitive Cloud Computing Market Next Big Thing: Major Giants Google, 3M, Microsoft - The Shotcaller


AI, cloud to bring about ‘next generation’ of GAO oversight – Federal News Network

The Government Accountability Office is resolute in its commitment to transforming its oversight through artificial intelligence and cloud systems.

GAO's chief scientist Tim Persons said in an interview with Federal News Network that these emerging capabilities have transformed analytics within the agency. Users can now search keywords to yield specific paragraphs, and interactive dashboards enable staff to immerse themselves in different pockets of data.

"We have special authorities and access as an agency into just the entire array of federal government problems," Persons said of GAO, as part of Federal Monthly Insights: Cloud and Artificial Intelligence. "We're trying to make good government, better government ... And a lot of that is computing, or converting, questions into answers in the state-of-the-art way."

In the past year, for example, GAO has worked closely with the General Services Administration's Centers of Excellence program to build a cloud infrastructure for better analytics. A cloud-based system, refined in the GAO's Innovation Lab, stands out as one example of technology that's ushering in the next generation of oversight.

"We didn't invent analytics by coming up with the Innovation Lab," Persons said on Federal Drive with Tom Temin. "It was a different environment: sandbox, agile, cloud-based, new tools inclusive of AI."

Another instance of this innovation became visible within Operation Warp Speed. As the country inched closer to a vaccine rollout, GAO created a data analytics vaccine dashboard using cloud services. The agency took data from different health enterprises, such as the National Institutes of Health, and provided updates in real time to Congress and the White House.

"It was nice to have this state-of-the-art, cloud-based, real-time updatable type thing, which we think is a model for, is exciting for, what we can do in the future," Persons said.

In GAO's Innovation Lab, Persons and his team build their technology through a process of reverse engineering. They look at the deficiencies in government oversight, then build capabilities like AI in the cloud to solve those problems.

But GAO won't utilize AI in a way that entirely eliminates human involvement, Persons said. While machine learning has a multitude of applications, Persons envisions a human-centered future in oversight rather than a droid-centered one.

Improving this technology requires constant iterating, though. Persons said the Innovation Lab includes a sandbox environment, where analysts and investigators can experiment in a trial-and-error fashion.

"When AI fails (not if, [but] when it fails) you understand why it fails, and you iterate and fix the problem and then drive toward a better solution," Persons said. "It's a different mindset than what often is in the federal government, that failure is not an option."

The process of refinement becomes especially important within GAO, which has a higher risk profile around its data than many agencies.

"GAO are the stewards of everyone else's data," Persons said. "Cloud often sounds like I'm just going to dump all this in a data lake; it's not that at all. You're going to have a nice, strong data governance system."

These innovations will also transform analytics within GAO to a less hands-on system for chief information officers and other employees. Systems will require less day-to-day management, as Persons said he believes the shift to a cloud system will reduce the agency's on-premise data center footprint.

Converting to cloud, however, also means reskilling the workforce. Already, staff have trained to become more data literate. And in the Innovation Lab, workforce training is as critical as the technology itself, Persons said.

See the article here:
AI, cloud to bring about 'next generation' of GAO oversight - Federal News Network


Google Cloud to start hosting some parts of YouTube platform – DIGIT.FYI

Tech firm Google has announced it intends to move some parts of video platform YouTube onto its Google Cloud systems.

YouTube is currently run on internal computer systems held at the tech firm's data centres. However, Google said last week it wants to begin moving across to the cloud as it looks to expand further into the cloud-computing market.

Migration would also help the firm to become less reliant on advertisements within searches and on videos.

In an interview with CNBC, Google Cloud CEO Thomas Kurian said: "Part of evolving the cloud is having our own services use it more and more, and they are. Parts of YouTube are moving to Google Cloud."

Speaking to the US broadcaster, Kurian was not clear on the timeframe of the move to the Google Cloud platform, the amount of YouTube's data being migrated or what parts would be transferred.

Google has historically used a hybrid storage system, allowing its data centres to coexist with its cloud platform, and so far has made little attempt to fully migrate its larger properties to its public cloud. Currently, smaller programmes like Waze, Google Workspace and DeepMind use Google cloud infrastructure.

And YouTube is certainly a big platform to start with. Google acquired YouTube in 2006 in a deal worth around $1.65 billion, and it is currently the second-largest website online. The platform boasts a huge number of viewers per month, with current estimates at more than 2 billion.

Google's move to migrate large elements of its empire across to its cloud service now brings it more in line with competitors Amazon and Microsoft, who are both huge players in the cloud computing market.

The cloud is fast becoming a viable option for storage purposes, with other services like Amazon Web Services (AWS) being used by thousands of companies around the world. And the cloud can be massively valuable for firms, particularly during the Covid-19 pandemic when revenue at AWS grew by 32% to $13.5bn.

Google Cloud is now being recognised as a potentially important part of the fintech sector in Scotland, with the announcement in November 2020 that the service has been welcomed by FinTech Scotland into the country's fintech cluster to help the growth of the country's SME community.

In January, Edinburgh University became the first in Scotland to announce the migration of its core IT systems to the Oracle Cloud.

The three-phase implementation project was delivered with computer consultancy firm Inoapps, with the first stage of the university's People and Money programme now live in the Oracle Cloud.

The shift to cloud-based storage processes will be a key theme at the upcoming Cloud First Virtual Summit, held on 23rd June.

The conference will bring together senior technologists, Cloud architects and business transformation specialists to explore new advancements and best practice.

Register your free place now at: www.cloudfirstsummit.com



Read more:
Google Cloud to start hosting some parts of YouTube platform - DIGIT.FYI


Caltech Undergrad Wins Dual Computer Science Awards – Caltech

Undergraduate student Laura Lewis, who is majoring in math and computer science, has earned a pair of national computer science awards.

In April, Lewis, a rising junior from Chester Springs, Pennsylvania, won both the qBraid Technical Challenge in the Quantum Coalition Hack and the National Center for Women and Information Technology (NCWIT) 2021 Collegiate Award. At Caltech, Lewis is advised by Elena Mantovan, professor of mathematics; Claire Ralph, lecturer in computing and mathematical sciences; and Thomas Vidick, professor of computing and mathematical sciences.

The Quantum Coalition Hackathon is a global quantum computing contest organized by Yale and Stanford, and sponsored by Google Quantum AI, IBM Quantum Computing, Microsoft, Amazon, IonQ, qBraid, and other private companies investing in the quantum computing field. There were more than 2,100 participants from more than 70 countries. Lewis, whose primary area of scholarly interest is quantum computing, found out about the contest through an email from the Caltech Physics Club and decided to check it out.

During the event, which ran April 10-11, Lewis and her fellow participants chose from a list of challenges, which they then had 24 hours to tackle. Lewis chose a problem related to Shor's algorithm.

Shor's algorithm, published in 1994 by mathematician Peter Shor (BS '81), describes how, in theory, to factor incredibly large numbers efficiently using quantum computers. It is believed that Shor's algorithm will be the downfall of RSA cryptography, a widely used secure data transmission system that creates public and private keys. The private keys are prime numbers, and the product of those two numbers is the public key. Anyone can encrypt information using the public key, but once they have, the information can only be decrypted using the private keys. The system relies on the fact that it is time-consuming and computationally intensive to factor the product of two prime numbers to determine those private keys. However, with a functional quantum computer and Shor's algorithm, the process could be sped up to the point that RSA cryptography could be easily cracked.
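
As a toy illustration of the scheme just described (deliberately tiny textbook primes, with no cryptographic value), the whole system rests on the fact that n is public while p and q stay hidden:

    p, q = 61, 53               # the private primes
    n = p * q                   # the public modulus: 3233
    phi = (p - 1) * (q - 1)     # 3120
    e = 17                      # public exponent, coprime to phi
    d = pow(e, -1, phi)         # private exponent: 2753, since e*d = 1 (mod phi)

    message = 1234
    ciphertext = pow(message, e, n)    # anyone can encrypt with the public key
    recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
    assert recovered == message
    # Shor's algorithm threatens this because recovering p and q from n is
    # exactly the factoring problem it makes efficient.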

Lewis has been studying Shor's algorithm and quantum cryptography with Vidick at Caltech's Institute for Quantum Information and Matter (IQIM), a National Science Foundation Physics Frontiers Center where researchers study physical systems in which the effects of the quantum world can be observed on macroscopic scales.

"We are currently working on quantum verification: you have a quantum computer, make it do a computation, then have a normal computer check that the computation is correct, efficiently," Lewis says. "Known protocols for this require functions with special properties, and to compute these functions, we utilize modular arithmetic, a type of mathematics that is important in cryptography. So, a large part of my work with Dr. Vidick is figuring out how to perform modular arithmetic on a quantum computer."

So far, computer scientists have been able to factor the number 15 on a quantum computer by hardcoding the modular arithmetic. "I thought, 'OK, that seems not very good, and I'm going to try something new,'" Lewis says. She developed an implementation for a quantum computer that is not just meant to factor a single number by generalizing this modular arithmetic; in theory, she says, it is capable of factoring any number. "QBraid told me that the reason they picked me as the winner is that they'd never seen a fully general implementation of a factoring program for a quantum computer," Lewis says.
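
Lewis's quantum implementation is not reproduced here, but the classical skeleton that Shor's algorithm hangs on is easy to show: factoring N reduces to finding the period of a^x mod N. A quantum computer finds that period efficiently; the brute-force loop below merely stands in for that step, for illustration only.

    from math import gcd

    def find_period(a, n):
        """Smallest r > 0 with a**r = 1 (mod n); brute force stands in
        for the quantum order-finding subroutine."""
        r, value = 1, a % n
        while value != 1:
            value = (value * a) % n
            r += 1
        return r

    def factor_via_period(n, a):
        assert gcd(a, n) == 1, "a must be coprime to n"
        r = find_period(a, n)
        assert r % 2 == 0, "odd period: retry with a different a"
        half = pow(a, r // 2, n)
        assert half != n - 1, "trivial square root: retry with a different a"
        return gcd(half - 1, n), gcd(half + 1, n)

    print(factor_via_period(15, 7))   # (3, 5): the hardcoded demo case, generalised

The generalisation Lewis describes amounts to performing the modular arithmetic inside find_period reversibly on qubits for an arbitrary N, rather than wiring the circuit up for 15 alone.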

Coming off of that success, Lewis was also honored by the National Center for Women and Information Technology (NCWIT), a group she has been involved with since high school. "They have a great network for women in technology, which is important," Lewis says.

The award recognizes "technical contributions to projects that demonstrate a high level of innovation and potential impact," according to NCWIT. The application for the award included a presentation of the project to a general audience. "Presenting about quantum computing to a general audience in under seven minutes was a challenge," Lewis says.

Lewis's project was on verifiable quantum computation. Quantum computers exist, but they are "noisy"; that is, heat and electromagnetic noise can disrupt the functioning of quantum bits, or "qubits." Lewis's project, titled "Implementing Remote-State Preparation on a Noisy Intermediate-Size Quantum Device," addresses the issue of verifying that a computation performed on a quantum computer is correct.

Lewis has a longstanding interest in math and science; it started with a contest she tackled in middle school to see who could learn the programming language Python the fastest. She picked it up quickly and started learning other languages from there. In high school, Lewis was the head programmer on a robotics team and went to the FIRST (For Inspiration and Recognition of Science and Technology) Robotics World Championship. "That's when I realized that I liked programming a lot," she says.

While applying for college, Lewis received offers from 11 other universities before selecting Caltech. "It was really the Caltech Up-Close Program that convinced me to come here," she says, referring to the annual program in which prospective undergraduate students from historically underrepresented backgrounds who have an interest in math, science, and/or engineering stay overnight in the student residences, meet current students, and interact with other members of the Caltech community. "I got to know admission staff like Jarrid Whitney [assistant vice president for student affairs, enrollment and career services] and Derek Terrell [former admissions officer]. They helped show me what a great environment there is here."

Lewis became interested in quantum computing at Caltech through one of the computer science option "pizza courses," lunchtime classes that explore a range of topics in-depth. In this one, Adam Wierman, professor of computing and mathematical sciences, explained the Caltech Information Science and Technology CS+X initiative, which leverages Caltech's expertise in computer science to advance other disciplines across campus. Specifically, he spoke about the possibilities created by the burgeoning field of quantum computing. "I thought, 'That's really interesting,' and emailed Professor Wierman, who then put me in touch with Professor Vidick," she says.

As the COVID-19 pandemic reached the United States and Caltech's undergraduate education moved online, Lewis and other students faced new challenges, including taking classes on West Coast time from her home in Pennsylvania. She looks forward to returning to campus in the fall.

"The Caltech undergraduate experience is truly one of a kind. After spending over a year away from campus, I can't wait to go back and physically be a part of the community again," says Lewis.

View post:

Caltech Undergrad Wins Dual Computer Science Awards - Caltech


ECSU Ranked One of Top Most Affordable Computer Science Programs in the Country – Elizabeth City State University

The college ranking website UniversityHQ.org has named Elizabeth City State University as having one of the most affordable computer science programs in the country.

According to the site's listing for 100 Best Affordable Computer Science School Degrees, ECSU took the number two spot due to a number of factors, including financial aid and tuition. ECSU is an NC Promise Tuition school, making it one of the most affordable universities in North Carolina.

"ECSU's Bachelor of Science in Computer Science, with concentrations in Data Science or Computer Information Systems, is one of the best kept secrets in Northeastern North Carolina," said Dr. Kenneth L. Jones, chair of the Department of Mathematics, Computer Science and Engineering Technology. "We're thrilled to see that the secret is getting out through this recognition."

Dr. Jones says the world is more computer-dependent now than ever before and there is a high demand for well-trained computing experts. The employment of computer scientists is expected to increase much faster than most other areas of employment. The U.S. Department of Labor estimates that graduates in computer science will have the best job prospects for the coming decades.

"Our program is preparing ECSU students for careers in 21st-century computing, with a solid foundation in computer science, offering them a knowledge base and skills to compete in a highly competitive world," said Dr. Jones. "Our curriculum provides students with exposure to a number of industry-standard subjects including Python and Visualization, Bioinformatics, Software Engineering, Data Mining and Machine Learning."

UniversityHQ's ranking criteria also include retention rates. Because student success is a top priority for ECSU, the university recently announced the launch of the VikingPlus program, a comprehensive set of initiatives to help students afford a high-quality college education.

The university will award new funds under VikingPlus this year and has already provided a total of nearly $4.2 million in free credits, additional emergency funding, and housing and meal plan grants since spring 2020.

UniversityHQ is an independent educational organization providing information for students looking to pursue a degree in higher education. For more information about ECSU's number two ranking, go HERE.


See the rest here:

ECSU Ranked One of Top Most Affordable Computer Science Programs in the Country - Elizabeth City State University
