
What is Cloud Computing: Definition, Career & Scope

Cloud computing is the delivery of computing services, including servers, storage, databases, networking, software, analytics, and intelligence, over the Internet to bring faster innovation, more flexible resources, and economies of scale. You generally pay only for the cloud services you use, which lowers your operating expenses, lets you run your infrastructure more efficiently, and allows you to scale as your company's needs evolve.

A cloud may be either public or private. Public cloud services are sold to anyone with an internet connection, whereas a private cloud is a network or data center that provides hosted services to a limited number of users with restricted access and rights. The purpose of cloud computing, whether private or public, is to give simple, scalable access to computing resources and IT services.

Cloud infrastructure refers to the hardware and software components required to properly implement a cloud computing system. Cloud computing is also known as utility computing and on-demand computing.

Here are some cloud computing examples: Dropbox, Salesforce, Cisco Webex, etc.

Put simply, cloud computing is the delivery of various services over the Internet, including data storage, servers, databases, networking, and software.

There are three main cloud computing service models, explained below:

Infrastructure as a Service (IaaS) means that a cloud service provider manages your infrastructure (the physical servers, network, virtualization, and data storage) and delivers it over an internet connection. The user gains access via an API or dashboard and essentially leases the infrastructure, handling the operating system, applications, and middleware, while the provider manages all the hardware, networking, hard drives, data storage, and servers, along with outages, repairs, and hardware concerns. This is the most common deployment model used by cloud storage providers.

Here are some examples of IaaS: DigitalOcean, Rackspace, Amazon Web Services (AWS), Linode, Microsoft Azure, Google Compute Engine (GCE), and Cisco Metacloud.
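
As a rough sketch of this division of responsibility (not taken from the article), the snippet below provisions a single virtual machine through an IaaS API. It assumes AWS, one of the providers listed above, with credentials already configured, and uses the boto3 SDK; the image ID is a placeholder. The point is simply that the user requests capacity through an API while the provider runs the underlying hardware.

```python
import boto3

# Ask the IaaS provider for a virtual machine; the provider owns and operates
# the physical servers, storage, and network behind this call.
ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder machine image ID
    InstanceType="t3.micro",          # the user picks the size
    MinCount=1,
    MaxCount=1,
)

instance_id = response["Instances"][0]["InstanceId"]
# From here on, the operating system, middleware, and applications on this
# instance are the user's responsibility, as described above.
print(f"Launched instance {instance_id}")
```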

Platform as a Service (PaaS) refers to the provision and management of the hardware and an application-software platform by a third-party cloud service provider, while the user is responsible for the applications that run on top of the platform and the data those apps rely on. Used mostly by developers and programmers, PaaS provides a standard cloud service for application development and administration without requiring users to build and maintain the infrastructure usually involved in the process.

Here are some examples of PaaS: AWS Elastic Beanstalk, Google App Engine, Microsoft Azure Web Apps, and Google Cloud SQL.

Software as a Service (SaaS) provides consumers with a software application that the cloud service provider manages. SaaS apps are often web applications or mobile applications that consumers access through a browser. The provider handles software updates, bug fixes, and other routine software maintenance, while users connect to the cloud apps via a dashboard or API. SaaS also eliminates the need to install the application locally on each user's computer, making it easier for whole teams to access the program.

Here are some examples of SaaS: Microsoft Office 365, Google G Suite, Salesforce, DocuSign, MailChimp, Dropbox, and Slack.


A Career in Cloud Computing

As the world becomes more automated and automation technology moves to the cloud, cloud automation experts are needed to create, install, and manage it. Automation relieves human employees of repetitive tasks.

A cloud consultant is an expert in cloud technology who advises businesses searching for cloud-based products. Typically, a consultant will examine a company's requirements and recommend software and equipment that best satisfy the firm's technical and financial needs. A cloud consultant may also assist with the move to the cloud by developing migration strategies and identifying suitable platforms. In addition, consultants may occasionally be asked to help tailor a company's cloud presence, so they should have both broad and in-depth knowledge of the leading cloud platforms.

Cloud engineers are information technology specialists who develop, deploy and maintain cloud-based solutions for enterprises. They create and deploy cloud applications, transfer on-premises applications to the cloud, and troubleshoot cloud stacks.

Cloud security specialists are responsible for the safety of their organization's cloud systems, including evaluating possible risks and recommending best-fit technology to improve cloud security.

Cloud architects are in charge of implementing their company's overall cloud strategy. They analyze business needs and design appropriate solutions using suitable cloud services.

Scope in the field of Cloud Computing

Cloud computing benefits every business in many ways. It allows for easy information retrieval, offers virtual storage space, and handles backup difficulties. It also protects against unauthorized access and data loss. It enables businesses to save significantly on services and infrastructure for data storage, software licensing, servers, and hardware. Given its enormous relevance in the technical arena, the future of cloud computing has become a critical question. Cloud computing has snowballed over the years, owing to the increasing dependence of large organizations on this technology.

Many businesses are concerned about security when adopting a cloud computing solution. A cloud host's full-time duty is to closely monitor security, which is much more effective than a traditional in-house system, in which an organization must split its efforts among a plethora of IT problems, of which security is just one. The key to this increased security is the encryption of data transported over networks and stored in databases. By encrypting your data, hackers and anyone not authorized to view it are far less likely to gain access to it. As an extra security step, most cloud-based services also let you configure different security settings for each user.
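
To make the encryption point concrete, here is a minimal sketch (an illustration, not any vendor's actual mechanism) of encrypting a record client-side before it is stored in the cloud, using Python's `cryptography` package. The key handling is deliberately simplified; in practice the key would live in a managed key store rather than next to the data.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key and encrypt a record before uploading it.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer_id=42,balance=1900.00"
token = cipher.encrypt(record)    # ciphertext is what the cloud provider stores
restored = cipher.decrypt(token)  # only holders of the key can read the data

assert restored == record
print("stored ciphertext:", token[:24], b"...")
```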

Business growth is inextricably linked to innovation. Using old technology might limit an organization's capacity to try out new solutions and deploy such solutions on a large scale. Combating back-end performance issues may be complex, particularly in the world of online applications. Using the cloud as a foundation for innovation can result in higher performance, cheaper costs, and enhanced agility.

Given the current environmental situation, it is no longer sufficient for businesses to put a recycling bin in the breakroom and claim that they are helping the earth. Cloud infrastructures promote environmental stewardship by powering virtual services instead of physical products and hardware, decreasing paper waste, increasing energy efficiency, and lowering commuter emissions, since employees can work from anywhere with internet access.

One of the chief reasons cloud computing is significant for companies is its cost-effectiveness. Although cloud migration can be costly, the best way to tackle the pricing question isn't to guess how much money you might save by migrating; instead, assess how much your firm is presently spending on IT services against how much you would spend on the cloud.

To pursue a career in cloud computing, a bachelor of science in software engineering, data science, computer science, or a similar area is generally necessary. Cloud-based development often uses programming languages such as Perl, Python, Ruby, PHP, Java, or .NET.

Cloud computing is gaining popularity among organizations, and it has proven especially advantageous for bigger enterprises with global operations. The cloud enables these sophisticated enterprises to have remote access to data and software at any time. This provides cost-effective alternatives that may be scaled to meet the demands of businesses.

Cloud engineers are in charge of overseeing an organization's cloud-based systems and operations in general. Specific tasks include setting up architectures on cloud providers such as AWS, Microsoft Azure, and Google Cloud.


Follow this link:
What is Cloud Computing: Definition, Career & Scope


Deloitte Cloud Services | Deloitte US – Deloitte United States

Ecosystems and alliances

Our clients need technology solutions that fit their businesses and power their ambitions. To deliver, Deloitte orchestrates business ecosystems of client and technology vendor relationships. We combine best-of-breed technology providers across every phase of delivery with Deloitte's consulting business acumen and strong industry relationships to help your organization make the right technology choices and accelerate your possible.

We know the landscape of cloud vendors and how their varied offerings can help address specific business challenges. Our independent approach brings objectivity and deep experience to the process of vendor selection.


Our extensive alliance ecosystem guided by our sector and industry knowledge gives us the ability to select, connect, and customize cloud solutions that fit.


Through our ability to customize delivery and terms leveraging our Public Cloud Provider practices, and with our preferred partner status, we can help you plan, implement, and scale your cloud transformations cost-effectively.


See the original post here:
Deloitte Cloud Services | Deloitte US - Deloitte United States


Cloud Computing Continues to Exhibit Strong Growth in 2022 – ETF Trends

By Christopher Gannatti, CFA, Global Head of Research

When you think about cloud computing companies this year, the most likely starting point will be performance [1].

However, the period from June 16 to August 22 of this year looked different [2].

The bottom line: The dominant force behind the performance of cloud computing companies has been macroeconomic, meaning that as the U.S. Federal Reserve and other central banks pursue more restrictive monetary policies to fight inflation, the valuations of cloud companies have fallen. Similarly, if investors feel that inflation is easing in any way (and, subsequently, that central banks may slow the pace of tightening), there has tended to be a strong positive share price response.

The BVP Nasdaq Emerging Cloud Index: August 2022 Rebalance

We mention the BVP Nasdaq Emerging Cloud Index as a measure of the performance of cloud companies because it is designed to offer a precise exposure to cloud companies growing revenues by serving enterprise customers. What we see in figure 1 [3]:

Figure 1: Bringing the BVP Nasdaq Emerging Cloud Index Back to Equal Weight

The Fundamentals Will Matter Again

Up to this writing in August, it would be difficult to argue that the main catalyst for the share price performance of cloud companies has been fundamentals like revenue growth. As we noted earlier, the main catalyst has been the macroeconomic backdrop.

However, company fundamentals are always an important force and will always come back to prominence once macro pressures fade. What we see in figure 2 [4]:

Figure 2: Gauging the Fundamentals

Conclusion: Cloud Companies Will Continue to Deliver Exciting Results

In cloud computing, it's important to look at all the available signals to gain the most appropriate sense of market conditions.

Bessemer Venture Partners has just put out its annual Cloud 100 Benchmarks report for 2022 [5]. This report specifically looked at the largest and most dynamic private cloud companies, which provide important signals for the overall health of the business model.

In 2022, Bessemer specifically notes that the valuation of private companies may not be the best metric to look at if the goal is to get a sense of the health of a given market. For instance, if companies have not raised money recently, they may not have their valuations marked all the way to present market conditions. Bessemer instead focuses on what they call Centaurs. While a Unicorn has $1 billion in private market valuation, a Centaur has $100 million in annual recurring revenue.

For the 2022 Cloud 100, 70% of companies have already achieved Centaur status, and a further 10% are quite close and could reasonably get there before the year is out. In an environment where the market is focusing much more on results than on exciting stories, and private funding is harder to come by, proving business success at the Centaur level is indeed important.

At WisdomTree, we work directly with Bessemer Venture Partners and Nasdaq to provide an investment strategy that seeks to track the returns of the BVP Nasdaq Emerging Cloud Index: the WisdomTree Cloud Computing Fund. If you are thinking it is an interesting time to learn more about the investment strategy, please visit our cloud computing research.

[1] Source: Bloomberg, with data from 11/9/21 to 6/16/22.
[2] Source: Bloomberg, with data from 6/16/22 to 8/22/22.
[3] Source: The six-month period between rebalances is 2/22/22 to 8/22/22. The performance source is Bloomberg.
[4] Sources: WisdomTree, Nasdaq and Bloomberg, with data measured as of 8/22/22. Further details on sourcing are below figure 2.
[5] Source: D'Onofrio, Teng, Schmitt, "The 2022 Cloud 100 Benchmarks," Bessemer Venture Partners, 8/9/22.

Originally published by WisdomTree on August 30, 2022.

For more news, information, and strategy, visit the Modern Alpha Channel.

Important Risks Related to this Article

As of August 26, 2022, WCLD held 1.33%, 1.21%, 1.35%, 1.29%, 1.26%, 1.40%, 1.41%, 1.70% and 1.45% of its weight in RingCentral, Asana, Blend Labs, Paylocity Holding Corp, Box, Qualys, Gitlab, Snowflake and SentinelOne, respectively.

Christopher Gannatti is an employee of WisdomTree UK Limited, a European subsidiary of WisdomTree Asset Management, Inc.'s parent company, WisdomTree Investments, Inc.

There are risks associated with investing, including the possible loss of principal. The Fund invests in cloud computing companies, which are heavily dependent on the internet and utilizing a distributed network of servers over the internet. Cloud computing companies may have limited product lines, markets, financial resources or personnel and are subject to the risks of changes in business cycles, world economic growth, technological progress and government regulation. These companies typically face intense competition and potentially rapid product obsolescence. Additionally, many cloud computing companies store sensitive consumer information and could be the target of cybersecurity attacks and other types of theft, which could have a negative impact on these companies and the Fund. Securities of cloud computing companies tend to be more volatile than securities of companies that rely less heavily on technology and, specifically, the internet. Cloud computing companies can typically engage in significant amounts of spending on research and development, and rapid changes to the field could have a material adverse effect on a company's operating results. The composition of the Index is heavily dependent on quantitative and qualitative information and data from one or more third parties, and the Index may not perform as intended. Please read the Fund's prospectus for specific details regarding the Fund's risk profile.

Read more from the original source:
Cloud Computing Continues to Exhibit Strong Growth in 2022 - ETF Trends


An Ocean of Data: The Future of Cloud Computing Sify – Sify

With climate change at the doorstep, Adarsh Vinay looks into the potential future of data centers

Citing environmental concerns, operational costs and the need for superfast connectivity, major market players like Microsoft and Google are looking to shift their data centers underwater.

A study by Datareportal earlier this year revealed that over 5 billion people around the world use the internet every day. In other words, 63.1 per cent of the world's total population is accessing data online.

The study also revealed that internet users are increasing at an annual rate of almost 4 per cent. This means that two-thirds of the world's population should be online sometime in the second half of 2023.
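
That projection can be sanity-checked with a quick back-of-the-envelope calculation. The user share (63.1 per cent) and the roughly 4 per cent annual growth rate come from the study cited above; the world-population figure and the assumption that population stays roughly constant are simplifications of mine.

```python
import math

population = 7.97e9            # approximate world population in 2022
users = 0.631 * population     # about 5.03 billion internet users
growth = 0.04                  # roughly 4% annual growth in users
target = (2 / 3) * population  # the two-thirds threshold

# Years until the user count crosses two-thirds of the population.
years = math.log(target / users) / math.log(1 + growth)
print(f"about {years:.1f} years from early 2022")  # ~1.4 years, i.e., mid-to-late 2023
```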

As the number of people accessing the internet continues to increase, there is a need to improve the physical infrastructure that's needed to support all that data. Cloud computing is an integral part of every software solutions provider and has seen a surge in demand over the course of the last decade.

The Need to go Underwater

The multitude of server networks in these data centers consumes a lot of power and pumps out a lot of waste. They are also high maintenance and suffer heavy corrosion due to the oxygen and humidity on land. Temperature fluctuations and power failures also hurt the effectiveness of the data centers, and the cost of maintaining all of this is enormous.

That's not all! Another area of concern is the scarcity of space. With the surge in the need for more data centers around the globe, on-land options are bound to run out in the next 10 years.

This is why the outrageous idea of underwater data centers was first proposed.

The Out-of-the-box Idea!

Microsoft hosts an annual event called ThinkWeek where it urges employees to share out-of-the-box ideas. During the 2014 ThinkWeek, an underwater data center was proposed as a potential way to provide lightning-quick cloud services to coastal populations while also saving energy.

The reasoning was that more than fifty percent of the world's population lives within 120 miles of the coast. With underwater data centers near coastal cities, the data would only have to travel a shorter distance, thereby speeding up the surfing and streaming requirements of the masses.
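
A rough latency estimate shows why that proximity matters. The numbers below are standard approximations (light in optical fiber travels at about two-thirds the speed of light), not figures from the article, and they ignore routing and processing overhead.

```python
C_FIBER_KM_S = 200_000  # approximate speed of light in optical fiber, km/s

def round_trip_ms(distance_km: float) -> float:
    """Idealized round-trip propagation delay in milliseconds."""
    return 2 * distance_km / C_FIBER_KM_S * 1000

coastal_dc = 120 * 1.609  # a data center roughly 120 miles away, in km
inland_dc = 2_000         # a distant inland region, in km

print(f"coastal: {round_trip_ms(coastal_dc):.1f} ms")  # about 2 ms
print(f"distant: {round_trip_ms(inland_dc):.1f} ms")   # about 20 ms
```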

The Switch to Underwater

After the idea was received and researched, phase 1 of Project Natick was conducted in the Pacific Ocean in 2015. A relatively small data center, roughly 10 feet by 7 feet and weighing 38,000 pounds, was deployed underwater for just over three months.

After 105 days, it was retrieved from the ocean bed and extensive studies revealed that it had delivered expected results with minimal complications and drastically less power consumption.

So in 2018, Microsoft launched phase 2 of Project Natick. This was an extensive three-year project which would involve submerging a full-fledged data center with 12 racks and 864 servers underwater. Scotland's Orkney Islands were chosen as the location because of their popularity as a renewable energy research destination. Tidal currents that are 9 miles per hour and waves that reach 10-60 feet high make it an intriguing site to evaluate the economic viability of a fully submerged data center.

For the next two years, researchers tested and monitored the performance and reliability of the data center's servers. When this data center was retrieved at the start of 2020, it was noted that there were just a handful of failed servers and related cables.

Though the reasons why are still being researched, the underwater data center was found to be eight times more reliable than its on-land counterpart. As far as Microsoft is concerned, Project Natick has been a grand success.

The War for Water

From 2007 to 2012, Google used regular drinking water to cool its data center in Douglas County, near Atlanta. But after environmental conservationists started filing lawsuits, the company switched to recycled water to help conserve the Chattahoochee River. This is proving difficult, however, as recycling options are not available in all the locations where it has data centers.

Google has also been guarded about its water usage, which it considers a proprietary trade secret, but information continues to leak out through legal battles and lawsuits. In 2019 alone, it was reported that Google either requested or was granted over 2.3 billion gallons of water across three different states in the USA. Google, however, has insisted that it does not use all the water it is granted.

As the company continues to chase Microsoft and Amazon in the cloud-computing market, things are bound to get more complicated in the future. As of now, Google has 21 data centers but it intends to spend USD 10 billion on data centers in the next year alone.

The Future of Underwater Centers

A data center in a sealed container on the ocean floor offers strong overall reliability and is attractive for environmental, economic, and connectivity reasons.

Microsoft, Google and Amazon are already looking to serve customers who need to deploy and operate tactical and critical data centers anywhere in the world. Wherever possible, they are looking to deploy these centers off the coast.

Another great motivation to relocate undersea is Microsoft's aim to be carbon negative by 2030. The company intends to get there in part by developing long-term cloud infrastructure, and it hopes to have shifted entirely to renewable energy sources by 2025.

All in all, just as cloud computing is here to stay, so are underwater data centers. They might seem intriguing now, but they look set to become the norm before long.

Read this article:
An Ocean of Data: The Future of Cloud Computing Sify - Sify


Cutting-edge information technologies adopted for agriculture in NE China’s Heilongjiang – Xinhua

Photos taken on Sept. 2, 2022 at a paddy rice technology demonstration park in Fujin City, northeast China's Heilongjiang Province, show weather monitoring devices, aerial views of artworks made up of living plants of different colors, electric devices deployed in the fields, a field using water-saving and reduced-draining technologies, staff members operating and giving briefings at the park's modern agricultural information service center, a pest-control device, an aerial view of an irrigation volume experiment zone, and a staff member patrolling greenhouses. Cutting-edge information technologies, including the Internet of Things and cloud computing, have been adopted at the demonstration park for breeding management, water-saving irrigation, weather monitoring, grain growth tracking, and pest and weed control, to improve both the quantity and quality of the rice produced. (Xinhua/Zhang Tao, Lan Hongguang, Wang Jianwei)

Original post:
Cutting-edge information technologies adopted for agriculture in NE China's Heilongjiang - Xinhua


Jack Henry teams Up with Google Cloud to accelerate its multi-year modernisation strategy – ETCIO South East Asia

Jack Henry (Nasdaq: JKHY), a leading provider of technology solutions for the financial services industry, announced today a collaboration with Google Cloud to further enable its multi-year next-generation technology strategy focused on helping community and regional financial institutions innovate faster and meet the evolving needs of their accountholders.

Through its collaboration with Google Cloud, Jack Henry will build cloud-first technologies that modernise existing data, reporting, and integration systems while creating a new set of services that form a modern digital core for banks and credit unions. Google Cloud offers a number of technological benefits, including industry-leading scalability and operational uptime, leading security and compliance solutions, best-in-class data and artificial intelligence (AI) platforms, and the flexibility to integrate services with other cloud providers. Building capabilities on the public cloud is a strategy that complements Jack Henry's existing core systems.

"We are pleased to work with Google Cloud to offer community and regional financial institutions our banking services natively in the cloud," said Jack Henry Board Chair and CEO David Foss. "Community and regional financial institutions are the lifeblood of Main Street America, and they are uniquely positioned to help consumers and businesses achieve financial health. Google Cloud is an important part of our next-generation technology strategy and will enable our bank and credit union clients to successfully compete in the ever-changing technology landscape by providing leading-edge, fast, reliable, and secure digital banking experiences to consumers and businesses that are only possible in the public cloud."

"Consumers today expect a personalised, on-demand digital banking experience that is secure, fast, and frictionless," said Thomas Kurian, CEO of Google Cloud. "We look forward to providing Jack Henry with secure and scalable cloud infrastructure to modernise their tech stack and accelerate innovation for their customers. With cloud-first tools and services, community banks and credit unions can create digital experiences that combine relationship banking with leading technology to better fit their clients' needs and promote financial wellness."

Read more:
Jack Henry teams Up with Google Cloud to accelerate its multi-year modernisation strategy - ETCIO South East Asia


Is Hiring Quantum PhDs the Answer? – Quantum Computing Report

By Yuval Boger

When companies recognize that quantum computing has the potential to dramatically transform their business, they often seek to hire quantum information science PhDs to staff their quantum activities. The thought is that such PhDs are quantum experts and are equipped with knowledge and experience that can help accelerate a company's quantum program. But would hiring many such PhDs be a realistic approach? What might be good alternatives?

One challenge with hiring quantum PhDs is that there are not enough of them. McKinsey's June 2022 Quantum Technology Monitor reports that there were 851 active quantum computing job postings in December 2021, yet only 290 quantum technology graduates become available each year to fill these positions without requiring significant training. The same report notes that only 12 universities in the US (and a total of 29 universities worldwide) offer a quantum technology master's degree, so it's unlikely that the number of graduates will increase as quickly as the need for their services.

But that's not the only concern. Companies build quantum teams to explore quantum solutions to their specific challenges: option pricing, chemical simulation, supply chain optimization, and so on. How quickly would these quantum graduates pick up the intricacies of the business? Even if such a graduate became well versed in high-end finance, for example, they might not have the personal relationships and interpersonal skills to navigate company politics and build organizational support for their efforts. They also often lack relationships with peers in the industry and thus might be limited in their ability to leverage lessons learned in other organizations.

An alternative could be up-skilling: providing quantum training to in-house scientists and engineers who already understand the business and are well connected within the organization and their respective industries. Quantum computing is a hot topic and, in my experience, many would be highly motivated to participate in quantum training. Many online (sometimes free) courses are available for both beginners and advanced users. Additionally, the emergence of higher-level libraries and abstraction layers makes it easier to create useful quantum software without mastering the fine details of how quantum computers are built or resorting to intricate low-level coding (see the short sketch below). Quantum computing efforts often grow from the bottom up rather than by executive edict, and motivated employees just need permission to spend more time learning and exploring. Finally, up-skilling promotes employee retention and job satisfaction.
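
As one illustration of the kind of abstraction layer mentioned above, the sketch below builds and prints a two-qubit entangling circuit with Qiskit, a freely available high-level library (my choice of example; any comparable SDK would make the same point). No knowledge of pulse-level control or hardware details is required.

```python
from qiskit import QuantumCircuit

# Build a Bell-state circuit: superposition on qubit 0, then entangle with qubit 1.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# The library handles gate decomposition and hardware mapping behind the scenes.
print(qc.draw())
```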

Another option is to plug the skills gap using consulting companies. Firms like BCG or Deloitte can perform two types of functions. The first (educating executives, identifying promising use cases, and providing industry benchmarks) can be very useful for accelerating a company's quantum program. The second (actually writing quantum computing code, whether by generalist companies or those specializing in quantum computing) can be a mixed blessing. They might provide trained, able consultants, but organizations sometimes worry about IP-sharing arrangements or about developing their own workforce when relying on outside parties.

Last, an emerging option is quantum API marketplaces. Just like Google provides an API for finding the best route between two points, quantum API marketplaces provide pay per use quantum algorithms for optimization, random number generation, and more. They potentially allow faster exploration of use cases without the burden of coding sophisticated algorithms.

I'm not recommending shying away from hiring quantum PhDs but rather exploring an intelligent mix of these alternatives. Quantum computing is too important to ignore. Don't slow down the progress by exclusively relying on outside talent.

Yuval Boger is a quantum computing executive. Known as the original Qubit Guy, he most recently served as Chief Marketing Officer for Classiq.

September 1, 2022

Continue reading here:
Is Hiring Quantum PhDs the Answer? - Quantum Computing Report


What Is Going on With Arqit Quantum (ARQQ) Stock Today? – InvestorPlace


Today has been highly volatile for Arqit Quantum (NASDAQ:ARQQ). The quantum computing company is preparing for the expiration of its voluntary lock-up, which will allow trading by investors holding at least 105.9 million shares. As a result, ARQQ stock plunged after markets opened, falling more than 10%. Since then, however, it has been slowly trending upward and is now down only 4%. Its current trajectory hints that it could easily pull back into the green before the end of the day.

Despite today's initial plunge, ARQQ has displayed good momentum for most of the week and remains in the green for the past five days by more than 15%. Does this mean that the stock will continue rising after the lock-up period ends tomorrow, though? Let's take a closer look.

It's true that ARQQ stock has risen more than 25% over the past month. However, the company is still recovering from the late-April 2022 crash that pushed many stocks down, and it hadn't displayed much growth until this week. Although Arqit briefly rallied in May on news that the White House would be taking measures to boost quantum computing, it has had no company-specific catalysts since then, casting the stock in a questionable light.

Now with the share lock-up period about to end, investors are left with even more questions. It's common for the end of a lock-up period to push a stock down, but a rally before it expires is less typical and likely won't last. As Seeking Alpha reports:

Of the shares subject to lock-up deals, some are held by officers, directors and employees. These are subject to blackout-period restrictions and can't be traded until after ARQQ reports results for the year ending Sept. 30. 19.5M shares are held by two institutional investors, each of which has a representative on ARQQ's board, and 16.8M shares are held by three institutional investors that were early investors in the company.

The Sept. 30 date is an important one, as it may be followed by investors offloading shares. With shares being held by such a small group, any sales could push ARQQ stock down. There's always the possibility that new investors will seize the opportunity to buy. That doesn't seem likely when we take a macro look at the company, though.

Some would argue that names like ARQQ stock have a distinct advantage: they are part of a niche sector with incredible potential. According to InvestorPlace Senior Analyst Luke Lango, quantum computing is the most underrated, most transformational technological breakthrough since the internet. However, when Lango listed his picks for quantum computing stocks to buy for maximum growth over the coming decade, ARQQ was nowhere to be found. Arqit's smaller-cap rival Quantum Computing (NASDAQ:QUBT) did make the list, though. Other experts have also omitted ARQQ from lists of top quantum stocks, highlighting QUBT and peer Rigetti Computing (NASDAQ:RGTI). When we consider the low price points that both stocks trade at, there's no reason for investors to embrace ARQQ.

Arqit hasn't received much analyst coverage, but Seeking Alpha has issued a highly bearish take. The platform gives ARQQ stock an F profitability grading, citing inferior profitability and decelerating momentum vs. other IT stocks. Tomorrow's lock-up expiration isn't likely to trigger any real momentum for a stock with limited growth prospects.

On the date of publication, Samuel O'Brient did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Samuel O'Brient has been covering financial markets and analyzing economic policy for three-plus years. His areas of expertise involve electric vehicle (EV) stocks, green energy and NFTs. O'Brient loves helping everyone understand the complexities of economics. He is ranked in the top 15% of stock pickers on TipRanks.

Follow this link:
What Is Going on With Arqit Quantum (ARQQ) Stock Today? - InvestorPlace


SandboxAQ Joins the Department of Defense Skillbridge Program to Place Military Community Members in Exciting Quantum Tech Careers – PR Web


PALO ALTO, Calif. (PRWEB) September 02, 2022

SandboxAQ, an enterprise SaaS company delivering the compound effects of AI and Quantum technology (AQ), today announced it is an authorized partner for the Department of Defense (DoD) Skillbridge program. The SkillBridge program enables Service members to gain valuable civilian work experience through specific industry training, apprenticeships, or internships during the last 180 days of service. SkillBridge connects Service members with industry partners in real-world job experiences.

Through SkillBridge, SandboxAQ will tap into a growing pool of talented and dedicated workers looking to apply their advanced degrees and military experience in fields such as cryptography, cybersecurity, AI, advanced mathematics, program management, natural and applied sciences and other disciplines to develop AQ solutions that solve some of the world's toughest challenges.

"The SkillBridge program is a win-win for retiring Service members looking to seamlessly transition to meaningful post-military careers and employers looking to hire highly skilled and motivated new talent," said Jen Sovada, President of SandboxAQ Public Sector. "The quantum ecosystem's tremendous growth creates incredible opportunities for SkillBridge participants to apply their advanced degrees and military training towards careers that will deliver groundbreaking technology solutions, protect our national security and improve our way of life."

For Service members SkillBridge provides an invaluable chance to work and learn in civilian career areas. For industry partners, SkillBridge is an opportunity to access and leverage the worlds most highly trained and motivated workforce at no cost. Service members participating in SkillBridge receive their military compensation and benefits, and industry partners provide the training and work experience.

For more information about our exciting career opportunities and employee benefits, please visit our careers page at https://www.sandboxaq.com/careers. To join SandboxAQ via Skillbridge visit https://skillbridge.osd.mil/program-overview.htm and apply today.

About SandboxAQ: SandboxAQ is an enterprise SaaS company providing solutions at the nexus of AI and Quantum technology (AQ) to address some of the world's most challenging problems. We leverage the power of classical computing architecture to deliver AQ solutions and technologies today, years before fault-tolerant, error-corrected quantum computers become available. Our core team and inspiration formed at Alphabet Inc. in 2016, and SandboxAQ emerged as an independent, venture-backed company in 2022. For more information, please visit https://www.sandboxaq.com.


See more here:
SandboxAQ Joins the Department of Defense Skillbridge Program to Place Military Community Members in Exciting Quantum Tech Careers - PR Web


How machine learning helps the New York Times power its paywall – VentureBeat


Every organization applying artificial intelligence (AI) and machine learning (ML) to its business is looking to use these powerful technologies to tackle thorny problems. For the New York Times, one of the biggest challenges is striking a balance between meeting its latest target of 15 million digital subscribers by 2027 and getting more people to read articles online.

These days, the multimedia giant is digging into that complex cause-and-effect relationship using a causal machine learning model, called the Dynamic Meter, which is all about making its paywall smarter. According to Chris Wiggins, chief data scientist at the New York Times, the company has spent the past three or four years working to understand its user journey and the workings of the paywall.

Back in 2011, when the Times began focusing on digital subscriptions, metered access was designed so that non-subscribers could read the same fixed number of articles every month before hitting a paywall requiring a subscription. That allowed the company to gain subscribers while also allowing readers to explore a range of offerings before committing to a subscription.

Now, however, the Dynamic Meter can set personalized meter limits. Powered by data-driven user insights, the causal machine learning model can be prescriptive, determining the number of free articles each user should get so that they become interested enough in the New York Times to subscribe in order to keep reading.


According to a blog post written by Rohit Supekar, a data scientist on the New York Times algorithmic targeting team, at the top of the site's subscription funnel are unregistered users. At a specific meter limit, they are shown a registration wall that blocks access and asks them to create an account. This allows them access to more free content, and a registration ID allows the company to better understand their activity. Once registered users reach another meter limit, they are served a paywall with a subscription offer. The Dynamic Meter model learns from all of this registered-user data and determines the appropriate meter limit to optimize for specific key performance indicators (KPIs).
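
As a simplified sketch of that funnel (hypothetical names and logic, not the Times' actual implementation), the gating decision might look like the following, with the per-user limits supplied by a model such as the Dynamic Meter.

```python
from dataclasses import dataclass

@dataclass
class Reader:
    registered: bool
    articles_this_month: int
    registration_limit: int  # free articles before the registration wall
    paywall_limit: int       # free articles (once registered) before the paywall

def wall_for(reader: Reader) -> str:
    """Decide what an arriving reader sees; limits are personalized by the meter model."""
    if not reader.registered:
        if reader.articles_this_month >= reader.registration_limit:
            return "registration_wall"
        return "article"
    if reader.articles_this_month >= reader.paywall_limit:
        return "paywall"
    return "article"

print(wall_for(Reader(registered=False, articles_this_month=3,
                      registration_limit=3, paywall_limit=10)))  # registration_wall
```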

The idea, said Wiggins, is to form a long-term relationship with readers. It's a much slower problem [to solve], in which people engage over the span of weeks or months, he said.

The most difficult challenge in building the causal machine learning model was to set up the robust data pipeline that helps the algorithmic targeting team understand activity for over 130 million registered users on the New York Times site, said Supekar.

The key technical advancement powering the Dynamic Meter is causal AI, a machine learning method in which models are built that can predict not just what will happen, but what would have happened.

We're really trying to understand the cause and effect, he explained.

If a particular user is given a different number of free articles, what would be the likelihood that they would subscribe or the likelihood that they would read a certain number of articles? This is a complicated question, he explained, because in reality, they can only observe one of these outcomes.

If we give somebody 100 free articles, we have to guess what would have happened if they were given 50 articles, he said. These sorts of questions fall in the realm of causal AI.

Supekar's blog post explained how the causal machine learning model works by performing a randomized control trial, in which certain groups of people are given different numbers of free articles and the model learns from this data. As the meter limit for registered users increases, engagement, measured by the average number of page views, gets larger. But it also leads to a reduction in subscription conversions because fewer users encounter the paywall. The Dynamic Meter has to optimize for, and balance the trade-off between, conversion and engagement.

For a specific user who got 100 free articles, we can determine what would have happened if they got 50, because we can compare them with other registered users who were given 50 articles, said Supekar. This is an example of why causal AI has become popular: There are a lot of business decisions, which have a lot of revenue impact in our case, where we would like to understand the relationship between what happened and what would have happened, he explained. That's where causal AI has really picked up steam.
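
A minimal sketch of how such a randomized trial could be summarized is shown below, assuming each registered user was randomly assigned a meter limit and the logs record page views and whether they converted. The toy data and the weighting of the two KPIs are illustrative choices of mine, not the Times' actual data or formula.

```python
from collections import defaultdict

# (assigned_meter_limit, page_views, converted) per randomized user -- toy data
logs = [(20, 14, False), (20, 9, True), (50, 30, False),
        (50, 26, True), (100, 55, False), (100, 48, False)]

stats = defaultdict(lambda: {"users": 0, "views": 0, "conversions": 0})
for limit, views, converted in logs:
    arm = stats[limit]
    arm["users"] += 1
    arm["views"] += views
    arm["conversions"] += int(converted)

def score(arm, weight=0.05):
    # Illustrative objective: conversion rate plus a small credit for engagement.
    return arm["conversions"] / arm["users"] + weight * arm["views"] / arm["users"]

for limit in sorted(stats):
    arm = stats[limit]
    print(limit, "conversion rate:", arm["conversions"] / arm["users"],
          "avg page views:", arm["views"] / arm["users"])

best_limit = max(stats, key=lambda limit: score(stats[limit]))
print("meter limit with best weighted score:", best_limit)
```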

Wiggins added that as more and more organizations bring AI into their businesses for automated decision-making, they really want to understand what is going on from all angles.

It's different from machine learning in the service of insights, where you do a classification problem once and maybe you study that as a model, but you don't actually put the ML into production to make decisions for you, he said. Instead, for a business that wants AI to really make decisions, they want to have an understanding of what's going on. You don't want it to be a black-box model, he pointed out.

Supekar added that his team is conscious of algorithmic ethics when it comes to the Dynamic Meter model. Our exclusive first-party data is only about the engagement people have with the Times' content, and we don't include any demographic or psychographic features, he said.

As for the future of the New York Times paywall, Supekar said he is excited about exploring the science around the negative aspects of introducing paywalls in the media business.

We do know if you show paywalls we get a lot of subscribers, but we are also interested in knowing how a paywall affects some readers' habits and the likelihood they would want to return in the future, even months or years down the line, he said. We want to maintain a healthy audience so they can potentially become subscribers, but also serve our product mission to increase readership.

The subscription business model has these kinds of inherent challenges, added Wiggins.

You don't have those challenges if your business model is about clicks, he said. We think about how our design choices now impact whether someone will continue to be a subscriber in three months, or three years. It's a complex science.


Here is the original post:
How machine learning helps the New York Times power its paywall - VentureBeat
