
Europe Artificial Intelligence In Fintech Market Report 2022: Process Automation is One of the Most Important Factors Fueling Sector Demand for…



Artificial Intelligence and Machine Learning in Trading: How are they changing the world of Trading? – New Trader U

This is a guest post by Saloni Bogati of quantinsti.com.

Emerging technologies like machine learning (ML) and artificial intelligence (AI) are transforming businesses, industries, and their scope for growth. The finance industry is known for adopting the latest technologies and solutions to achieve its objectives.

Trading has also seen its fair share of advancements in recent years and has harnessed the capabilities of these technologies. There has been a significant increase in the use of AI and ML techniques to build data-driven trading systems, and these technologies continue to enhance trading efficiency through innovative solutions.

In this article, we will look at what Artificial Intelligence and Machine Learning are and how they are changing the world of trading.

Artificial Intelligence, or AI, is a branch of computer science that develops machines capable of imitating the human mind. In other words, it enables devices to think and react like humans and to perform specific tasks without human intervention.

For example, virtual assistants like Siri, Alexa, and Google Assistant are a part of our daily lives. Virtual assistants often use data history, voice technology, and other features to make our lives easier.

Machine Learning is a subset of Artificial Intelligence that enables software to make decisions based on accurate, calculated information. It gives machines the ability to use algorithms that mimic the way humans learn.

It also progressively improves reliability and accuracy by analyzing historical data to achieve the desired results. In this sense, ML makes machines more human-like by giving them the ability to learn and develop.

The evolution of trading mirrors the development of human society. Trading denotes the exchange of products, services, and money to transfer ownership. Early on, the barter system was the dominant method of exchange; with the inception of coins and currencies, there emerged a common way to define the value of products, commodities, and services. Technology has driven the most recent leap: automated trading accounted for about 70% of US equities in 2013, and algorithmic trading accounted for a third of the total volume on Indian cash shares and almost half of the book in the derivatives segment. Terms like "algo trading" are now gaining momentum as technology shapes the lives of future traders.

AI and ML have had a tremendous impact on trading. Here are some of the reasons they have become pivotal.

According to a report by Allied Market Research, the global market for AI and advanced machine learning in BFSI was valued at $7.66 billion in 2020 and is projected to reach $61.24 billion by 2030, growing at a CAGR of 23.1% from 2021 to 2030.

Artificial Intelligence and Machine Learning offer the financial industry solutions that help streamline various processes, optimize decisions in quantitative trading, and manage financial risk. The solutions and services they provide help automate trading processes and reduce manual, repetitive tasks. Here are some of the ways AI and ML contribute to the world of trading:

AI and ML often use neural networks and other learning models to detect and analyze the factors that influence stock prices. That is to say, these factors act as predictors or features that help determine the future of a stock. For instance, AI can detect technical, social, economic, demographic, and other factors relevant to the desired results. Traders can then use these insights to develop robust ML algorithms, strategies, and models to trade.
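To make the feature-and-predictor idea concrete, here is a minimal, hypothetical sketch: a one-feature least-squares fit that treats yesterday's price momentum as a predictor of today's price move. The prices are invented for illustration, and a real system would use many features and a proper ML library rather than a hand-rolled regression.

```python
# Hypothetical price history (not real market data).
prices = [100, 101, 103, 102, 105, 107, 106, 109, 111, 110, 113]

# Feature: 1-day momentum; target: the next day's price change.
xs = [prices[i] - prices[i - 1] for i in range(1, len(prices) - 1)]
ys = [prices[i + 1] - prices[i] for i in range(1, len(prices) - 1)]

# Closed-form one-variable least squares: slope = cov(x, y) / var(x).
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
var = sum((x - mean_x) ** 2 for x in xs)
slope = cov / var
intercept = mean_y - slope * mean_x

def predict_move(momentum):
    """Predicted next-day price change given today's momentum."""
    return intercept + slope * momentum

print(predict_move(2.0))
```

On this toy series the fitted slope comes out negative, i.e. the model "learns" a mean-reverting pattern; with different data it could just as easily learn a momentum pattern.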

Artificial Intelligence uses automated systems that act on facts and the accuracy of the available information. Humans, on the other hand, are prone to errors driven by emotions, clouded judgment, personal agendas, and so on. AI's fact-based decision-making process therefore offers better results for market participants.

AI in trading also increases the need for human expertise, as organizations now look for specialists in mathematics, computer programming, and related fields to develop strategies. As a result, AI improves decision-making processes while experts develop ML strategies for various trading agendas.

Chatbots are virtual assistants that enable traders to find solutions easily and reduce the need for human agents to handle routine queries. Traders can access chatbots at any time of day, since they answer questions with automated responses and require no human intervention.

AI helps predict stock prices using the factors that influence the market. Therefore, it can use similar elements and data to anticipate risks and enable ML algorithms to avoid scenarios or mitigate actions that may lead to the risk. Further, AI can process large sets of data rapidly and accurately. Machine Learning can help replicate the scenarios within the models and learn various techniques to optimize the results. Hence, AI can be the brain of the operation, and ML is the limb that follows instructions while learning and developing its abilities.

Artificial Intelligence and Machine Learning play a pivotal role in trading by offering rapid and simplified solutions. The technologies enhance the processes of innovating and modernizing the various concepts in trading. Therefore, the following are some revolutionizing applications of AI and ML in trading:

Sentiment analysis is a common application of Machine Learning in the financial markets. It helps analyze large volumes of data related to assets and other investment information. Machine learning also leverages natural language processing (NLP) to analyze various data sets rapidly and accurately. It is therefore important to have a comprehensive understanding of sentiment analysis in financial markets.
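As a toy illustration of the idea only: real trading systems use trained NLP models, but the simplest form of sentiment scoring is lexicon-based. The word lists and headlines below are invented for the example.

```python
# Illustrative word lists; a production system would use a trained model
# or a maintained financial sentiment lexicon instead.
POSITIVE = {"beat", "growth", "surge", "record", "upgrade", "profit"}
NEGATIVE = {"miss", "loss", "plunge", "downgrade", "lawsuit", "recall"}

def sentiment_score(headline: str) -> int:
    """Positive minus negative word count; > 0 reads bullish, < 0 bearish."""
    words = headline.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("Quarterly profit beats forecasts, shares surge"))   # 2
print(sentiment_score("Regulator lawsuit triggers plunge"))                # -2
```

Even this crude scorer shows the pipeline: turn text into a number, then feed that number into a trading model as one more feature.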

Machine Learning can also improve sentiment analysis in several areas, depending on requirements.

Traders often want to know the estimated results of their trades. Although it is difficult to predict outcomes accurately, ML can help identify the factors that may affect the desired result. Traders can then use these estimates to weigh possible outcomes using research insights and probability.

Data fuels the engine of AI and Machine Learning: large volumes of data are vital for developing ML algorithms and models. Another ML application in trading is therefore the generation of synthetic data. Generative Adversarial Networks (GANs), for instance, can help counter challenges like data scarcity, data privacy, data costs, and backtesting overfitting.
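A full GAN is beyond a short example, so the sketch below substitutes a much simpler synthetic-data technique, block bootstrapping of historical returns, which serves the same goal of enlarging a training set while preserving short-run structure. The return series, block size, and seed are illustrative assumptions.

```python
import random

def block_bootstrap(returns, block=3, n_samples=9, seed=42):
    """Build a synthetic return series by stitching together random
    contiguous blocks of the historical series."""
    rng = random.Random(seed)
    out = []
    while len(out) < n_samples:
        start = rng.randrange(len(returns) - block + 1)
        out.extend(returns[start:start + block])
    return out[:n_samples]

history = [0.01, -0.02, 0.005, 0.012, -0.01, 0.02]
synthetic = block_bootstrap(history)
print(len(synthetic))  # 9
```

Every synthetic value is drawn from the real history, so the marginal distribution is respected; a GAN goes further by learning to generate genuinely new but statistically plausible values.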

According to Coherent Market Insights, the global algorithmic trading market was valued at US$10,346.6 million in 2018 and is expected to exhibit a CAGR of 10.7% over the forecast period, reaching US$25,257.0 million by 2027.

The expanding market therefore demands AI- and ML-based solutions to meet expectations. Machine learning can speed up the search for efficient algo trading strategies, help traders optimize their desired results, and simulate risks while trading. ML and AI can also be integrated with various algorithmic trading platforms to assist investment professionals. Techniques employed to enhance trading algorithms include neural networks, deep learning, and linear regression, all of which are often covered in machine learning courses.
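One classic example of such a strategy family is the moving-average crossover. The sketch below is a simplified, hypothetical illustration of how such a signal is computed, not a production strategy or investment advice; the price series is invented.

```python
def sma(series, window):
    """Simple moving average over a trailing window, one value per
    index from window-1 to the end of the series."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

def crossover_signals(prices, fast=3, slow=5):
    """'buy' when the fast SMA crosses above the slow SMA,
    'sell' on the reverse crossing, 'hold' otherwise."""
    f, s = sma(prices, fast), sma(prices, slow)
    f = f[len(f) - len(s):]  # align both series to the same end dates
    signals = []
    for i in range(1, len(s)):
        if f[i - 1] <= s[i - 1] and f[i] > s[i]:
            signals.append("buy")
        elif f[i - 1] >= s[i - 1] and f[i] < s[i]:
            signals.append("sell")
        else:
            signals.append("hold")
    return signals

# A flat stretch followed by a steady rise triggers a single buy signal.
print(crossover_signals([1, 1, 1, 1, 1, 2, 3, 4, 5, 6]))
```

ML enters this picture by searching over the strategy's parameters (window lengths, thresholds) far faster than manual trial and error could.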

Traders often analyze the opportunities and risks associated with trading stocks. They also want to predict the future value of specific stocks. Therefore, AI and ML strategies enable systems to make estimations based on real-world data. The strategies analyze various scenarios and factors affecting the desired outcome and provide information based on the calculations.

Moreover, a trader must always hope for the best yet prepare for the worst-case scenario by identifying and assessing the risks. ML algorithms can analyze large data sets and offer insights based on calculations and impact.
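As one concrete example of turning historical data into a worst-case estimate, the sketch below computes a simple historical value-at-risk (VaR). The daily return series and the 95% confidence level are illustrative assumptions; real risk systems use far larger samples and more sophisticated models.

```python
def historical_var(returns, confidence=0.95):
    """Loss threshold that was exceeded only (1 - confidence) of the
    time in the historical sample. Assumes enough observations for the
    chosen confidence level."""
    ordered = sorted(returns)
    index = int((1 - confidence) * len(ordered))
    return -ordered[index]

# Invented daily returns for illustration.
daily_returns = [0.01, -0.02, 0.005, -0.035, 0.012, -0.01, 0.02, -0.004,
                 0.003, -0.025, 0.015, -0.008, 0.007, -0.015, 0.01, -0.001,
                 0.004, -0.012, 0.009, -0.03]
print(historical_var(daily_returns))  # 0.03, i.e. a 3% one-day loss
```

Reading the result: on 95% of historical days the portfolio lost less than 3%, which a risk model can use as a trigger for reducing exposure.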

Programming Languages like Python in Trading

Machine Learning and Python have become widely integrated into algorithmic trading. Python code is easy to read and comprehend, and the language comes with extensive libraries. As a result, including programming languages in AI and ML strategies opens up several avenues for trading: they offer robust computing power for scalability and simplify processes through comprehensive trading libraries.

In the world of trading, AI and ML are actively used in algorithmic trading by organizations and retail investors alike. These techniques help develop algorithms that comprehend market conditions, learn from past data, and make calculated decisions.

AI and ML in trading take the form of algorithmic or automated solutions that integrate AI-driven analysis, self-improving algorithms, task management according to trading agendas, and more. They create an environment in which traders and investors can use solutions that deliver optimal results. As technology evolves, traders and investors must upgrade their skills to leverage the capabilities of these technologies. An algorithmic trading course guides traders and investors with the help of industry experts and up-to-date training modules.

You can follow QuantInsti on Twitter and check out more information on their website at quantinsti.com.


Deep Tech has become one of the most powerful use cases for A.I. in business. Here are 3 keys to making it work – Fortune

In early 2020, when scientists rushed to develop a vaccine to take on the SARS-CoV-2 coronavirus that causes COVID-19, it seemed like a really long shot. The fastest a vaccine had ever previously been developed was for mumps, back in the 1960s, an effort that took 48 months. Still, just nine months later, in December 2020, the American pharmaceutical giant Pfizer and a German deep-tech startup, BioNTech, had developed the first COVID-19 vaccine, validating the use of the new technology of mRNA-based vaccines.

The first studies on DNA vaccines began 25 years ago, and the science of RNA vaccines too has been evolving for over 15 years. One outcome was mRNA technology, which required the convergence of advances in synthetic biology, nanotechnology, and artificial intelligence, and has transformed the science, and the business, of vaccines. Pfizer generated nearly $37 billion in sales from the COVID-19 vaccine last year, making it one of the most lucrative products in the company's history.

Like Pfizer and Moderna in the pharmaceuticals sector, several corporations in other industries, such as Tesla in automobiles, Bayer in agrochemicals, BASF in specialty chemicals, Deere in agriculture machinery, and Goodyear in rubber, are relying on deep technologies. Deep Tech, as we call it, is the problem-driven approach to tackling big, hairy, audacious, and wicked challenges by combining new physical technologies, such as advanced material sciences, with sophisticated digital technologies, such as A.I. and, soon, quantum computing.

Deep Tech is rising to the fore because of businesses' pressing need to develop new products faster than before; to develop sustainable products and processes; and to become more future-proof. Deep Tech can generate enormous value and will provide companies with new sources of advantage. In fact, Deep Tech will disrupt incumbents in almost every industry. That's because the products and processes that result from these technologies will be transformational, creating new industries or fundamentally altering existing ones.

The early prototypes of Deep Tech-based products are already available. For instance, the use of drones, 3-D printers, and syn-bio kits is proliferating, while No Code / Low Code tools are making A.I. more accessible. They're opening up more avenues by which companies can combine emerging technologies and catalyze more innovations. Unsurprisingly, incubators and accelerators have sprung up worldwide to facilitate their development. Not only are more Deep Tech start-ups being set up nowadays, but they're launching successful innovations faster than before.

It's risky for CEOs of incumbent companies to count on a wait-and-watch strategy. They need to figure out ways to tap into Deep Tech's potential right away, before their organizations are disrupted, just as digital technologies and start-ups disrupted business not so long ago. Unlike digital disruption, though, the physical-cum-digital nature of Deep Tech provides a golden opportunity for incumbents to shape these technologies' evolution and to harness them for their benefit.

Established giants can help Deep Tech start-ups scale their products, which can be especially complex and costly for physical products, by leveraging their expertise in engineering and manufacturing scale-up and by providing market access. And because the incumbents are already at the center of global networks, they can also help navigate government regulations and influence their suppliers and distributors to transition to infrastructure that will support the new processes and products. Doing so will unlock enormous value, as the Pfizer-BioNTech case exemplifies.

Most incumbents will find that Deep Tech poses two stiff challenges at first. One, it isn't easy to spot or assess the business opportunities that the new technologies will create. Two, it's equally tough to develop and deploy Deep Tech-based solutions and applications, which usually requires participating in and catalyzing collective actions with ecosystems. To manage the twin challenges of Deep Tech, CEOs should keep in mind three starting points.

Despite its sophistication, conventional technology forecasting produces linear predictions and siloed thinking; it doesn't account for how technologies change and converge. As a result, most forecasts underestimate the speed at which technologies evolve and when business will be able to use them. That's why companies should use backcasting, the method outlined by the University of Waterloo's John Robinson in the late 1980s.

Rather than tracking the development of many technologies, business would do better to start by focusing on the world's biggest needs and pressing problems, to identify the long-standing frictions and tradeoffs that have prevented it from tackling them until now. Then, they should define a desirable future in which those issues have been resolved, and work back to identify the technologies, and combinations thereof, that will make solutions possible and commercially feasible. Backcasting helps companies come to grips with both short-term and long-run technological changes, making it ideal for managing Deep Tech.

The Anglo-American think tank RethinkX, for instance, has used a technology disruption framework, predicated on backcasting, to highlight the implications of creating a sustainable world. The analysis suggests that the technological changes under way in the energy, transportation, and food sectors, driven by a combination of just eight emerging technologies, could eliminate over 90% of net greenhouse gas emissions in 15 years' time. The same technologies will also make the cost of carbon withdrawal affordable, so more breakthrough technologies may not be needed in the medium term.

When companies evaluate the business opportunities that deep technologies will open up, they should take into account the scope of the changes they will bring about. That scope will be determined by the complexity of a technology and the business's ability to scale solutions based on it. As Arnulf Grubler, the head of the Austria-based International Institute for Applied Systems Analysis, and his co-authors argued six years ago, new technologies can bring about four levels of change. They can:

1. Improve an existing product. For example, sustainable biodegradable plastic can replace conventional plastic packaging.

2. Improve an existing system. Nanomaterial-infused paints and an A.I.-enabled smart home system can, for instance, dramatically change homes.

3. Transform a system. Developing the ecosystem for hydrogen-powered automobiles, from hydrogen production to refueling stations, could transform urban mobility.

4. Transform a system-of-systems. Creating a purification technology that transforms current water supply and management systems will also alter the working of water-consuming sectors such as agriculture, alcohol, beverages, paper, and sugar.

Figuring out which of the four levels of change is likely to result will help companies better assess market sizes as well as growth trajectories. When BCG recently estimated the market size of Deep Tech solutions in nine sustainability-related sectors, for example, it found that while technology improvements in existing value chains would generate additional revenues of over $123 billion per annum, those that resulted in systemic changes would generate 20 times more, as much as $2.7 trillion a year.

Few companies already have in-house all the technologies and capabilities they need to deploy Deep Tech. They must gain the support of technology-related ecosystems, which extend from academics and university departments to investors and governments, to develop those competencies. The types of linkages that result will depend on the business opportunity as well as the ecosystem's maturity.

Several kinds of collaborations are likely to form. Some incumbents will, obviously, join hands with start-ups to develop new products or processes, as Bayer did in 2017 by setting up a joint venture with Ginkgo Bioworks to synthesize microbes that will allow plants to produce their own fertilizers. Others will orchestrate systemic changes, which is what Hyundai Motor Group is trying to do in the field of mobility by working with several Deep Tech startups. Still others may focus on nurturing deep technologies to maturity themselves, akin to the efforts of Sweden's SSAB (formerly Swedish Steel), Vattenfall, and Finland's LKAB to scale a sustainable steel-making process in which fossil-free electricity and green hydrogen replace coking coal.

***

"A deep technology was impossible yesterday, is barely feasible today, and may soon become so pervasive and impactful that it will be difficult to remember life without it," points out Michigan State University's Joshua Siegel. The future will likely belong to companies that don't just track Deep Tech, but invest in its development and drive its adoption by engaging with ecosystems, forcing rivals to play the losing strategy of catch-up.

Read other Fortune columns by François Candelon.

François Candelon is a managing director and senior partner at BCG and global director of the BCG Henderson Institute. Maxime Courtaux is a project leader at BCG and ambassador at the BCG Henderson Institute. Antoine Gourevitch is a managing director and senior partner at BCG. John Paschkewitz is a partner and associate director at BCG. Vinit Patel is a project leader at BCG and ambassador at the BCG Henderson Institute.

Some companies featured in this column are past or current clients of BCG.

The opinions expressed in Fortune.com commentary pieces are solely the views of their authors and do not necessarily reflect the opinions and beliefs of Fortune.

Sign up for the Fortune Features email list so you don't miss our biggest features, exclusive interviews, and investigations.


Edge Vs. Cloud Computing: Which Solution Is Better For Your Connected Device? – IoT For All

If you're developing an IoT device, odds are that you want it to do some valuable computations to solve an important problem. Maybe you want to deploy sensors in remote locations, develop a device that can perform data analytics to monitor a renewable energy source, or build a medical device that can use computer vision to detect early signs of an illness.

Whatever you are building, at some point you may start to wonder: should your device perform these important computations in the cloud or at the edge? Choosing between computing on the cloud or on the edge is a decision that can impact things like your device's cost and efficiency, and no one wants to make the wrong decision initially and then spend time and money down the line to pivot to the correct one.

The cloud refers to the collection of servers that can be accessed over the internet; popular cloud providers include Amazon Web Services, Microsoft Azure, and Google Cloud.

These servers can provide on-demand computing resources to store and process data. You can think of the cloud as a centralized location for your files and programs, and you can connect any device to the cloud to access them. Services like Dropbox or Google Drive are some of the many cloud-based services out there.

Cloud computing describes the idea of performing computations in the cloud. These computations can include data analysis and visualization, computer vision, and machine learning. An example of cloud computing in action is when your average smart home speaker sends your audio input to the cloud, where it is interpreted by algorithms and a response is sent back.

The edge describes the edge of the network. It includes devices that are an entry/exit point to the cloud but are not part of the cloud itself. For example, a server in a data center is part of the cloud; the smartphone and router that connect to that server are part of the edge.

Edge computing describes the idea of performing computations on the edge. This way, the processing is done at, or close to, the location where the data is collected or acted upon.

An example of an edge computing process is object detection on an autonomous vehicle. The vehicle processes data from its sensors and uses the results to avoid obstacles. Unlike your smart home speaker, the data it collects is processed locally rather than sent to the cloud.

There are a couple of key questions to consider when choosing between edge and cloud computing.

Performing computations in the cloud can work well when you have a high-bandwidth, low-latency, and stable connection to the internet, as you will need to send your data back and forth between cloud servers and your device. If your device is intended to be used, for instance, in a home or office with a good internet connection, this back and forth can be done relatively seamlessly.

In most cases, if the computation is done on the edge, it won't be affected by a poor or lost internet connection in a remote location; the processing can continue since it does not depend on the cloud. You wouldn't want your vehicle's object detection to stop working on a long road trip; that's one of the reasons why autonomous vehicles frequently perform computations like object detection on the edge.

Edge computing can be ideal in cases where your customer needs response times from your device to be faster than what can be achieved with a decent network connection, such as monitoring vital components of a system. The latency of the travel time between the device and the cloud can be reduced or eliminated completely. As a result, the data can be processed right away. If the data processing itself is quick, you could achieve real-time responses from your device.

Cloud computing is beneficial when device use is intermittent. Smart home devices are a good example of this again, where running computations in the cloud lets you share the same computing resources between multiple customers. This reduces costs by avoiding the need to provision your device with upgraded hardware to run the data processing.

Computing on the edge is useful if you only care about the result of your data after it has been processed. You can send to the cloud only what is important to store long term, which reduces the cost of storing and processing data in the cloud. For example, if you are creating a traffic surveillance device that needs to report levels of congestion on a road, you could pre-process the video on the edge instead of streaming hours of raw video to the cloud, and only send images or clips when traffic is present.
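A sketch of that traffic-camera idea might look like the following, where per-frame "motion scores" stand in for a real on-device vision model, and the upload threshold is an assumed tuning parameter; everything here is hypothetical illustration.

```python
# Assumed tuning parameter: frames scoring below this stay on the device.
UPLOAD_THRESHOLD = 0.5

def filter_frames(frames):
    """Edge-side pre-filter: keep only frames whose motion score crosses
    the upload threshold, so only interesting data leaves the device."""
    return [f for f in frames if f["motion_score"] >= UPLOAD_THRESHOLD]

# Simulated frames; in practice the score would come from a local model.
frames = [{"id": i, "motion_score": s}
          for i, s in enumerate([0.1, 0.05, 0.7, 0.9, 0.2, 0.55])]
to_upload = filter_frames(frames)
print([f["id"] for f in to_upload])  # [2, 3, 5]
```

Of six frames, only three are queued for upload, which is exactly the bandwidth and storage saving the paragraph above describes.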

It's possible you need to keep the data to build out your machine learning dataset, or you plan on analyzing the raw data in other ways in the future. If you are already sending your raw data to the cloud, it may be ideal to perform calculations in the cloud as well.

If you expect your device will be restricted in power and size, given that it has a good network connection, sending the computing work to be done on the cloud will allow your device to remain small and low-power. Google Home and Amazon Alexa, for example, will capture the audio and send it to the cloud for processing, allowing complex computations to be run on the audio that would not be possible to run on the small computers inside the devices themselves.

If you are making a consumer device and the method you are using to process data is part of your Intellectual Property (IP), you may need to consider how you plan to protect it. Putting your IP on your device without a robust security plan can leave it vulnerable to hacks. If you dont have the knowledge or resources to secure your IP on the edge, it may be best to leave it on the cloud, which already has security measures in place.

There are quite a few things to consider when choosing between computing on the edge or in the cloud. In complex problems, you may benefit from using a combination of both by leaving some parts of your processing on the edge and the rest on the cloud.


Google Cloud announces upcoming regions in Malaysia, Thailand and New Zealand – TechCrunch

Fresh off of an expansion to Mexico, Google today previewed the launch of new Google Cloud regions concentrated in Asia-Pacific (APAC), specifically Malaysia, Thailand and New Zealand. When they come online, they'll bring Google's total number of cloud regions to 34, short of Azure's more than 60 but ahead of AWS' 26.

In the context of cloud computing, a region is a specific geographic location where users can deploy cloud resources. At a minimum, all Google Cloud regions offer services including Compute Engine, Google Kubernetes Engine, Cloud Storage, Persistent Disk, CloudSQL, Virtual Private Cloud, Key Management System, Cloud Identity and Secret Manager. Additional products usually come online within six months of a new regions launch.

In a blog post, Google cites data from IDC projecting that total spending on cloud services in APAC (excluding Japan) will reach $282 billion by 2025. Other research agrees. According to a 2021 survey from Information Services Group, cloud services accounted for more than 84% of APAC's IT and business services spending in Q3 2021, by far the greatest percentage of any region.

AWS is also eyeing the opportunity, having recently outlined a two-year plan to set up cloud zones in Auckland, Manila, Bangkok and elsewhere in APAC. Google Cloud's move would appear to be a shot across the bow.

"The new Google Cloud regions will help to address organizations' increasing needs in the area of digital sovereignty and enable more opportunities for digital transformation and innovation in APAC," Google Cloud's Daphne Chung said in a statement. "With this announcement, Google Cloud is providing customers with more choices in accessing capabilities from local cloud regions while aiding their journeys to hybrid and multicloud environments."

Google Cloud's continued growth comes as it fights for dominance in the ultra-competitive and potentially lucrative cloud computing market. Flexera's latest State of the Cloud report shows Google Cloud several percentage points behind AWS and Azure in terms of usage and adoption. On the other hand, Google Cloud surpassed $6 billion in quarterly revenue for the first time in Q2 2022, signaling resilience.

Read more:
Google Cloud announces upcoming regions in Malaysia, Thailand and New Zealand - TechCrunch

Read More..

Dataminr Named to the 2022 Forbes Cloud 100 – PR Newswire

For the sixth consecutive year, Dataminr ranks among the world's top private companies powered by cloud computing

NEW YORK, Aug. 9, 2022 /PRNewswire/ -- Dataminr, the leading real-time information discovery platform, today announced that it has been named to the Forbes 2022 Cloud 100 list, the definitive ranking of the top 100 private cloud companies in the world, published by Forbes in partnership with Bessemer Venture Partners and Salesforce Ventures.

"Dataminr has revolutionized the way global corporations, first responders, NGOs and newsrooms discover high-impact events and emerging risks," said Dataminr Founder and CEO Ted Bailey. "It's a point of great pride that our work has been recognized by the Forbes Cloud 100 for the sixth consecutive year."

Dataminr's inclusion on the list marks another milestone in the company's continued momentum in 2022. Following its successful $475M growth capital financing, Dataminr made two strategic acquisitions in 2021: UK-based WatchKeeper, a data geovisualization platform, and Copenhagen-based Krizo, a real-time crisis response platform, both of which are now fully integrated into Dataminr Pulse and available to all customers. In July 2022, Dataminr announced a new strategic advisory partnership with NightDragon, focused on the convergence of cyber and physical security.

For the seventh straight year, the Cloud 100 reviewed submissions from hundreds of cloud startups and private companies. The Cloud 100 evaluation process ranked companies across four factors: market leadership (35%), estimated valuation (30%), operating metrics (20%), and people & culture (15%). For market leadership, the Cloud 100 enlists a judging panel of public cloud company CEOs who assist in evaluating and ranking their private company peers.

"The companies of the Cloud 100 list represent the best and brightest emerging companies in the cloud sector," said Alex Konrad, senior editor at Forbes. "Every year, it gets more difficult to make this list, meaning even more elite company for those who do. Congratulations to each of the 2022 Cloud 100 honorees."

"The public markets may be in turmoil, but the private valuations of the Cloud 100 continue to rise. All of the 2022 Cloud 100 honorees, again, have reached the $1 billion valuation milestone, and the average Cloud 100 valuation has skyrocketed to $7.4 billion," said Mary D'Onofrio, partner at Bessemer Venture Partners. "Despite the market correction in 2022, our confidence in the cloud economy continues to grow. Today over 70% of the 2022 Cloud 100 honorees have reached or exceeded $100 million in annual recurring revenue, making them cloud Centaurs. An additional 10% of the list is expected to hit this milestone by the end of the year, furthering our conviction that this year's honorees truly represent the best cloud companies globally."

"Great companies are born out of all environments, and it's exciting to see the continued momentum in the cloud sector," said Alex Kayyal, managing partner, Salesforce Ventures. "The companies on this list have gone through a rigorous selection process, and join an esteemed alumni list of Cloud 100 companies. As the need for digital transformation continues to drive innovation and efficiencies across industries, we can look to these companies as the absolute best in cloud computing."

The Forbes 2022 Cloud 100 and 20 Rising Stars lists are published online at http://www.forbes.com/cloud100. Highlights of the list appear in the August/September 2022 issue of Forbes magazine.

About Dataminr

Dataminr delivers the earliest warnings on high impact events and critical information far in advance of other sources. Recognized as one of the world's leading AI businesses, Dataminr enables faster response, more effective risk mitigation and stronger crisis management for public and private sector organizations spanning global corporations, first responders, NGOs, and newsrooms. Recently valued at $4.1B, Dataminr is one of New York's top private technology companies, with 900+ employees across eight global offices.

Since its founding in 2009, Dataminr has created the world's leading real-time information discovery platform, which detects digital patterns of emerging events and critical information from public data signals. Today, Dataminr's leading AI platform performs trillions of daily computations across billions of public data inputs from over 300,000 unique public data sources. The company has been recognized for its groundbreaking AI platform and rapid revenue growth by Forbes AI 50 and Deloitte Fast 500, and has been named to Forbes Cloud 100 for six consecutive years.

Alongside Dataminr's corporate product, Dataminr Pulse, the company provides public sector organizations with its First Alert product for first response, including the United Nations, which relies on First Alert in over 100 countries. Dataminr for News is used by more than 650 newsrooms and by over 30,000 journalists worldwide.

About Bessemer Venture Partners

Bessemer Venture Partners helps entrepreneurs lay strong foundations to build and forge long-standing companies. With more than 135 IPOs and 200 portfolio companies in the enterprise, consumer and healthcare spaces, Bessemer supports founders and CEOs from their early days through every stage of growth. Bessemer's global portfolio includes Pinterest, Shopify, Twilio, Yelp, LinkedIn, PagerDuty, DocuSign, Wix, Fiverr and Toast and has $19 billion of regulatory assets under management. Bessemer has teams of investors and partners located in Tel Aviv, Silicon Valley, San Francisco, New York, London, Boston, Beijing and Bangalore. Born from innovations in steel more than a century ago, Bessemer's storied history has afforded its partners the opportunity to celebrate and scrutinize its best investment decisions (see Memos) and also learn from its mistakes (see Anti-Portfolio).

About Forbes

Forbes champions success by celebrating those who have made it, and those who aspire to make it. Forbes convenes and curates the most influential leaders and entrepreneurs who are driving change, transforming business and making a significant impact on the world. The Forbes brand today reaches more than 150 million people worldwide through its trusted journalism, signature LIVE and Forbes Virtual events, custom marketing programs and 47 licensed local editions in 80 countries. Forbes Media's brand extensions include real estate, education and financial services license agreements.

About Salesforce Ventures

Salesforce Ventures helps enterprising founders build companies that reinvent the way the world works. Since 2009, we've invested in and partnered with more than 400 of the world's most tenacious enterprise software companies from seed to IPO, including Airtable, Databricks, DocuSign, Guild Education, Hopin, monday.com, nCino, Snowflake, Snyk, Stripe, Tanium, and Zoom. Salesforce Ventures leverages our decades of expertise in the cloud and our long-term relationships with key decision-makers at thousands of businesses around the world to give our portfolio companies an unfair advantage, help them build credibility, and accelerate growth. Salesforce Ventures has invested in more than 25 countries with offices all over the world including in San Francisco, Irvine, New York, London, Tokyo, and Sydney. Follow @SalesforceVC and learn more at http://www.salesforceventures.com.

Contact: Nikki Horn, Head of Events, Bessemer Venture Partners, [emailprotected], T +1 949 400 3355, salesventures.com

SOURCE Dataminr

Original post:
Dataminr Named to the 2022 Forbes Cloud 100 - PR Newswire

Read More..

Doug Lane: Capgemini to Help Army Expand Adoption of Cloud Computing Tech – ExecutiveBiz

TYSONS CORNER, VA, Aug. 9, 2022 -- Capgemini Government Solutions has secured a three-year contract from the U.S. Army for the modernization, transformation and growth of its Cloud Common Shared Services environment, known as cARMY, to expand the adoption of cloud computing technologies, ExecutiveGov reported.

"The cARMY project allows the Army to empower teams to make better data-driven decisions. We are honored to support it in partnership with the [Enterprise Cloud Management Agency]," said Doug Lane, president and CEO of Capgemini Government Solutions and a 2022 Wash100 Award winner.

About Executive Mosaic

Founded in 2002, Executive Mosaic is a leadership organization and media company. It provides its members an opportunity to learn from peer business executives and government thought leaders while providing an interactive forum to develop key business and partnering relationships.

Executive Mosaic offers highly coveted executive events, breaking business news on the Government Contracting industry, and delivers robust and reliable content through seven influential websites and four consequential E-newswires. Executive Mosaic is headquartered in Tysons Corner, VA.

Link:
Doug Lane: Capgemini to Help Army Expand Adoption of Cloud Computing Tech - ExecutiveBiz

Read More..

Come and Learn Cloud Computing Opportunities at Tekedia Mini-MBA – Tekedia

Good People, the zen-master and one of the most amazing technology leaders on our continent will be in Tekedia Mini-MBA Live today. A First Class graduate of the University of Ilorin and a multiple-time Microsoft MVP, Olanrewaju Oyinbooke will come to teach cloud computing and its opportunities. Let's create that future so that we can predict it. In the cloud space, Olanrewaju will explain the future.

Tekedia Institute offers Tekedia Mini-MBA, a 12-week innovation management program optimized for business execution and growth, with a digital operational overlay. It runs 100% online. The theme is Innovation, Growth & Digital Execution Techniques for Building Category-King Companies. All content is self-paced, recorded and archived, which means participants do not have to be present at any scheduled time. Besides, the program is designed for ALL sectors, from fintech to construction, healthcare to manufacturing, agriculture to real estate, etc.

To join the next edition which begins Sept 12, click here.

Registration for Tekedia Mini-MBA edition 9 (Sep 12 - Dec 3, 2022) has started. Register here. Cost is N60,000 or $140 for the 12-week program.

1. Advance your career, run your business better with Tekedia Mini-MBA (Sep 12 - Dec 3, 2022): cost is N60,000 naira ($140). Click and register here.

2. Click and register for Tekedia Startup Masterclass and master business secrets from start-up to unicorn. Cost is N180,000 naira ($400).

3. Click to join Tekedia Capital Syndicate and own a piece of Africa's finest startups with a minimum of $10,000 investment.

Originally posted here:
Come and Learn Cloud Computing Opportunities at Tekedia Mini-MBA - Tekedia

Read More..

What Is the Industrial Edge? – Acceleration Economy

In Episode 6 of the Cutting Edge podcast, Leonard Lee analyzes what sets the Industrial Edge apart, explains the Purdue model of cybersecurity, and asks what businesses moving to the cloud are willing to sacrifice to bolster data security.

00:23 Leonard Lee covers the Industrial Edge: what makes it different, and why it matters to an organization's future.

00:47 Two different descriptions of the Industrial Edge are given. Leonard suggests the Industrial Edge is simply the environment where we see the operations of a business take place.

02:02 IT organizations are generally more familiar with an architecture that differs from how operational systems are traditionally built. Most often, operational technology (OT) folks are working off of the Purdue Enterprise Reference Architecture (PERA) reference model.

03:04 One of the key features of the Purdue model is its take on security architecture. Leonard explains the importance of air gapping, or separating protected systems from the internet.

04:15 Introducing technologies and deployment models like cloud computing into the OT world also introduces a host of security issues. With increased flexibility and interoperability comes an inherent risk of a security breach.

04:56 Leonard examines why the principles guiding the Purdue model are still relevant. As cloud computing moves towards the edge, it needs to adjust to new realities; there is a price to pay for security.

06:12 How do you apply PERA in a smart home or home automation system to protect consumer privacy?

06:35 Not everything needs to go to the public cloud. Edge is where businesses' and customers' lives happen.

07:25 Leonard concludes his remarks and previews the Cutting Edge column.

View original post here:
What Is the Industrial Edge? - Acceleration Economy

Read More..

Inside the rush to build ‘superclouds’ – SiliconANGLE News

Liberty Mutual Insurance Co. is in the midst of a massive cloud migration that will affect more than 40,000 systems and applications across the globe running on everything from Windows servers to mainframes.

The company has already moved 68% of its workloads to the public cloud and aims to slim down from three data centers to just one by 2024. "Our ultimate goal is to get 100% to cloud," said Eric Drobisewski, senior architect for global digital services at the insurer, which employs 45,000 people in 29 countries.

But Liberty Mutual isn't tying itself to a single cloud. Part of its migration strategy is to build or re-platform applications on top of an abstraction layer that makes the underlying cloud service invisible.

"In the end, these resources are all utilities and we need to treat them the way we'd treat home electricity," Drobisewski said.

Liberty Mutual's Drobisewski: "We need to treat [cloud resources] the way we'd treat home electricity." Photo: LinkedIn

Not long ago, such an ambitious strategy would have been unthinkable. As recently as 2019, the concept of a comprehensive multicloud environment was considered a pipe dream. But technology providers and their customers have been hacking away at the problem and are now beginning to build applications both for internal use and commercial sale that combine resources from multiple public and private cloud platforms in a way that is nearly invisible to the user.

This extension of multicloud computing goes by various names. SiliconANGLE's research affiliate Wikibon adopted the term "supercloud," coined by Cornell University researchers in 2017. Others have referred to such platforms as metaclouds, cross-clouds and even clouds of clouds. The nomenclature matters less than the expected payoffs.

"We're able to bring new insurance products and integrate them back into consumers' hands much more quickly," Drobisewski said. "We've invested heavily in allowing software developers to move quickly with modern toolsets to be more effective and faster."

The concept of a supercloud is less revolutionary than evolutionary. "It's industry clouds and multiclouds munged together," said Gartner Inc. analyst Craig Lowery. "This is a continuation of edge, hybrid and multicloud technology stacks that brings more immediate value."

A recent survey of 1,800 IT decision-makers by VMware Inc. found that 73% said their enterprises use two or more public clouds today and 81% plan to do so by 2024.

Gartner's Lowery: "Supercloud is a continuation of edge, hybrid and multicloud technology stacks that brings more immediate value." Photo: Twitter

"It's been growing as a thing for the last five years and somebody just gave it a name," said David Linthicum, chief cloud strategist at Deloitte LLP. "The idea is to stop building security and operations systems three times and instead use a layer of technology above the clouds that provides all that functionality."

There are sound business reasons behind that goal. The VMware study found that organizations that leverage multiple clouds with automated operations and secure access to applications and data from any device and location release new applications 42% faster and spend 41% less time fiddling with infrastructure. Liberty Mutual expects its supercloud to reduce annual IT expenses by 28% through 2024 and eventually eliminate as much as 40% of fixed-run costs, Drobisewski said.

But the mechanics of building superclouds are a lot trickier than the concept. Basically, each public cloud provider does things a little bit differently, ranging from the way it stores data to how it manages networks. Abstracting each provider's infrastructure into a common service layer runs the risk of also abstracting away the unique value each provides.

Third-party vendors have come up with some solutions. They say that in most cases, they can not only preserve each cloud service provider's unique value but can even improve service quality by building on top of the common layer.

However, at this point, there is no governing standards body or set of generally accepted tools for building superclouds. Most solutions are handcrafted and unique, a fact that's likely to hold back supercloud adoption until standards become clearer.

"The drive to create those standards won't come from the public cloud vendors because they have an incentive to keep you in their clouds," said Danny Allan, chief technology officer at Veeam Software Corp., a maker of backup and data protection software. "It will come from outside vendors or an industry working group, and there's no such effort now that I know of."

Still, a lack of consensus isn't likely to slow the trend. "At the end of the day it is, in essence, an abstraction that gives enterprises what they call their four-plus-one strategy: one cloud that uses all the major cloud service platforms plus whatever is on-premises," said Steve Mullaney, chief executive of Aviatrix Systems Inc., which sells a cross-cloud networking platform.

Long-term, "the infrastructure should be completely transparent," Allan said. "Customers should choose the consumption rate and be able to move seamlessly across infrastructures."

Veeam's Allan: "Long-term, the infrastructure should be completely transparent." Photo: SiliconANGLE

In the commercial software arena, superclouds are becoming commonplace and even emerging from companies outside of the traditional technology sphere.

For example, the Goldman Sachs Financial Cloud, which was launched last November by Goldman Sachs Group Inc., delivers analytics tools developed internally by the financial services firm on top of the Amazon Web Services Inc. cloud. Goldman Sachs expects the package both to generate revenue and to differentiate itself from other financial firms.

Deloitte LLP's ConvergeHealth is one of a series of vertical-market commercial services the company is assembling from multiple clouds. Capital One Financial Corp. recently entered the software business with a suite of data management tools it developed on top of Snowflake Inc.'s cross-cloud data warehouse. It sees its Slingshot cloud manager as the first of a line of cloud data management products that will create a new revenue stream.

"It's challenging to bring new software to the world, but our teams have perfected ways to build [software-as-a-service] with security, resiliency, performance and scale," said Salim Syed, Capital One Software's vice president of engineering. "We feel we have a very good product."

Snowflake is one of the most advanced commercial supercloud providers, according to Wikibon, with a multicloud platform that spans all three major infrastructure-as-a-service platforms (AWS, Microsoft Corp.'s Azure and Google Cloud) while making the location of data transparent to users, according to Christian Kleinerman, Snowflake's senior vice president of product.

Snowflake's Kleinerman: "For customers, multicloud portability has gone in importance from a one or two to a nine or 10." Photo: SiliconANGLE

"We didn't want to become another version of silos in the data center," he said. "It was important to have a single, central system that interconnects them all." The technology the company developed, called Snowgrid, enables people to collaborate on a single copy of data with a common set of controls and governance policies, regardless of where the data physically resides.

As recently as three years ago, few prospective customers asked for such features, but over the last year they've realized the value of having a single stack, Kleinerman said. "Multicloud portability has gone in importance from a one or two to a nine or 10." Cloud independence is now a major reason customers come to Snowflake.

Other data management vendors such as MongoDB Inc., Couchbase Inc. and Databricks Inc. also tout cross-cloud compatibility as a selling point. MongoDB is a developer-friendly platform that is moving to a supercloud model, running document databases very efficiently and creating a common developer experience across clouds, Wikibon Chief Analyst David Vellante recently wrote.

Dremio Corp., a high-profile distributed data startup, addressed the problem by building an architecture that processes queries in a distributed fashion on the infrastructure where the data lives. "We connect to all these different things and push down the query processing to that system," said CEO Tomer Shiran. "We will actually spin up Azure or [AWS] EC2 instances with our code running on them."

Such technical wizardry is typical of the solutions that developers are inventing to deal with the supercloud's inherent complexity. "It's a lot of do-it-yourself stuff right now as far as what the stacks should look like," said Deloitte's Linthicum. "There haven't been a lot of people thinking about it until recently because, until the last year, there wasn't a lot of interest in it."

The need to bridge cross-cloud incompatibilities has been driven by several factors. One is the rise of edge computing, an architecture that distributes processing across a wide network of devices and compute nodes. Each of the big cloud providers has its own edge strategy, but enterprises with far-flung networks don't want to be tied to a single provider.

A big reason for that is latency. Edge devices, particularly those that collect data in real time, need to be close enough to a cloud data center, or region, to enable the high-speed communication that is needed for rapid decision-making. For latency-sensitive applications, that distance may be as little as 100 miles.

Deloitte's Linthicum: "We don't want to limit the ability of innovators to use best-of-breed services." Photo: David Linthicum

"Where you place elements of your workloads matters," said Matt Baker, senior vice president of corporate strategy at Dell Technologies Inc. "Latency of more than 10 milliseconds can kill some applications. Locality becomes critically important." Superclouds give organizations more latitude in which cloud regions to use.
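The 100-mile figure squares with a back-of-the-envelope propagation calculation. Signals in optical fiber travel at roughly two-thirds the speed of light, so round-trip distance alone sets a floor on latency:

```python
# Light in optical fiber travels at roughly two-thirds of c.
FIBER_SPEED_KM_S = 200_000  # approximate

def round_trip_ms(distance_km: float) -> float:
    """Propagation delay only; routing and processing add more."""
    return 2 * distance_km / FIBER_SPEED_KM_S * 1000

print(round_trip_ms(161))   # ~100 miles: about 1.6 ms of round-trip delay
print(round_trip_ms(1000))  # about 10 ms -- already at the quoted budget
```

Real paths add routing hops, queuing and processing time, which is why the usable radius is far smaller than the raw 10-millisecond budget would suggest.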

Snowflake touts its multiregion reach as a strength. Once a customer makes a query, all the code that runs in a specific region is native to that region, Kleinerman said. Instead of a software translation layer, the user model is the point of abstraction.

A second factor is simplicity. Businesses don't want to have to wrestle with the fine points of each cloud provider's operating and management stacks, particularly at a time when IT skills are in desperately short supply.

"If you're having trouble hiring for your AWS cloud, how are you going to add Azure into that?" asked Amanda Blevins, chief technology officer for the Americas at VMware.

"Economics and labor scarcity mean that the dumb thing would be to solve every security and FinOps problem for each cloud and keep around the skill sets to run them," said Deloitte's Linthicum. "We're going to reach a complexity state where the number of tools and talents we need exceeds the operations budget." FinOps is the practice of creating visibility and accountability to manage cloud spending throughout an organization.

Like many firms, Dremio developed its own workarounds for managing distributed data across clouds, said CEO Shiran. Photo: SiliconANGLE

The VMware study cited these low-level compatibility issues as a major disadvantage of the current multicloud landscape. "For developers, each cloud provider has unique infrastructure, interfaces and APIs that add work and slow the pace of their releases," it said. "Each additional cloud increases the complexity of their architecture, fragmenting security, performance optimization and cost management."

Snowflake's Kleinerman likened the current situation to the need for smartphone developers to build functionally identical applications for both Apple Inc. and Android platforms. "Developers are 10 times more excited than CIOs about this," he said. "Instead of building three versions of one app, you can write it once and run it in multiple locations."

A third motivator is to gain access to the offerings from the different cloud service providers that best meet customers' needs. Google, for example, is widely recognized as having the best analytics tools, while Microsoft's business applications are its strength. "We don't want to limit the ability of innovators to use best-of-breed services," Linthicum said.

But there is a multitude of impediments to be overcome. One of the biggest is data gravity, or the difficulty of moving large amounts of data between clouds. Organizations building sophisticated data analytics and artificial intelligence training models don't want to wait hours for a terabyte of data to move from one cloud to another.

"A lot of the solutions for shifting workloads don't address the data challenge," said Liberty Mutual's Drobisewski. "Data mobility in many ways is the most challenging problem right now."

Distributed data management vendors have come up with some clever ways to address the gravity problem, usually involving distributing queries to the infrastructure where the data resides. "You can't be transferring terabytes of data to do a join," said Dremio's Shiran. His company uses local caching and technologies such as the nonvolatile memory express storage access and transport protocol, "so we don't have to keep going back to the [original] resource for every single input and output."
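The caching idea is straightforward to illustrate. This toy sketch (not Dremio's implementation) memoizes a simulated remote read so that repeat accesses never cross the network again:

```python
import functools

# Counter tracking how often the simulated remote store is actually hit.
CALLS = 0

@functools.lru_cache(maxsize=1024)
def fetch_block(provider: str, key: str) -> bytes:
    """Stand-in for an expensive cross-cloud object read; results are cached
    locally, so only the first access pays the network-transfer cost."""
    global CALLS
    CALLS += 1
    return f"{provider}:{key}".encode()

fetch_block("aws", "part-0001")
fetch_block("aws", "part-0001")  # served from the local cache
print(CALLS)  # 1 -- the remote store was hit only once
```

A production system would also bound cache staleness and evict by size, but the principle is the same: move the query to the data once, then answer repeat reads locally.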

Each cloud service provider also has its own approach to networking, security and backup, and those are burdens to developers, Drobisewski said. "We're looking at how we can have a common protocol that allows you to interact with all of those equally, with a common API layer, so you're not that worried about which cloud provider you're working with."
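A common API layer of the kind Drobisewski describes typically looks like a narrow interface with provider-specific backends behind it. This is a hypothetical sketch, not Liberty Mutual's code; `InMemoryStore` stands in for a real provider adapter:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Minimal common interface; each provider backend hides its own API."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend; a real one would wrap a cloud provider SDK."""
    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data
    def get(self, key: str) -> bytes:
        return self._objects[key]

def archive(store: ObjectStore, key: str, data: bytes) -> bytes:
    # Application code sees only the common interface, never the provider.
    store.put(key, data)
    return store.get(key)

print(archive(InMemoryStore(), "policy-42", b"claims data"))
```

Swapping providers then means writing one new backend class, not touching every call site, which is exactly the portability the common layer is meant to buy.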

Aviatrix built a supercloud that optimizes network performance and automates security across multiple CSPs. "We actually improve the functionality," Mullaney said. "The CSPs provide primitive networking and are limited to a shared service designed for millions of small customers. We not only connect across all of them but also add in advanced services." The approach appears to be resonating with customers: Mullaney said Aviatrix is on track to book $100 million in annual recurring revenue this year.

Maribel Lopez: "Supercloud has some real problems to solve around authentication, identity, data lineage and data security." Photo: SiliconANGLE

Then there's the problem of data portability. Each CSP favors a different storage protocol, which doesn't necessarily work with another's. Each also offers different kinds of block, file and object storage. "Making a storage system look the same across every provider takes some doing," Linthicum said.

Here, again, third parties are inventing solutions. Snowflake uses external tables that interact with each providers preferred storage format and loads data into a neutral format.

Veeam addressed the problem with a self-describing file system similar to that used in compression formats such as ZIP and RAR. The compressed object includes not only files but also the software needed to decompress them. "It's a file system within a file," Allan said. "It enables the supercloud because now you have a portable, self-describing thing that can be moved anywhere, powered on, and it knows the format of the host."
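The "self-describing" idea can be illustrated with a toy archive that bundles a payload with a manifest describing its own format; this is a sketch of the concept, not Veeam's actual file format:

```python
import io
import json
import zipfile

def pack(data: bytes, fmt: dict) -> bytes:
    """Bundle the payload with a manifest describing its own format,
    so the receiving host needs no out-of-band knowledge to read it."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        z.writestr("manifest.json", json.dumps(fmt))
        z.writestr("payload", data)
    return buf.getvalue()

def unpack(blob: bytes) -> tuple[dict, bytes]:
    """Any host can recover both the format description and the payload."""
    with zipfile.ZipFile(io.BytesIO(blob)) as z:
        return json.loads(z.read("manifest.json")), z.read("payload")

blob = pack(b"vm-image-bytes", {"format": "raw", "version": 1})
manifest, payload = unpack(blob)
print(manifest["format"], len(payload))  # raw 14
```

Because the blob carries its own description, it can be moved to any infrastructure and interpreted there, which is the portability property the quote is describing.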

Security is also a multicloud hairball. "Each cloud provider has its own security tools and approaches," the VMware study concluded. "In addition to implementing security controls in individual clouds, enterprises must also secure communication between clouds and their respective workloads, applications and end users."

"We've got some real problems to solve around authentication, identity, data lineage and data security," Maribel Lopez, founder and principal analyst at Lopez Research, said in a SiliconANGLE Supercloud22 interview. "Those are going to be sort of the tactical things we're working on for the next couple of years."

At Liberty Mutual, supercloud security has been a huge focus, Drobisewski said. In the wake of COVID-19 lockdowns, the company adopted a zero-trust model for perimeter security and has since applied access controls down to the individual cloud API. "As we get more cloud-native architectures in place, we're looking to move our focus on zero trust beyond redefining the perimeter and taking a more workload- and application-centric approach," he said.

Finally, the observability challenges of monitoring even a single hybrid cloud are daunting. Sophisticated tooling will be needed to manage supercloud environments that may encompass thousands of services. "If any service suffers an outage it could cause you to have an even bigger outage," Lowery said. "It's a question of not knowing when you're going off a cliff."

All of those solutions have one thing in common: They are bespoke projects that are unique to individual vendors and user organizations. Are broad industry standards likely to emerge? Some efforts are underway.

Crossplane, for example, is an open-source project being incubated by the Cloud Native Computing Foundation that's intended to let organizations build cross-cloud control planes. However, it requires users to run software containers and the Kubernetes container orchestrator, cloud-native constructs that don't apply to most legacy applications.

The CNCF can make things happen from a Kubernetes perspective, but theyre fairly limited to containers, said Veeams Allan. Kubernetes workloads are certainly rapidly expanding but the vast majority of workloads are images running on bare-metal or virtualization layers that cant easily be moved across platforms.

VMware is one of the most prominent providers bidding to become the arms dealer for superclouds. Theres been a big shift to cross-cloud services at VMware to let customers run workloads where they choose, said VMwares Blevins. We have those higher-level services to be able to manage and observe.

For example, the company's vRealize cloud management suite, CloudHealth FinOps application, Secure Access Service Edge and Tanzu Observability platform have all been adapted to support multiple clouds. The company's virtual desktop infrastructure can play a part in unifying clouds at the user level. VMware also has a strong portfolio of edge services and relationships with all the major CSPs.

"The hyperscalers' partnerships with us are recognition that this is something customers want and need," Blevins said.

Dell's Baker: "In the early days of architectural shifts, the best thing to do is to use as open an ecosystem as possible." Photo: SiliconANGLE

All this raises the question of whether the big public cloud providers will ever give in and agree to cooperate in the name of making superclouds possible. The expert consensus is that won't happen soon, if ever. "Letting users run on [other clouds] or in their data centers isn't part of their business model," Blevins said.

There are signs, however, that even the biggest of the big now acknowledge that customers favor more interoperability and that a rising tide will ultimately lift all boats. "The reality is that they're going to make more money if the supercloud is successful," said Linthicum. "Adoption of cloud computing will go up. Everybody's going to win."

Lowery said the big cloud providers may have concerns about third parties taking over the relationship with their customers, but they don't have much of a choice. "It won't be possible for the hyperscalers to build superclouds for what everyone wants. Ultimately, they will see this as a way to sell more," he said.

Dell's Baker believes that all cloud services will ultimately be hybrid. "In the early days of architectural shifts, the best thing to do is to use as open an ecosystem as possible as opposed to carving out a stack, as each hyperscaler has done so far," he said.

That doesn't mean underlying infrastructure is ever likely to be completely abstracted. For example, private networking services typically establish a direct link between the customer and a particular cloud vendor. Some applications will be best built to take advantage of a particular database or analytics suite. And the supercloud may actually give platform providers more incentive to develop services that don't lend themselves to cross-cloud abstraction.

Nevertheless, the overall trend is clear, and that's good news for organizations that have struggled with years of complexity. "Clouds are really very sophisticated operating systems with services that meet business needs, not just programmer needs," Lowery said. "We're moving away from operating systems and focusing on the business value. That will continue."

Indeed, said Linthicum, it's the single most exciting thing in cloud computing. "It's a tectonic shift," he said. "It's absolutely the right thing to do, but there's a tremendous amount of work still to be done."

Go here to see the original:
Inside the rush to build 'superclouds' - SiliconANGLE News
