
Post-Quantum Cryptography (PQC) in Industrial and Critical Infrastructure Networks | by Cyber Safe Institute | Jul, 2024 – Medium

The rapid development of quantum computers poses a significant threat to current cybersecurity practices. While this has spurred considerable development in quantum-resistant cryptography, particularly in IT, integrating these advancements into the realm of industrial and critical infrastructure networks requires a tailored approach. The unique characteristics of these networks demand specialized solutions for maintaining security in a post-quantum world.

Industrial Control Systems (ICSs) play a crucial role in critical infrastructure, encompassing various sectors like energy, water, transportation, and manufacturing. [1] These systems often rely on interconnected Cyber-Physical Systems (CPS) to manage and control physical processes. [1] A cyberattack on a critical infrastructure network could have catastrophic consequences, disrupting essential services and potentially leading to significant economic and societal damage. [2] For instance, imagine the impact of a cyberattack on a transportation system reliant on communication between autonomous vehicles and central control stations. [2] The consequences could be both economically and socially devastating.

However, securing these networks presents unique challenges. Many ICSs utilize legacy hardware with long lifespans, often exceeding 20 years. [3] This longevity makes it difficult to keep pace with the evolving landscape of cybersecurity threats and integrate modern security measures seamlessly. [3] Additionally, the need for real-time responsiveness in many critical infrastructure components, sometimes demanding reactions within milliseconds, further complicates the implementation of comprehensive cybersecurity protocols. [4] Unlike IT systems, where downtime, while inconvenient, is often tolerable, even brief disruptions in critical infrastructure operations can have severe consequences. [4] This necessitates robust cybersecurity measures specifically designed for the constraints of these vital networks.

The development of quantum computers presents a new challenge to cybersecurity by potentially rendering current cryptographic methods obsolete. Widely used public-key cryptosystems like RSA and ECC, which depend on the difficulty of factoring large numbers or solving elliptic curve discrete logarithms, are particularly vulnerable. [5, 6] Quantum algorithms, specifically Shor's algorithm, could solve these mathematical problems exponentially faster than classical algorithms, making it feasible to break these cryptosystems. [6]
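To make the threat concrete, here is a toy sketch in Python (with deliberately tiny, insecure parameters) showing how an RSA private key follows immediately once the modulus is factored, which is exactly the step Shor's algorithm would make feasible on a quantum computer:

```python
# Toy illustration (not real cryptography): RSA's security collapses once n is factored.
# Shor's algorithm would recover p and q efficiently on a quantum computer; here we
# simply assume the factors are known, to show how the private key then follows.

p, q = 61, 53          # tiny illustrative primes, hopelessly small for real use
n = p * q              # public modulus
e = 17                 # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)    # private exponent: trivial to compute once p and q are known

msg = 42
ciphertext = pow(msg, e, n)
recovered = pow(ciphertext, d, n)
assert recovered == msg
print(f"n={n}, d={d}, recovered plaintext: {recovered}")
```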

Experts estimate that a cryptographic secret's value lasts for about 15 years. [3] Given the rapid progress in quantum computing, it's crucial to implement quantum-resistant algorithms well in advance of a fully functional quantum computer becoming a reality. [6] This urgency stems from the "harvest now, decrypt later" strategy potentially employed by malicious actors, where encrypted data is collected today to be decrypted later when powerful quantum computers become available. [7] The potential for such breaches makes the transition to quantum-secure cryptography a pressing concern.

Post-Quantum Cryptography (PQC) offers a solution to the challenges posed by quantum computers. [8] PQC algorithms are based on mathematical problems that are believed to be difficult for both classical and quantum computers to solve, ensuring security even in a future dominated by quantum technology. [8] There are several families of PQC algorithms, each relying on a different hard problem, including lattice-based, code-based, hash-based, multivariate, and isogeny-based schemes.

These families present a diverse set of options for quantum-resistant cryptography, each with its own strengths and weaknesses in terms of security levels, ciphertext size, speed, and computational requirements. [10, 11] The diversity of approaches within PQC underscores the active research and development in this field, driven by the need for robust cryptographic solutions in the face of emerging quantum threats.

Integrating PQC into industrial and critical infrastructure networks presents its own set of challenges. These networks often have strict latency requirements, making the computational overhead introduced by some PQC algorithms a concern. [12] Additionally, many of these systems rely on legacy devices with limited computational power and memory, making it challenging to implement complex cryptographic protocols. [12]

Furthermore, the lack of a unified global standard for PQC poses additional challenges. [13] Different countries and organizations are independently developing their own PQC standards, potentially leading to a fragmented landscape with various protocols and implementations. [13, 14] This lack of uniformity could create interoperability issues and complicate the integration of PQC into global critical infrastructure networks, as companies operating in different regions might need to adhere to different standards, increasing complexity and costs.

To effectively integrate PQC into industrial and critical infrastructure networks, several key steps need to be taken:

The integration of PQC into industrial and critical infrastructure networks is not merely a technological upgrade; it's a necessity for ensuring the security and resilience of essential services in a quantum future. While the transition presents challenges, it also offers an opportunity to rethink cybersecurity approaches and develop innovative solutions tailored to the unique demands of these vital networks. As quantum computing technology continues to advance, adopting proactive, collaborative, and innovative strategies for integrating PQC will be paramount in ensuring the continued functionality and security of critical infrastructure systems.

Visit link:
Post-Quantum Cryptography (PQC) in Industrial and Critical Infrastructure Networks | by Cyber Safe Institute | Jul, 2024 - Medium


Revolutionizing Success in the Quantum Information System – openPR

Quantum Information System Service Market

The global Quantum Information System Service market report covers international markets as well as development trends, competitive landscape analysis, and key regions' development status. Development policies and plans are discussed, and manufacturing processes and cost structures are analyzed. The report additionally states import/export consumption, supply and demand figures, cost, price, revenue, and gross margins. The Global Quantum Information System Service market 2024 research provides a basic overview of the industry, including definitions, classifications, applications, and industry chain structure.

Get the Sample Copy of the Report at: https://www.worldwidemarketreports.com/sample/1019367

Scope of the Quantum Information System Service Market:

The Global Quantum Information System Service market is anticipated to rise at a considerable rate during the forecast period, between 2024 and 2031. The market is growing at a steady rate in 2024 and, with the rising adoption of strategies by key players, is expected to continue rising over the projected horizon.

The report also gives a 360-degree overview of the competitive landscape, profiling the following companies:

IBM, Google, Microsoft, D-Wave Systems, Rigetti Computing, IonQ

Quantum Information System Service Market Segments:

By Types:

Cloud-Based, On-Premises

By Applications:

Large Enterprises, Medium Enterprises, Small Enterprises

Request Sample Copy of this Report at: https://www.worldwidemarketreports.com/sample/1019367

Key Market Segmentation:

WMR provides an analysis of the key trends in each sub-segment of the global Quantum Information System Service market, along with forecasts at the global, regional, and country level from 2024 to 2031. The report categorizes the market based on type, offering, technology, system, and end-use industry. Its biggest highlight is a strategic analysis of the impact of COVID-19 on companies in the industry. The report also analyzes the markets of the leading 20 countries and introduces their market potential.

Drivers: A few important variables, including rising consumer demand for the product, effective marketing tactics in new markets, and significant financial investments in product development, are the primary drivers of the Quantum Information System Service market.

Challenges: The easy availability of rival offerings is one challenge in the market for Quantum Information System Service; the low cost of alternatives is another. Firms intend to overcome these obstacles by using cutting-edge technology and managing prices, which should subsequently boost product demand. The researchers have also highlighted major hurdles so that market participants can anticipate risks, adjust their plans, and carry on with operations. By doing so, producers can manage their resources properly without sacrificing product quality or timely market delivery.

Opportunities: Businesses can take advantage of market opportunities by putting the proper plans in place. The prospects described in the report assist stakeholders and report buyers in properly planning their investments and obtaining the maximum return on investment.

Trends: The market is seeing developments that help businesses devise more successful tactics. The report discusses current trends using the most recent data. Customers can get an idea of upcoming offerings on the market, and businesses can plan to produce greatly improved solutions with the use of this information.

Geographically, the detailed analysis of consumption, revenue, market share, and growth rate of the following regions:

North America (United States, Canada, Mexico); Europe (Germany, UK, France, Italy, Spain, Others); Asia-Pacific (China, Japan, India, South Korea, Southeast Asia, Others); The Middle East and Africa (Saudi Arabia, UAE, South Africa, Others); South America (Brazil, Others)

Global Quantum Information System Service Market Development Strategy Pre and Post COVID-19, by Corporate Strategy Analysis, Landscape, Type, Application, and Leading 20 Countries covers and analyzes the potential of the global Quantum Information System Service industry, providing statistical information about market dynamics, growth factors, major challenges, PEST analysis, and market entry strategy Analysis, opportunities and forecasts.

Valuable Points from Quantum Information System Service Market Research Report 2024-2031:

Significant changes in market dynamics.
Reporting and assessment of recent industry developments.
A complete background analysis, including a valuation of the parent Quantum Information System Service Market.
Current, historical, and projected size of the Quantum Information System Service Market in terms of both value and volume.
Quantum Information System Service Market segmentation by top regions.
Market shares and strategies of key manufacturers.
Emerging niche segments and regions for the Quantum Information System Service Market.
An objective assessment of the trajectory of the market.
Recommendations to top companies for strengthening their foothold in the market.

FAQs:

[1] Who are the global manufacturers of Quantum Information System Service, what are their share, price, volume, competitive landscape, SWOT analysis, and future growth plans?

[2] What are the key drivers, growth/restraining factors, and challenges of Quantum Information System Service?

[3] How is the Quantum Information System Service industry expected to grow in the projected period?

[4] How has COVID-19 affected the Quantum Information System Service industry and is there any change in the regulatory policy framework?

[5] What are the key areas of applications and product types of the Quantum Information System Service industry that can expect huge demand during the forecast period?

[6] What are the key offerings and new strategies adopted by Quantum Information System Service players?

Buy this report and Get Up to % Discount At: https://www.worldwidemarketreports.com/promobuy/1019367

Reason to Buy:

Save time on entry-level research by identifying the growth, size, leading players, and segments of the global Quantum Information System Service Market.
Highlights key business priorities to guide companies in reforming their business strategies and establishing themselves across a wide geography.
The key findings and recommendations highlight crucial progressive industry trends in the Quantum Information System Service Market, allowing players to develop effective long-term strategies to grow their market revenue.
Develop or modify business expansion plans using substantial growth offerings in developed and emerging markets.
Scrutinize in-depth global market trends and outlook, together with the factors driving the market and those restraining its growth.
Enhance decision-making by understanding the strategies that underpin commercial interest with respect to products, segmentation, and industry verticals.

About Author:

Vaagisha brings over three years of expertise as a content editor in the market research domain. Originally a creative writer, she discovered her passion for editing, combining her flair for writing with a meticulous eye for detail. Her ability to craft and refine compelling content makes her an invaluable asset in delivering polished and engaging write-ups.

(LinkedIn: https://www.linkedin.com/in/vaagisha-singh-8080b91)

Contact Us:

Mr. Shah
Worldwide Market Reports
Tel: U.S. +1-415-871-0703 | U.K. +44-203-289-4040 | Japan +81-50-5539-1737
Email: sales@worldwidemarketreports.com
Website: https://www.worldwidemarketreports.com/

About WMR:

Worldwide Market Reports is your one-stop repository of detailed and in-depth market research reports compiled by an extensive list of publishers from across the globe. We offer reports across virtually all domains and an exhaustive list of sub-domains under the sun. The in-depth market analysis by some of the most vastly experienced analysts provides our diverse range of clients from across all industries with vital decision-making insights to plan and align their market strategies in line with current market trends.

This release was published on openPR.

Originally posted here:
Revolutionizing Success in the Quantum Information System - openPR


Danish startup secures €10M seed to advance quantum computing in life sciences – ArcticStartup


Kvantify, a Copenhagen-based quantum software startup, has successfully closed a €10 million seed round. This funding will enable the startup to strengthen its position as a global leader in quantum computing, focusing initially on life sciences applications. The round is led by Danish VC Dreamcraft, alongside biotech investor Lundbeckfonden BioCapital and the private investment company 2degrees, with participation from Redstone VC, 2xN, and EIFO. Kvantify plans to use the investment to accelerate the development of quantum solutions for drug discovery and chemical simulation, aiming to address complex problems and expand applicability across various industries. This strategic funding will enhance Kvantify's innovative capabilities, ensuring a significant impact on the life sciences sector and beyond.

Kvantify is dedicated to harnessing the power of quantum computing to solve complex scientific and industrial challenges. With a strong emphasis on life sciences, Kvantify develops advanced quantum algorithms and high-performance computing solutions aimed at revolutionizing drug discovery and chemical simulation. Their mission is to make quantum technology accessible and valuable to businesses worldwide, driving innovation and efficiency in various sectors. Leveraging an interdisciplinary team and cutting-edge technology, Kvantify is positioned at the forefront of the quantum computing revolution.

The seed round is notable not only for its substantial size but also for how well the new investors match Kvantify's mission. It is led by Danish VC Dreamcraft, together with biotech investor Lundbeckfonden BioCapital and the private investment company 2degrees. Other notable investors include the international sector-focused tech investor Redstone VC, the leading Danish quantum VC 2xN, and EIFO.

Lundbeckfonden BioCapital is a large Danish investor focused on local life science companies, supporting the translation and commercialization of ground-breaking science. This is Lundbeckfonden BioCapital's first investment outside the therapeutics space.

"With our investment in Kvantify, we are broadening our footprint in and commitment to further strengthening the Danish life science ecosystem. Quantum computing can deliver accuracy and de-risking to the early stages of drug development to a level not possible with classical computers, thereby enabling faster speed to market. We are therefore excited about this opportunity and look forward to working with the Kvantify team to bridge quantum computing and drug development to the future benefit of patients," said Jacob Falck Hansen, Partner at Lundbeckfonden BioCapital.

Danish VC Dreamcraft invests in tech-driven companies, from pre-seed to series A, and has a proven track record with B2B SaaS software.

"We're thrilled to partner with the team at Kvantify as they take a significant step forward in their mission to fulfill the promise of industrial applications of quantum computers. The potential of quantum chemical computational drug discovery is massive and represents a truly exciting beachhead market. We cannot wait to see how Kvantify will help solve today's seemingly impossible problems and serve as a crucial tool in designing the solutions of the future," says Carsten Salling, General Partner at Dreamcraft.

Redstone QAI Quantum Fund is a highly specialized venture capital fund that focuses on investing in groundbreaking technologies within the quantum technologies sector.

"Kvantify's focus on applying quantum computing to life sciences and further industrial use cases across various sectors aligns with our strategic vision of advancing practical and impactful quantum solutions. With their interdisciplinary team, in-depth knowledge of quantum technology, and innovative approach to enhancing computational efficiency, Kvantify is perfectly placed to bring tremendous value to commercial markets," says Marco Stutz, Partner at Redstone.

In light of their successful product launch for a groundbreaking drug discovery tool, Hans Henrik Knudsen, CEO of Kvantify, comments:

"On behalf of the founding team, we are incredibly excited about the completion of our €10 million seed round, which marks a significant milestone for Kvantify. This funding not only validates our vision of leveraging quantum computing to revolutionize the life sciences industry but also provides us with the resources and strategic partnerships needed to accelerate our development and growth. With the support of new and existing investors, we are well-positioned to continue to bring groundbreaking solutions to market."



View original post here:
Danish startup secures 10M seed to advance quantum computing in life sciences - ArcticStartup


The Quantum Algorithm Revolution: A New Era in Computing | by Disruptive Concepts | Jul, 2024 – Medium

A futuristic representation of a quantum computer, showcasing the intricate and powerful nature of quantum technology.

Quantum computing has long been the stuff of science fiction, but recent advances have propelled it into the realm of reality. One of the most exciting developments is a quantum algorithm designed for the Planted Noisy kXOR problem, achieving a nearly quartic speedup over classical algorithms. This breakthrough leverages the power of quantum mechanics to solve problems far more efficiently than ever before. The implications for cryptography, data analysis, and beyond are profound, promising a future where quantum speedups become the norm.

Central to this quantum leap is the Kikuchi Method, a sophisticated technique that simplifies complex problems by transforming them into more manageable forms. By converting kXOR problems into 2XOR problems, the Kikuchi Method allows quantum algorithms to exploit linear algebraic methods. This transformation is crucial for achieving significant speedups, demonstrating the potential to revolutionize how we approach problem-solving in various fields, from cryptography to machine learning.
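To illustrate the linear-algebra connection, here is a small classical sketch (Python with NumPy; illustrative only, and not the quantum algorithm itself): once constraints take the 2XOR form x_i XOR x_j = b, the instance is a linear system over GF(2) and yields to ordinary Gaussian elimination.

```python
import numpy as np

# Sketch: XOR constraints are linear equations over GF(2), so a 2XOR instance
# can be attacked with Gaussian elimination. (The quantum speedup discussed in
# the article lies elsewhere; this only shows the linear-algebra foothold.)

def solve_xor_system(A, b):
    """Solve A x = b over GF(2) by Gaussian elimination; return one solution or None."""
    A, b = A.copy() % 2, b.copy() % 2
    n_rows, n_cols = A.shape
    row, pivots = 0, []
    for col in range(n_cols):
        pivot = next((r for r in range(row, n_rows) if A[r, col]), None)
        if pivot is None:
            continue
        A[[row, pivot]], b[[row, pivot]] = A[[pivot, row]], b[[pivot, row]]
        for r in range(n_rows):
            if r != row and A[r, col]:
                A[r] ^= A[row]          # eliminate this column everywhere else
                b[r] ^= b[row]
        pivots.append(col)
        row += 1
    if any(b[row:]):                    # a 0 = 1 row means no solution
        return None
    x = np.zeros(n_cols, dtype=int)     # free variables default to 0
    for r, col in enumerate(pivots):
        x[col] = b[r]
    return x

# Constraints: x0^x1 = 1, x1^x2 = 0, x0^x2 = 1
A = np.array([[1, 1, 0], [0, 1, 1], [1, 0, 1]])
b = np.array([1, 0, 1])
print(solve_xor_system(A, b))  # one satisfying assignment, e.g. [1 0 0]
```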

The heart of this breakthrough lies in the nearly quartic speedup provided by the quantum algorithm. Traditional computing methods struggle with the

Original post:
The Quantum Algorithm Revolution: A New Era in Computing | by Disruptive Concepts | Jul, 2024 - Medium


Post-Quantum Cryptography: Safeguarding Critical Infrastructure in the Quantum Age – Medium

The rapid evolution of communication technologies has ushered in an era of unprecedented interconnectedness. This hyperconnected world relies heavily on secure and private communications for critical tasks [1]. Cyber vulnerabilities in essential systems, such as those managing smart cities or automated industries, could lead to catastrophic economic and social consequences [2]. For instance, malicious intrusions into communication networks guiding autonomous vehicles could have fatal repercussions [2].

Modern warfare and crime often involve hacking activities targeting critical infrastructures (CI) [2]. These attacks aim to disrupt operations, shorten the lifespan of devices, or steal sensitive information, resulting in an estimated 2,200 known cyberattacks daily in 2022 [2]. The potential for a "Cyber Apocalypse," in which cyberattacks cripple a nation's civilian and military services by exploiting vulnerabilities in interconnected CI systems, has become a growing concern [2].

Traditional cryptographic methods, like RSA and ECC, face a significant threat from the advent of quantum computers [3, 4]. Shor's algorithm, executable on quantum computers, can solve factorization problems exponentially faster than classical algorithms, jeopardizing the security of widely used public key cryptosystems [5].

As quantum computing rapidly advances, transitioning to quantum-resistant cryptographic solutions is crucial. This urgency stems from the "harvest now, decrypt later" strategy, where malicious actors store encrypted data today to decrypt it once powerful quantum computers become available [6]. This threat necessitates proactive measures to ensure long-term cybersecurity, especially for systems with extended lifespans, like those found in CI [6].

Unlike information technology (IT) systems, which prioritize confidentiality, operational technology (OT) systems, often used in CI, demand high availability, with minimal downtime tolerance [7]. This difference highlights a key challenge in securing CI: any security solution should not disrupt the continuous operation of critical functions [7].

Implementing robust cybersecurity in CI faces further hurdles due to factors like legacy equipment, slow patching processes, and real-time responsiveness requirements, often necessitating millisecond-level reactions [7]. Integrating new cybersecurity measures into older CI, built without considering modern threats, presents a considerable challenge and cost compared to newer facilities designed with security in mind [8, 9].

Recent regulations requiring VPNs for insecure industrial protocols and the push for post-quantum encryption in critical infrastructure underscore the need for constant adaptation in industrial cybersecurity [10].

Post-Quantum Cryptography (PQC) offers a solution to the threat posed by quantum computers to classical cryptographic systems. PQC relies on mathematical problems that are difficult for both classical and quantum computers to solve, ensuring security in a post-quantum world [11].

There are seven main families of PQC algorithms; they include lattice-based, code-based, hash-based, multivariate, and isogeny-based schemes.

Among these, lattice-based cryptography appears most promising for CI due to its relatively small key sizes and lower computational costs compared to other PQC families [13]. However, recent research proposing a polynomial-time quantum algorithm for solving the Learning with Errors (LWE) problem, which underpins many lattice-based cryptosystems, warrants caution and further investigation [14].
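For readers unfamiliar with LWE, the brief Python sketch below shows the structure of the problem; the parameters are toy values far too small for any real security. Given a public matrix A and b = A·s + e (mod q), recovering the secret s is what is conjectured hard for both classical and quantum computers:

```python
import numpy as np

# Toy Learning-with-Errors instance (illustrative, insecure parameters):
# the public data is (A, b) with b = A s + e (mod q); the noise e is small.
rng = np.random.default_rng(0)
q, n, m = 97, 4, 8                 # modulus, secret length, number of samples
A = rng.integers(0, q, size=(m, n))
s = rng.integers(0, q, size=n)     # secret vector
e = rng.integers(-2, 3, size=m)    # small noise in [-2, 2]
b = (A @ s + e) % q

print("public A:\n", A)
print("public b:", b)
# Without the noise e, s would follow from plain linear algebra; the noise is
# precisely what turns this into a (conjecturally) quantum-hard problem.
```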

While not ideal for CI due to large signature sizes, hash-based cryptography has seen wider adoption. SPHINCS+, a stateless multi-time signature scheme, is being considered for standardization by Europe, Japan, and the United States [15].

Integrating PQC into CI requires careful consideration of the unique characteristics and constraints of these systems.

Latency: A primary concern is the potential latency introduced by PQC algorithms. Real-time responsiveness is paramount in OT environments, and any delays can have significant consequences [7]. Therefore, selecting and implementing PQC solutions must prioritize minimal latency to avoid operational disruptions.

Legacy Systems: Many CI rely on legacy systems with limited computational power and memory [3]. Integrating PQC into these systems without substantial hardware upgrades poses a significant challenge [4].

Flexibility and Adaptability: The PQC landscape is still evolving, with various standardization efforts globally [16]. It is crucial to implement PQC solutions with flexibility in mind, enabling adaptation to new standards and potential vulnerabilities in existing algorithms [17].
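One widely discussed pattern for such agility is hybrid key derivation, where a session key depends on both a classical and a post-quantum shared secret, so security holds as long as either primitive survives. The sketch below (Python standard library only) is a simplified, HKDF-style illustration under stated assumptions: the two input secrets are placeholders rather than outputs of real key exchanges, and the salt and context labels are hypothetical.

```python
import hashlib
import hmac

# Hedged sketch of hybrid key derivation. The session key depends on BOTH a
# classical secret (e.g. from ECDH) and a PQC secret (e.g. from a lattice KEM),
# so it stays safe while at least one of the two primitives holds.
classical_secret = b"\x01" * 32   # placeholder for an ECDH shared secret
pqc_secret = b"\x02" * 32         # placeholder for a PQC KEM shared secret

def hybrid_kdf(secret_a, secret_b, context, length=32):
    """Derive a session key from the concatenated secrets (simplified HKDF style)."""
    prk = hmac.new(b"hybrid-salt", secret_a + secret_b, hashlib.sha256).digest()
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()[:length]

session_key = hybrid_kdf(classical_secret, pqc_secret, b"ics-session-v1")
print(session_key.hex())
```

If one algorithm is later broken or deprecated by a new standard, only the corresponding input changes; the surrounding protocol stays intact, which is the operational meaning of crypto-agility here.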

Standardization: While various countries are making efforts to standardize PQC, these efforts are primarily focused on IT systems [18]. Dedicated standardization processes for PQC implementation in industrial environments are crucial to address the specific security needs and operational constraints of CI [18].

Transitioning CI to a quantum-secure state necessitates a multi-faceted approach:

Side-Channel Attacks: Although less prevalent in OT than in IT, side-channel attacks (SCA) pose a concern for CI, particularly given the increasing sophistication of remote attack techniques [22]. Research highlights vulnerabilities in industrial control environments, emphasizing the need for robust countermeasures [23]. Addressing SCA vulnerabilities, especially in the context of PQC implementation, requires careful consideration of factors like error and fault detection, particularly in lattice-based cryptography [23].

For instance, optimizing the Number Theoretic Transform (NTT), often used in lattice-based cryptography, might inadvertently create side channels, necessitating research into secure NTT implementations [23]. Additionally, developing PQC algorithms with inherent resistance to SCA is critical for ensuring the long-term security of CI.
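For context, the sketch below shows a deliberately naive O(n²) NTT: polynomial evaluation at powers of a root of unity mod q, with toy parameters chosen for readability. Production lattice implementations use fast O(n log n) butterfly networks, and it is precisely those optimized inner loops where timing side channels can creep in.

```python
# Naive Number Theoretic Transform sketch (illustrative parameters only).
q = 17          # prime with n | q - 1
n = 4           # transform size (4 divides 16)
omega = 4       # 4^2 = 16 != 1 and 4^4 = 1 (mod 17), so 4 is a primitive 4th root

def ntt(coeffs):
    """Forward NTT: evaluate the polynomial at omega^0 .. omega^(n-1) mod q."""
    return [sum(c * pow(omega, i * j, q) for j, c in enumerate(coeffs)) % q
            for i in range(n)]

def intt(values):
    """Inverse NTT: interpolate back, scaling by n^-1 mod q."""
    n_inv = pow(n, -1, q)
    omega_inv = pow(omega, -1, q)
    return [(n_inv * sum(v * pow(omega_inv, i * j, q) for j, v in enumerate(values))) % q
            for i in range(n)]

poly = [3, 1, 4, 1]
assert intt(ntt(poly)) == poly   # round-trip check
```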

The development of quantum computers presents both a challenge and an opportunity for cybersecurity. While threatening current cryptographic methods, it drives the creation of more resilient solutions. The integration of PQC into CI is not merely a technical upgrade but a crucial step in ensuring the continued functionality and security of the essential services that underpin modern society. By addressing the unique challenges of this domain and prioritizing research, development, and standardization tailored for industrial environments, we can pave the way for a future where critical infrastructure remains resilient and secure in the face of evolving cyber threats.

Read the rest here:
Post-Quantum Cryptography: Safeguarding Critical Infrastructure in the Quantum Age - Medium


AI Coin Price: Will Artificial Superintelligence Alliance Have Bullish Impact? – Bankless Times

The Artificial Superintelligence Alliance, led by SingularityNET, Fetch.ai, and Ocean Protocol, has announced the first-ever token merger in the AI space, combining the AGIX, OCEAN, and FET tokens into a new token.

This merger is predicted to establish a fully decentralized AI platform and aim to improve overall organizational efficacy in using AI while adhering to the most transparent and ethical AI advancement criteria.

Together, Fetch.ai, SingularityNET (AGIX), and Ocean Protocol comprise the Artificial Superintelligence (ASI) Alliance. The alliance seeks to expedite the development of decentralized Artificial General Intelligence (AGI) and, eventually, Artificial Superintelligence (ASI) as the largest open-sourced, independent body in AI research and development.

The alliance's three technological partners will produce products using each project's unique ecosystem-specific features to ensure ethical AI development.

Notably, on July 1, 2024, the three tokens will unite as part of phase one of the project. There will be no disruption to FET trading after the merger. In the meantime, exchanges will begin to delist the current AGIX and OCEAN tokens.

To ensure a seamless transition during the ASI rebranding across all major and minor cryptocurrency platforms, the first phase will concentrate on data aggregator websites and exchanges.

The second phase will concentrate on ASI and community development. Migration contracts for the unconverted tokens across several blockchains will be accessible during this period.

Additionally, during the next upgrade, all FET mainnet tokens will automatically convert to ASI. This process is set to run over several years to give users and investors enough time to convert their holdings.

Though not very sensitive to broader market news, AI coins and the wider segment take their cues from industry news when looking for bullish or bearish signals. The recent rise in Nvidia's market cap, as previously reported by Bankless Times, was a good example of this.

Bullish momentum and investor confidence are becoming increasingly crucial for AI-powered cryptocurrencies. Digital assets like AI coins use artificial intelligence to enhance blockchain networks' user experience, security, and scalability.

While a lack of clear fundamentals and erratic trading have kept AI coins volatile, investors may soon read today's news as an additional bullish signal, which might drive up the value of many AI coins. This, together with substantial industry expansion, will be crucial in boosting the bullish momentum of AI coins in the short term.

Read the original:

AI Coin Price: Will Artificial Superintelligence Alliance Have Bullish Impact? - Bankless Times


Understanding serverless and serverful computing in the cloud era – TechRadar

Two distinct methods have emerged in cloud computing: serverless and serverful computing. Serverless computing represents a significant departure from traditional approaches, offering exciting possibilities for innovation, operational streamlining, and cost reduction. But what exactly does it involve, and how does it differ from the established serverful model?

Serverless computing introduces an approach where you, the developer, only worry about the code you need to run, not the infrastructure around it. This narrowing of focus simplifies operational management and cuts expenses by pushing server management tasks elsewhere. As a concept, it's similar to Business Process Outsourcing (BPO) or contracting out facilities management: you concentrate on the areas where you have IP or can build value, and let someone else own the non-core processes.

In contrast, serverful computing is how most organizations have consumed the cloud: you are accountable for managing and overseeing servers, and in exchange you get the most control and customization options.

Empowering IT professionals and developers with knowledge of these approaches and their inherent tradeoffs is crucial for devising an effective cloud strategy. Your expertise and understanding are key to choosing the right approach for your business.


UK Director for Cloud Computing Technology and Strategy at Akamai.

Serverful computing, or traditional server-based computing, involves a hands-on approach to deploying applications. In this model, you are responsible for managing the servers that run your applications, which includes provisioning servers, updating operating systems, scaling resources to meet demand, access control, and ensuring high availability and fault tolerance.

This approach provides more control over your IT infrastructure. You can customize almost every aspect of your environment to suit your application. For example, you can deploy additional security controls or software, tune the kernel for maximum performance, or use specific operating systems needed to support parts of your application stack, none of which is readily achievable in a serverless environment.

On the other hand, serverless computing takes most of the complexity out of managing cloud computing infrastructure by abstracting the infrastructure away. With this abstraction, you avoid directly managing cloud servers and instead hire backend computing in an "as a service" model. There are still servers, but you no longer need to worry about them; the provider ensures they're available, patched, compliant, and secure.


Serverless and Event-Driven computing are often used interchangeably, but whilst they overlap, there are some crucial differences.

Serverless computing can be used to implement event-driven architectures because it can automatically scale to handle a varying number of events and only charges for actual execution time. For instance, a serverless function can be triggered by an event such as an HTTP request or a message in a queue.
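As a minimal sketch, assuming an AWS-Lambda-style Python handler (the event field names follow common gateway and queue event shapes, and the trigger itself is configured in the platform rather than in code):

```python
import json

# Minimal sketch of an event-driven serverless function. The same handler body
# could be invoked by an HTTP request (via an API gateway) or by a batch of
# queue messages; which trigger fires it is platform configuration, not code.

def lambda_handler(event, context):
    if "Records" in event:                       # queue-style event: a batch of messages
        payloads = [json.loads(r["body"]) for r in event["Records"]]
    else:                                        # HTTP-style event: a single request body
        payloads = [json.loads(event.get("body") or "{}")]

    results = [{"processed": p} for p in payloads]
    return {"statusCode": 200, "body": json.dumps(results)}
```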

Not all event-driven architectures are serverless, and not all serverless functions are event-driven. Event-driven systems can also be built using traditional serverful infrastructure, and serverless functions can perform scheduled tasks or be invoked directly by an API rather than being driven by events.

There is no one-size-fits-all approach, and you may find that you use both options even within a single application. In an HR system, storing employee records in a serverful database is practical to support complex or long-running queries, such as payroll processing. However, multi-stage and ad-hoc time-off requests are well-suited for a serverless application.

Serverless computing offers two primary advantages: simplicity and an execution-based cost model. By adopting serverless, businesses can manage their infrastructure more easily, as the cloud provider takes care of server provisioning, scaling, and maintenance. This approach allows developers to focus on writing and deploying applications without the burden of managing underlying servers.

Serverless computing also enhances efficiency and resource utilization, as businesses only incur costs for the actual computing power used, and only when it is used. Business leaders can also plan more simply: if the cost per transaction and the expected transaction volume are known, the monthly bill follows directly.
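A back-of-envelope sketch of that execution-based model, using hypothetical placeholder prices rather than any provider's actual rates:

```python
# Execution-based cost model sketch. All rates below are hypothetical.
price_per_gb_second = 0.0000167   # placeholder compute rate
price_per_request = 0.0000002     # placeholder per-invocation fee

memory_gb = 0.5                    # memory allocated to the function
avg_duration_s = 0.120             # average execution time per request
requests_per_month = 3_000_000

compute_cost = requests_per_month * memory_gb * avg_duration_s * price_per_gb_second
request_cost = requests_per_month * price_per_request
print(f"Estimated monthly bill: ${compute_cost + request_cost:,.2f}")
```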

When used on platforms with open standards, for example NATS.io instead of a hyperscaler's proprietary data-streaming solution, this transaction-based model can significantly reduce expenses and unlock new opportunities for innovation, freeing developers and managers to concentrate on building high-quality applications rather than dealing with infrastructure complexities.

On the other hand, serverful computing provides businesses with greater control and customisation over their infrastructure. By managing your servers, you can tailor their environment to meet specific needs and ensure high performance, reliability, and security. This approach is beneficial for applications that require consistent and long-term resource allocation, as it allows for fine-tuning and optimization that serverless models may not offer.

Additionally, serverful computing enables direct oversight of the hardware and software stack, enabling detailed monitoring and troubleshooting. This hands-on control can be crucial for enterprises with stringent regulatory requirements or those needing to handle sensitive data securely.

While serverless computing offers compelling benefits, it also presents challenges that businesses must navigate. At a smaller scale, serverless is a highly efficient way to consume cloud computing services, but when demand ramps up it can rapidly become costly, especially if platform lock-in is a factor. Think of it like taking a taxi versus buying a car. A taxi ride once a week is a cheap way to get home from the office, but taking a taxi to and from the office every day, to and from your kids' school for drop-off and pick-up, and to the shops at the weekend for groceries quickly becomes outrageously costly compared to buying a car.

To mitigate these risks, companies need to establish a culture of cost monitoring, open standards, and vendor evaluation. Choosing vendors with low or no egress fees can help control expenses, and using open standards keeps the app portable. This avoids the technical debt that comes from becoming overly reliant on a single provider's proprietary services or APIs, a dependence that hinders flexibility and increases migration complexity down the line, potentially resulting in significant refactoring costs.

Balancing the advantages of serverless computing with these challenges requires careful planning and strategic decision-making to ensure long-term success in the cloud environment.

The decision here is how you manage the tradeoffs inherent in serverful and serverless computing: control vs consume, open standards vs proprietary, fixed costs vs dynamic cost base. Looking ahead to the next six months and beyond, serverless and serverful computing are poised to continue evolving in response to changing business needs.

While offering simplicity and cost-effectiveness, serverless computing remains constrained by factors such as speed and latency, much like other cloud-based services. However, many providers have built Edge and Distributed platforms that deliver more sophisticated serverless offerings, bringing computing power closer to end-users, mitigating latency issues and enhancing overall performance.

In contrast, serverful computing will maintain its relevance, particularly for applications requiring more significant control over infrastructure, higher performance, or specific regulatory or security requirements. There will always be a place for both serverless and serverful cloud computing. As cloud technology continues to mature, we may see advancements in serverful computing that improve automation, scalability, and resource optimization, further enhancing its appeal in certain use cases.

Ultimately, the future of cloud computing lies in striking the right balance between serverless and serverful approaches, leveraging the strengths of each to optimize performance, efficiency, security, and agility in an increasingly digital world.


This article was produced as part of TechRadar Pro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc.

Link:
Understanding serverless and serverful computing in the cloud era - TechRadar


Northern Data Group to launch Europe's first cloud service with Nvidia H200 GPUs – Blocks & Files

Datacenter services outfit Northern Data Group will be the first cloud provider in Europe to offer the use of the recently launched powerful H200 GPU hardware from Nvidia.

Germany-headquartered Northern Data is made up of the Taiga Cloud, Ardent Data Centers, and Peak Mining business units. It will be Taiga Cloud that offers the GPUs in datacenters in the final quarter of this year, said Northern Data.

Northern Data Group is an Elite Partner of Nvidia. Taiga's energy-efficient cloud is said to be powered by Europe's largest cluster of Nvidia A100 Tensor Core and H100 Tensor Core GPUs, which helps organizations accelerate AI and ML innovation on demand, according to the company.

Taiga is deploying the new GPUs in partnership with Supermicro, and the move will help further build out Northern Data's high-performance computing solutions while complementing the existing GenAI technologies it offers customers.

The first full island of over 2,000 Nvidia H200 GPUs is promised to deliver 32 petaFLOPS of performance. The GPUs will utilize BlueField-3 data processing units (DPUs) and Nvidia Mellanox CX7 NICs. This Nvidia reference architecture configuration will give customers access to more bandwidth and faster, more efficient data storage access, said the provider.

The GPUs will be accommodated in one of Northern Data Groups European datacenters, powered by carbon-free, renewable energy, and have a power usage effectiveness ratio of less than 1.2, it is promised.

"Our GenAI platform is constantly evolving, and we are proud to collaborate with Supermicro to be the first in Europe to offer access to Nvidia H200 GPUs," said Aroosh Thillainathan, founder and CEO, Northern Data Group. "This is testament to our commitment to continually offer industry-leading, next-generation solutions."

"We are collaborating with Northern Data Group to expand its GenAI cloud offering to include Supermicro GPU servers based on the latest Nvidia H200 HGX GPUs," added Vik Malyala, president and managing director, EMEA, and SVP of technology and AI at Supermicro.

"We are fully committed to delivering the most performant and energy-efficient AI infrastructure and solutions, and this collaboration will accelerate availability and bring the best value for customers in Europe."

Last December, Ardent Data Centers revealed a €110 million ($119 million) investment to expand its ability to provide colocation services for HPC and AI compute power across both Europe and the US.

It signed Letters of Intent with two datacenter sites in the US, and was the preferred bidder on a strategic site in the UK. The assets will boost capacity to help address the surging demand for compute power needed for GenAI applications. The anchor tenant for the extra capacity will be sister company Taiga Cloud.

Read the rest here:
Northern Data Group to launch Europe's first cloud service with Nvidia H200 GPUs Blocks and Files - Blocks & Files


Cloud Server Market Analysis, Size, Growth, Competitive Strategies, and Worldwide Demand – openPR

Cloud Server

Get Free Exclusive PDF Sample Copy of This Research @ https://www.advancemarketanalytics.com/sample-report/33229-global-cloud-server-market?utm_source=OpenPR&utm_medium=Vinay

Major players profiled in the study are: IBM (United States), Rackspace Technology, Inc. (United States), Microsoft Corporation (United States), Google LLC (United States), HP (United States), Dell (United States), Oracle (United States), Lenovo (China), Sugon (China), Inspur (China), Cisco Systems, Inc. (United States)

Scope of the Cloud Server Market Report:
The Cloud Server market is expected to grow due to increasing business demand for maximum flexibility of resources and high demand for secure servers from enterprises. Ease of deployment, mobility, and lower costs are major factors in the growth of the cloud server market globally. Cloud servers avoid the hardware issues seen with physical servers, and they are likely the most stable option for businesses looking to keep their IT budgets down, which is boosting the market.

In Feb 2020, SiteGround, the largest independent web hosting platform trusted by the owners of more than two million domains, announced that it will be moving a big part of its infrastructure to Google Cloud.

The Global Cloud Server Market segments and market data breakdown are detailed below: by Application (Production, Development & Testing, Disaster Recovery, Others), Industry Vertical (BFSI, Retail, Government, Healthcare and Life Sciences, Telecommunication and IT, Travel and Hospitality, Transportation and Logistics, Others), and Deployment Model (Public Cloud, Private Cloud, Hybrid Cloud).

Market Opportunities: Increasing Awareness about Cloud Server among Developing Economies; Rising Number of Small and Medium-sized Enterprises Globally

Market Drivers: Increase in Demand for Secure Servers due to Cyber-Attacks and System Downtime; Growing Demand for Quick Access to Real-Time Data and to Deploy Large Projects

Market Trend: Increasing Usage of Virtualization Infrastructure and Adoption of Cloud-based Services among Business Enterprises

What can be explored with the Cloud Server Market Study?
Gain Market Understanding
Identify Growth Opportunities
Analyze and Measure the Global Cloud Server Market by Identifying Investment across various Industry Verticals
Understand the Trends that will drive Future Changes in Cloud Server
Understand the Competitive Scenarios: Track the Right Markets, Identify the Right Verticals

Region Included are: North America, Europe, Asia Pacific, Oceania, South America, Middle East & Africa

Country Level Break-Up: United States, Canada, Mexico, Brazil, Argentina, Colombia, Chile, South Africa, Nigeria, Tunisia, Morocco, Germany, United Kingdom (UK), the Netherlands, Spain, Italy, Belgium, Austria, Turkey, Russia, France, Poland, Israel, United Arab Emirates, Qatar, Saudi Arabia, China, Japan, Taiwan, South Korea, Singapore, India, Australia and New Zealand etc.

Get Up to 30% Discount on This Premium Report @ https://www.advancemarketanalytics.com/request-discount/33229-global-cloud-server-market?utm_source=OpenPR&utm_medium=Vinay

Strategic Points Covered in the Table of Contents of the Global Cloud Server Market:
Chapter 1: Introduction, market driving force, product objective of study, and research scope of the Cloud Server market.
Chapter 2: Exclusive summary: the basic information of the Cloud Server Market.
Chapter 3: Market dynamics: drivers, trends, and challenges & opportunities of the Cloud Server.
Chapter 4: Cloud Server Market factor analysis: Porter's Five Forces, supply/value chain, PESTEL analysis, market entropy, patent/trademark analysis.
Chapter 5: Market by type, end user, and region/country, 2016-2022.
Chapter 6: Evaluation of the leading manufacturers of the Cloud Server market: competitive landscape, peer group analysis, BCG matrix, and company profiles.
Chapter 7: Market evaluation by segments, countries, and manufacturers/companies, with revenue share and sales by key countries in these various regions (2024-2030).
Chapters 8 & 9: Appendix, methodology, and data source.

Finally, the Cloud Server Market report is a valuable source of guidance for individuals and companies.

Read Detailed Index of full Research Study at @ https://www.advancemarketanalytics.com/buy-now?format=1&report=33229?utm_source=OpenPR&utm_medium=Vinay

Thanks for reading this article; you can also get individual chapter-wise sections or region-wise report versions, such as North America, Middle East, Africa, Europe, LATAM, or Southeast Asia.

Contact Us:
Craig Francis (PR & Marketing Manager)
AMA Research & Media LLP
Unit No. 429, Parsonage Road, Edison, NJ 08837, USA
Phone: +1(201) 7937323, +1(201) 7937193
Email: sales@advancemarketanalytics.com

About Author: Advance Market Analytics is a global leader in the market research industry, providing quantified B2B research to Fortune 500 companies on high-growth emerging opportunities that will impact more than 80% of worldwide companies' revenues. Our analysts track high-growth studies with detailed statistical and in-depth analysis of market trends and dynamics that provide a complete overview of the industry. We follow an extensive research methodology coupled with critical insights into industry factors and market forces to generate the best value for our clients. We provide reliable primary and secondary data sources, and our analysts and consultants derive informative and usable data suited to our clients' business needs. The research study enables clients to meet varied market objectives, from global footprint expansion to supply chain optimization, and from competitor profiling to M&As.

This release was published on openPR.

Link:
Cloud Server Market Analysis, Size, Growth, Competitive Strategies, and Worldwide Demand - openPR


Apple is reportedly making M5 chips for both Mac and cloud AI use – Pocket-lint


Apple is switching to a dual-purpose focus with its upcoming M5 processor, aiming to power both Macs and the servers running Apple Intelligence, according to the Chinese-language Economic Daily, by way of DigiTimes. The chip is reportedly in small-scale trial production, using a more advanced SoIC (System on Integrated Chip) packaging technology created in tandem with long-time Apple partner TSMC. Beyond just exploiting 3D architecture, like existing SoICs, the M5 allegedly incorporates thermoplastic carbon fiber composite molding technology.


The significance of that last point is uncertain, but to date, most of Apple's M-series chips have been geared towards Macs. With Apple Intelligence slated to launch alongside iOS 18, iPadOS 18, and macOS Sequoia this fall, the company may need future servers to be maximally efficient to cope with millions of generative AI prompts every day. Its current AI servers are believed to be equipped with the M2 Ultra, also found in the Mac Pro and higher-end versions of the Mac Studio.

Apple and TSMC's goal for the M5 is to have it enter mass production in 2025 and 2026, DigiTimes says. If so, the first M5 Macs will likely ship towards the end of 2025 rather than earlier, since Apple has yet to put the M4 in anything but 11- and 13-inch iPad Pros. The first M4 Macs are expected to arrive by the end of 2024, and could be announced as soon as Apple's September iPhone event.

The M series is optimized for Apple's various software platforms. They're more efficient at associated tasks than an AMD or Intel chip would be, which can mean speed boosts in some cases, and less memory usage in others. A knock-on benefit may be reduced power consumption, which is extremely important in the context of cloud servers. Apple datacenters are estimated to have consumed 2.344 billion kWh of electricity in 2023 alone, which is not only expensive but an obstacle to Apple's environmental sustainability goals. The company is going to have to ramp up its renewable power projects to support Apple Intelligence, and may be hoping that the M5 will take some of the edge off.

Over the decades, Apple has gradually brought more and more chip design in-house, even if it's reliant on firms like TSMC to actually manufacture parts. Some other examples include the A-series processors used in iPhones and iPads, and its W-, U-, and H-series wireless chips.

Read the original here:
Apple is reportedly making M5 chips for both Mac and cloud AI use - Pocket-lint
