
Revolutionary Mellanox ConnectX-6 Dx SmartNICs and BlueField-2 I/O Processing Units Transform Cloud and Data Center Security – Yahoo Finance

Mellanox Announces General Availability of ConnectX-6 Dx SmartNICs, Featuring Pioneering Security Accelerators for Cloud Platforms

RSA Conference 2020 - Mellanox Technologies, Ltd. (NASDAQ: MLNX), a leading supplier of high-performance, end-to-end smart interconnect solutions for data center servers and storage systems, today announced the immediate general availability of ConnectX-6 Dx SmartNICs, in addition to the soon-to-be-released BlueField-2 I/O Processing Units (IPUs). Both feature a suite of cutting-edge security acceleration engines and platform security capabilities for building highly secure and efficient data center infrastructures at massive scale, across public, on-premises and edge environments.

Today's rapidly evolving cyber threats are driving organizations to continuously assess their security postures and adopt holistic defense strategies that address physical, virtual and human factors. The growth of cloud and edge computing has redefined the physical and virtual boundaries of the data center. Continuous advancements in cloud networking technologies and growing data volumes pose challenges to enterprises and cloud operators to deliver enhanced digital experiences, while also protecting data and assets.

"Networking and security must converge to achieve consistent and predictable application performance, with all the necessary levels of data privacy, integrity and reliability. This vision is the core foundation on which we designed our ConnectX-6 Dx SmartNIC and BlueField-2 IPU products," said Amit Krig, senior vice president, Ethernet NIC and IPU Product Line at Mellanox Technologies. "Today we are excited to ship production qualified ConnectX-6 Dx SmartNICs to our hyperscale customers, turning our vision into a reality."

"As countless economic opportunities are unlocked by software-defined cloud technologies, they bring along security requirements. Pure software-defined cyber security offerings that are decoupled from the underlying hardware will be challenged to achieve adequate protection, scale, and efficiency," said Vikram Phatak, Founder of NSS Labs. "By deploying purpose-built networking hardware like ConnectX-6 Dx SmartNICs and BlueField-2 IPUs with cloud-native software, cloud adopters and innovators can benefit from hardware-accelerated, fine-grained security that improves efficiency, increases agility and lowers costs."

ConnectX-6 Dx and BlueField-2 play a key role in distributed, zero-trust security architectures that extend traditional perimeter security to every endpoint. Integrating cutting-edge IPsec and TLS cryptographic acceleration technologies with leading open-source, upstream software solutions, Mellanox allows customers and partners to take advantage of innovative hardware acceleration capabilities in both new and existing data center environments. Both virtualized and bare metal servers benefit from secure web application delivery, east-west communication encryption, RoCE transport communication encryption and data-at-rest storage encryption, at data rates of up to 200Gb/s, and enhanced CPU efficiencies.

The emergence of cloud workload protection solutions calls for scalable and stateful packet filtering capabilities that both preserve application performance and enforce resilient security policies. Mellanox ConnectX SmartNICs and BlueField IPUs provide in-hardware security policy enforcement and connection tracking at full wire speed with up to 100X performance gains compared to non-accelerated solutions - making them ideal to boost next-generation firewalls in bare-metal, virtualized and containerized cloud environments.

As an IPU, BlueField-2 provides even more in-hardware security capabilities, including agentless micro-segmentation, advanced malware detection, deep packet inspection and application recognition, that far outperform software-only solutions. Mellanox BlueField IPUs enable the best of both worlds: the speed and flexibility of software-defined solutions, with tighter security, accelerated performance and improved efficiency by processing data in the device hardware at the I/O path.

See the Mellanox ConnectX-6 Dx and BlueField-2 IPU in the Mellanox booth #4525 at the RSA Conference, February 24-27, Moscone Center, San Francisco.


Supporting Quotes:

"The 2nd Gen AMD EPYC processors are the first x86 data center processor that supports PCIe 4.0 and provides industry leading performance for workloads such as virtualization, database applications, and high-performance computing," said Raghu Nambiar, corporate vice president, Data center Ecosystems & Application Engineering, AMD. "Our partnership with Mellanox, the leader in high-performance network solutions, is a natural fit for our open ecosystem strategy and we look forward to seeing our joint customers deploy Mellanox ConnectX-6 Dx SmartNICs and BlueField-2 IPUs on 2nd Gen AMD EPYC based systems."

"Arm Neoverse platforms are designed for fast, compute-intensive tasks, and support the broadest set of applications," said Mohamed Awad, vice president of marketing, Infrastructure Line of Business at Arm. "Integrating high-density Arm-based processors with Mellanox BlueField-2 IPUs delivers an optimal balance of performance and efficiency, capable of enabling advanced functionality across cloud, AI, and edge applications."

"Our hyperscale cloud platform serves over 200 million daily active users of our artificial intelligence-based mobile news feed and searching app. Data security is essential across every digital product and service we offer, and even more so as we plan to move beyond mobile, into smart homes and automobiles," said Ning Liu, director of system department at Baidu. "We have piloted the first generation of Mellanoxs BlueField IPU and look forward to deploying BlueField-2 for our bare-metal instances in production."

"BlueField-2 SmartNICs with Ubuntu preinstalled break new grounds in data center security at scale and unlock exciting fields for bare-metal provisioning, network offloading and innovative data center architectures," said Loic Minier, global director, Field Engineering at Canonical. "By leveraging snaps, Bluefield 2 provides the best platform to deliver applications at the edge and manage data center security."

"Guardicore protects organization's most critical IT assets, wherever they run. We do so by providing a simple way to micro-segmentation data centers and cloud environments," says Sharon Besser, VP Business Development at Guardicore. "Our integration with Mellanoxs BlueField IPUs is enabling us to provide hardware accelerated segmentation in very high-speed and low latency environments. We look forward to expanding the integration and to derive the benefits of the enhanced security capabilities of ConnectX-6 Dx SmartNICs and BlueField-2 IPUs."

"Our collaboration with Mellanox has helped bring additional capabilities to IBM and our customers," said Matthew Drahzal, IBM offering executive. "We look forward to exploring ways for this new technology to accelerate our modern AI and HPC offerings in this rapidly evolving space."

"Enterprise security is paramount as AI moves to the edge," said Justin Boitano, general manager of Enterprise and Edge Computing at NVIDIA. "Mellanox's new products offer high-speed crypto engines that enable highly secure and reliable communications between the cloud and AI applications running on millions of NVIDIA-powered edge devices."

"Enhancing system security is a key component of Red Hats open technology approach across our Linux, hybrid cloud, container and Kubernetes offerings, and we do so by following a secure by default approach across our technology portfolio," said Tom Nadeau, Technical Director, NFV at Red Hat. "Our collaboration with Mellanox in this area and others, including vDPA and virtio, has helped to amplify the adoption of software-defined networking across the open hybrid cloud, and were pleased to extend our work with them in the open source communities in driving efforts around hardware-accelerated security."

"UClouds differentiated cloud products and services deliver the highest levels of stability and reliability for various industry and universal use-cases," said Leo Xu, director of network group at UCloud. "The successful deployment of ConnectX SmartNICs in production have significantly increased our network capacity and reduced latency. We plan to deploy the next generation BlueField IPUs for streamlining bare-metal server provisioning and operations."

Supporting Resources:

About Mellanox

Mellanox Technologies (NASDAQ: MLNX) is a leading supplier of end-to-end Ethernet and InfiniBand smart interconnect solutions and services for servers and storage. Mellanox interconnect solutions increase data center efficiency by providing the highest throughput and lowest latency, delivering data faster to applications, unlocking system performance and improving data security. Mellanox offers a choice of fast interconnect products: adapters, switches, software and silicon that accelerate application performance and maximize business results for a wide range of markets including cloud and hyperscale, high performance computing, artificial intelligence, enterprise data centers, cyber security, storage, financial services and more. More information is available at: http://www.mellanox.com/.

Note: Mellanox, ConnectX, and BlueField are registered trademarks of Mellanox Technologies, Ltd. All other trademarks are property of their respective owners.

View source version on businesswire.com: https://www.businesswire.com/news/home/20200224005186/en/

Contacts

Mellanox Technologies, Ltd.

Press/Media Contact: Greg Cross, Zonic Public Relations, +1 (925) 413-5327, gcross@zonicgroup.com

Israel PR Contact: Jonathan Wolf, JWPR Public Relations and Communications, +972-54-22-094-22, yoni@galaipr.com

Read more here:
Revolutionary Mellanox ConnectX-6 Dx SmartNICs and BlueField-2 I/O Processing Units Transform Cloud and Data Center Security - Yahoo Finance


3 ways AI is transforming the insurance industry – The Next Web

As AI, big data, and the internet of things (IoT) find their way into every aspect of our lives, many industries are undergoing a transformation. Insurance executives believe that artificial intelligence (AI) will significantly transform their industry in the next three years, with insurers investing in AI to empower agents, brokers and employees to enhance the customer experience with automated personalized services, faster claims handling and individual risk-based underwriting processes, consulting firm Accenture forecasted in 2017.

Three years later, AI algorithms have made great inroads in different sectors of the insurance industry and are lowering costs while driving efficiency and improving the customer experience. Slowly but surely, the industry is transforming. There are some kinks that need to be ironed out, but for the most part, the changes have been for the better, and there's more down the road.

Here are three areas worth watching.

Thanks to advances in edge hardware, cloud technology, and the internet of things, more and more information about objects, people, and organizations is being digitized. Telematics, wearables, and smart home sensors are just some of the technologies that are enabling us to collect detailed information about the physical world. And ubiquitous connectivity enables us to aggregate that data in cloud servers for further processing by machine learning algorithms.

In its 2018 Emerging Risk Report, Lloyd's outlined some of the benefits that the growth of IoT will bring to the insurance industry, including better risk understanding, avoiding preventable losses, capturing patterns and behaviors and enabling proactive monitoring. Today, many insurers are embracing these trends to improve the speed and efficiency of their services. For one thing, having more data enables insurers to provide personalized and tailored premiums to individual customers.

One example is U.S.-based Layr, a cloud-based commercial insurance platform for small businesses, which came out of Lloyd's insurance accelerator program, Lloyd's Lab, and received funding in May to develop its AI-based solution. Layr uses machine learning to peruse customer data and compare applicants to clusters of similar businesses. This enables the company's prediction engine to automatically match clients with the right policies.

Being able to collect rich, real-time data from the physical world through IoT sensors is also leaving its mark in the insurance industry. An example is Parsyl, an IoT startup that helps shippers, retailers, and insurers understand the quality conditions of sensitive and perishable products as they move through the supply chain.

Parsyl, another Lloyd's Lab graduate, is incorporating its sensor technology in quality assurance and risk management solutions for clients that handle products requiring specialist transport and storage. Such products include temperature-controlled foods, biological pharmaceuticals, and sensitive life science and high-tech products. Installing the sensors provides insurers with accurate data and insights about the cargo while giving customers the benefit of expedited settlement, reduced claims costs, and machine learning-based risk mitigation.

One of the interesting trends in the field is the partnership between insurance and insurtech companies and specialized tech companies to enhance risk prediction and management. One example is the collaboration between weather forecasting company Climacell and Munich Re Syndicate, one of the leading marine and specialty underwriters at Lloyd's. Climacell, which was part of Lloyd's Lab's third cohort, uses sensor data from various devices and machine learning algorithms to make precise weather forecasts. Munich Re Syndicate will use Climacell's technology to help its clients better understand how the weather will affect their business and make informed decisions that enhance operational efficiency, safety and profitability.

In the car insurance sector, insurers use telematics to collect real-time driving data from vehicles. As opposed to the past, where they had to rely on basic information about the vehicle and driver to craft their insurance policies, they can now analyze telematics data with machine learning algorithms to create personalized risk profiles for drivers. Many insurers use this data to give discounts to drivers who have safe driving habits and penalize dangerous behavior such as speeding, hard braking, harsh acceleration, and hard cornering. The same data can help reconstruct accident scenes and enable insurers to better understand and assess what happened, which results in much faster claims processing.
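To make the scoring idea concrete, here is a minimal, hypothetical sketch of how per-trip telematics events might be turned into a risk score and a premium adjustment. The event weights, the normalization by miles driven, and the discount thresholds are illustrative assumptions, not any insurer's actual model.

```python
# Hypothetical telematics risk scoring sketch. The event weights and the
# discount thresholds are illustrative assumptions, not an insurer's model.

TRIP_EVENT_WEIGHTS = {
    "speeding": 3.0,           # minutes above the posted limit
    "hard_braking": 2.0,       # count of hard-braking events
    "harsh_acceleration": 1.5,
    "hard_cornering": 1.0,
}

def trip_risk_score(events: dict, miles: float) -> float:
    """Weight each risky event and normalize by distance driven."""
    raw = sum(TRIP_EVENT_WEIGHTS.get(name, 0.0) * count
              for name, count in events.items())
    return raw / max(miles, 1.0)

def premium_adjustment(avg_score: float) -> float:
    """Map an average risk score to a premium multiplier (illustrative)."""
    if avg_score < 0.05:
        return 0.85   # safe-driving discount
    if avg_score < 0.20:
        return 1.00   # neutral
    return 1.25       # surcharge for risky driving

trips = [
    ({"speeding": 2, "hard_braking": 1}, 30.0),
    ({"hard_cornering": 3}, 12.5),
]
avg = sum(trip_risk_score(e, m) for e, m in trips) / len(trips)
print(f"average risk score: {avg:.3f}, premium multiplier: {premium_adjustment(avg):.2f}")
```

A production system would of course learn the weights from claims outcomes rather than hard-code them, but the overall flow, per-trip event aggregation feeding a pricing rule, is the same.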

In the health insurance sector, service providers use machine learning to help patients choose the best health insurance coverage options to fit their needs. Data collected from wearables such as fitness trackers and heart rate monitors helps insurers monitor, track, and reward healthy habits such as regular exercise, and encourage preventive care by providing healthy nutrition tips.

An example is insurtech startup Collective Health, which uses machine learning to identify risk and match its members with the right resources for their healthcare. The company's AI model brings together claims data, prior authorizations, eligibility data, engagement data, and healthcare utilization data to develop a holistic profile of each member and their needs. The rich, AI-powered profile helps identify members' health needs, such as those who need help from a pharmacist with confusing medications or need assistance from a nurse to help arrange home healthcare services.

"AI reduces the task of manually reviewing thousands of medical claims, and instead focuses our staff on performing warm, human outreach, and thinking through complex problems together with our members," says Dr. Sanjay Basu, Collective Health's Director of Research and Analytics.

One of the big advantages of the availability of data and machine learning algorithms is fraud prevention. Machine learning algorithms trained on the huge amount of data available on customers can glean patterns that separate legitimate claims from fraudulent ones. Today, most insurers use machine learning to detect and prevent fraud.
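As a rough illustration of that pattern-finding step (not any insurer's production system), an off-the-shelf anomaly detector can flag claims whose features look unlike the historical population. The features and numbers below are invented for the example.

```python
# Illustrative fraud-flagging sketch: an anomaly detector over claim features.
# Feature choices, values, and thresholds are assumptions, not an insurer's model.
from sklearn.ensemble import IsolationForest
import numpy as np

# Each row: [claim_amount, days_since_policy_start, prior_claims_count]
historical_claims = np.array([
    [1_200, 400, 0],
    [3_400, 820, 1],
    [  950, 150, 0],
    [2_100, 610, 2],
    [  800, 300, 0],
])

detector = IsolationForest(contamination=0.1, random_state=0).fit(historical_claims)

new_claims = np.array([
    [ 2_000, 500, 1],   # looks ordinary
    [48_000,   7, 0],   # very large claim filed days after the policy started
])
flags = detector.predict(new_claims)   # +1 = looks legitimate, -1 = review for fraud
print(flags)
```

In practice insurers combine many more features (provider history, location, text from the claim itself) and feed the flags to human investigators rather than rejecting claims automatically.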

And with IoT and sensor technology continuing to expand at an accelerating pace, and with the propagation of 5G networks, we will continue to have even bigger and more accurate datasets to better understand insurance risks.

"I think that is one of the most exciting developments to come in the next decade: that we will have this incredible depth of knowledge about the world around us," says Trevor Maynard, Head of Innovation at Lloyd's.

Computers have historically struggled to deal with data that is not neatly arranged in tables with rows and columns. Unfortunately, most of our data is unstructured and lies in the documents, chat logs, emails, and other textual data we generate in our day-to-day interactions. Natural language processing, the science of helping computers understand and draw value from unstructured text, is a hot area of research and has seen tremendous progress in recent years.

The insurance sector, which is laden with textual data, has benefitted immensely from advances in NLP. Insurers have been able to leverage language models to reduce the time it takes to respond to customer queries and find relevant information from the tons of documents they must review in claims settlement.

An example is Lloyd's International Trading Advice (LITA), a consultancy within Lloyd's that gives insurance companies regulatory information about the countries in which they operate. LITA covers more than 200 geographies, and the regulatory rules of each area are recorded in a large number of unstructured documents.

Previously, LITA's experts had to manually go through these documents to answer questions about regulations and compliance, which usually took several days per query. To optimize the process, the LITA team used the large amount of data they had gathered through their interactions with customers to train a question-answering AI model. They were able to develop a system that automated a large part of the consultation process and improved the service-level agreement from five days to less than an hour.
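The article does not describe LITA's actual stack, but the general pattern, extractive question answering over an unstructured document, can be sketched with an off-the-shelf library. The snippet below assumes the Hugging Face transformers package and its default pretrained QA model; the regulatory text and the question are invented for illustration.

```python
# Illustrative sketch only: an extractive QA model answering a regulatory
# question from an unstructured document. This is not LITA's actual system.
from transformers import pipeline

# A generic pretrained extractive question-answering model (the pipeline's
# default checkpoint is enough for a demonstration).
qa = pipeline("question-answering")

regulatory_text = (
    "Insurers writing non-life business in this territory must hold a local "
    "licence and appoint a resident claims representative. Reinsurance may be "
    "placed abroad without prior approval."
)

result = qa(
    question="Do insurers need a local licence to write non-life business?",
    context=regulatory_text,
)
print(result["answer"], f"(confidence: {result['score']:.2f})")
```

A system like LITA's would also need document retrieval to find the right passages across 200 geographies before a model like this can extract an answer, which is where most of the engineering effort tends to go.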

"The AI augments the role of the employees," says Craig Civil, Head of Data Innovation, R&D and Analytics at Lloyd's. "It makes their job far more satisfying because we automate 80% of the work, and the 20% are truly interesting one-off questions that you do need an experienced team that can do the research and answer."

Advances in NLP have also ushered in a breed of customer service chatbots in different sectors, including the insurance industry. Established insurance firms such as Geico as well as insurtech startups such as Lemonade are using AI-powered chatbots to settle claims. These chatbots handle the low-level customer queries and free agents to handle more complicated tasks.

AI researchers continue to develop larger and more sophisticated models that can tackle more complicated language-related tasks. In the past year, we've seen the release of state-of-the-art language models such as OpenAI's GPT-2 and Google's Meena. While we're still pretty far from developing AI that can truly understand human language, practical uses will emerge from continued advances in natural language processing. AI will do the legwork, gathering important data and highlighting trends in text data, making it easier and less costly for insurers to piece that information together and address their clients' needs.

Computer vision is the science of enabling machines to extract meaning and context from visual data. In the past few years, computer vision has been advancing by leaps and bounds thanks to convolutional neural networks, AI models that can perform image recognition and classification tasks with stunning accuracy.

Insurers now use image recognition algorithms to automate many of the tasks that previously required human labor. Insurance firm Liberty Mutual uses AI to provide fast assessment of vehicle damage and faster claims settlement. Users take a picture of the damaged car with their smartphone and submit it to the AI Auto Damage Estimator, which uses a machine learning algorithm trained on thousands of car accident photos to assess damage and costs. The process doesn't take more than a few seconds.
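Liberty Mutual has not published its model, but the underlying pattern, a convolutional network whose classification head is trained on labeled accident photos, can be sketched with a standard vision library. The damage classes and the ResNet backbone below are assumptions made purely for illustration.

```python
# Illustrative sketch of photo-based damage triage with a CNN.
# Not Liberty Mutual's actual estimator; labels and backbone are assumptions.
import torch
from torch import nn
from torchvision import models, transforms
from PIL import Image

DAMAGE_CLASSES = ["minor", "moderate", "severe"]  # hypothetical labels

# A ResNet-18 with a 3-class head; in practice this would be initialized from
# a pretrained backbone and fine-tuned on labeled accident photos.
model = models.resnet18(num_classes=len(DAMAGE_CLASSES))
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def estimate_damage(photo_path: str) -> str:
    """Classify a claim photo into one of the hypothetical damage classes."""
    image = Image.open(photo_path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)          # shape: [1, 3, 224, 224]
    with torch.no_grad():
        logits = model(batch)
    return DAMAGE_CLASSES[int(logits.argmax(dim=1))]

# Example usage: print(estimate_damage("claim_photo.jpg"))
```

The "few seconds" figure quoted above is plausible because, once trained, a single forward pass through a network like this runs in well under a second; the rest of the time goes to uploading the photo and mapping the predicted class to a repair cost.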

Computer vision is also enabling insurers to perform tasks that were previously impossible. An example is State Farms Drive Safe & Save platform, which uses AI to analyze in-car camera feeds and detect and provide feedback on unsafe behavior such as distracted driving and texting.

One of the interesting trends to watch in computer vision is advances in edge computing and edge AI. Computer vision tasks previously required applications to send their data to cloud servers, where AI algorithms leveraged massive compute resources to process and analyze the data. But in recent years, specialized hardware and more efficient machine learning algorithms are gradually enabling on-device AI inference. The improved speed and efficiency are paving the way for real-time analysis of visual data and risk assessment.

Despite its many exciting applications, AI-based insurance is still in its early stages and the best is yet to come. As technology continues to permeate our lives, AI algorithms will be able to provide faster and more accurate solutions, making the insurance industry much more pleasing and less frustrating for both clients and agents.

Published February 24, 2020 13:40 UTC

See the rest here:
3 ways AI is transforming the insurance industry - The Next Web


BeyondTrust Expands Cloud Leadership with Privilege Management Delivered As-A-Service – GlobeNewswire

ATLANTA, Feb. 24, 2020 (GLOBE NEWSWIRE) -- BeyondTrust, the worldwide technology leader in Privileged Access Management (PAM), today announced the new Privilege Management SaaS, supporting Windows desktops/servers and MacOS. By delivering its leading Endpoint Privilege Management solution via a SaaS management platform, BeyondTrust is making it even easier for customers to eliminate unnecessary privileges and stop malicious attacks by enforcing least privilege on Windows and Mac systems.

With full management capabilities in the cloud, BeyondTrust gives customers the best the cloud has to offer: high availability, security, access, and scalability, while removing the overhead of managing infrastructure. For organizations looking to reduce privileged access risks without adding administrative and financial burdens, BeyondTrust's SaaS solutions feature rapid deployment and make managing privileged access easier and more cost-effective. Additionally, the SaaS offering is available in a subscription model, allowing customers to pay only for what they need and expand with their business.

"Security and IT teams are struggling to manage aging software and hardware, while pressured by the business to roll out new solutions and versions with little to no disruption," said Daniel DeRosa, Senior Vice President and Chief Product Officer at BeyondTrust. "Our new Privilege Management SaaS offering ensures that no valuable time is lost and employee productivity is maximized. With our PAM SaaS solutions, businesses get the best of both worlds, removing the burden of managing their infrastructure while enjoying the feature-richness of the cloud."

According to the 2019 BeyondTrust Microsoft Vulnerabilities Report, of the 189 Critical Microsoft vulnerabilities reported, 81% could be mitigated by removing local admin rights from users. Privilege Management SaaS enables organizations to eliminate admin rights quickly and efficiently, without disrupting user productivity, unlike traditional privilege management products that can take months to properly configure.

Key features of Privilege Management SaaS:

Privilege Management is a key part of the BeyondTrust Privileged Access Management portfolio, an integrated solution that provides visibility and control over all privileged accounts and users. By uniting the broadest set of privileged security capabilities, BeyondTrust's Universal Privilege Management approach simplifies deployments, reduces costs, improves usability, and reduces privilege risks.

Privilege Management SaaS will be available in Q2 2020.

About BeyondTrust

BeyondTrust is the worldwide leader in Privileged Access Management (PAM), empowering organizations to secure and manage their entire universe of privileges. Our integrated products and platform offer the industry's most advanced PAM solution, enabling organizations to quickly shrink their attack surface across traditional, cloud and hybrid environments.

The BeyondTrust Universal Privilege Management approach secures and protects privileges across passwords, endpoints, and access, giving organizations the visibility and control they need to reduce risk, achieve compliance, and boost operational performance. We are trusted by 20,000 customers, including 70 percent of the Fortune 500, and a global partner network. Learn more at http://www.beyondtrust.com.

Follow BeyondTrust:

Twitter: http://twitter.com/beyondtrust | Blog: http://www.beyondtrust.com/blog | LinkedIn: http://www.linkedin.com/company/beyondtrust | Facebook: http://www.facebook.com/beyondtrust

For BeyondTrust:

Mike Bradshaw, Connect Marketing for BeyondTrust, P: (801) 373-7888, E: mikeb@connectmarketing.com

More here:
BeyondTrust Expands Cloud Leadership with Privilege Management Delivered As-A-Service - GlobeNewswire


Nokia introduces cloud-native Assurance and Experience software to help CSPs move toward experience-driven and automated 5G network operations -…

Press Release

Nokia introduces cloud-native Assurance and Experience software to help CSPs move toward experience-driven and automated 5G network operations

24 February 2020

Espoo, Finland - Nokia today introduced two new cloud-native software applications that help communication service providers (CSPs) operate their networks more efficiently and effectively, and drive new revenue opportunities.

Faced with growing competition and the roll-out of complex, virtualized 5G networks that can generate up to a 100-fold increase in network actions, CSPs require automated, service-centric operations that prioritize the customer experience.

The Nokia Assurance Center effectively blends the traditionally separate fault and performance management processes to drive intelligent root cause analysis, and triggers prioritized and automated resolution.

In addition to the growth in network complexity, operators will increasingly be expected to provide commercial terms tied to Service Level Agreements (SLAs) as they expand into vertical industries. The Nokia Experience Center incorporates the subscriber dimension, quantifies the customer experience of the services being delivered, and links these values into the SLAs.

Nokia Assurance Center and Nokia Experience Center complement each other, yet operate independently in order to support the modular approach that many CSPs take to building their networks. The Nokia Assurance Center has a focus from the service layer down to the network layer, while the Nokia Experience Center has a focus from the customer and subscriber experience perspective.

The two products are built on Nokia's Common Software Foundation (CSF), which ensures that Nokia's cloud-native products allow customers to use their choice of deployment strategies. With security built in from the beginning, CSF runs on all the leading public and private cloud platforms and servers. With such flexibility, CSF makes Nokia's products easier to deploy, integrate, operate and upgrade.

Francis Haysom, Partner and principal analyst, Appledore Research, said: "Nokia is taking a leading position in practically combining AI with what Appledore terms Rapid Automated Service Assurance, enabling the automation of network operations. Nokia Experience Center and Assurance Center will support the move to a network driven by customer intent (experience and SLAs) rather than a specific network technology."

Brian McCann, Chief Product Officer, Nokia Software, said: "Nokia Assurance Center and Nokia Experience Center reflect the long Nokia pedigree of being an industry leader in promoting service operations. These products bring a new level of service automation and data- and customer-centricity for operators, making many of their servicing capabilities faster, smarter, more cost-effective and more relevant in order to better serve their customers."

Additional Resources: Nokia Assurance Center webpage | Nokia Experience Center webpage | 5G Operations webpage

About Nokia

We create the technology to connect the world. We develop and deliver the industry's only end-to-end portfolio of network equipment, software, services and licensing that is available globally. Our customers include communications service providers whose combined networks support 6.1 billion subscriptions, as well as enterprises in the private and public sector that use our network portfolio to increase productivity and enrich lives.

Through our research teams, including the world-renowned Nokia Bell Labs, we are leading the world to adopt end-to-end 5G networks that are faster, more secure and capable of revolutionizing lives, economies and societies. Nokia adheres to the highest ethical business standards as we create technology with social purpose, quality and integrity. http://www.nokia.com

Media Inquiries: Communications, Phone: +358 (0) 10 448 4900, E-mail: press.services@nokia.com

See the article here:
Nokia introduces cloud-native Assurance and Experience software to help CSPs move toward experience-driven and automated 5G network operations -...


Global Virtual Private Server Market (2019 to 2026) – CAGR of 16.2% Expected During the Forecast Period – ResearchAndMarkets.com – Business Wire

DUBLIN--(BUSINESS WIRE)--The "Virtual Private Server Market by Type, Operating System, Organization Size, and Industry Vertical: Global Opportunity Analysis and Industry Forecast, 2019-2026" report has been added to ResearchAndMarkets.com's offering.

The global virtual private server market was valued at $2.6 billion in 2018, and is projected to reach $8.3 billion by 2026, growing at a CAGR of 16.2% from 2019 to 2026.

The rise in cyber threats and cyber-attacks on data centers across the globe has boosted the demand for VPS, which acts as a key driver of the global virtual private server market. This is attributed to the capability of VPS servers to provide sandbox security features. In addition, the increase in adoption of cloud computing has proliferated the deployment of VPS servers at a significant rate, owing to virtualization features that deliver security and performance comparable to those of dedicated servers.

This factor is expected to augment the growth of the global market during the forecast period. On the contrary, limitations on the availability of physical resources and bandwidth are a major restraining factor, which is anticipated to hamper market growth to a certain extent.

The global virtual private server market is segmented into type, operating system, organization size, industry vertical, and region. On the basis of type, the market is bifurcated into managed VPS and unmanaged VPS. By operating system, it is divided into Windows and Linux. As per organization size, it is classified into large enterprises and small & medium enterprises. Depending on industry vertical, it is segregated into IT & telecommunication, retail, BFSI, manufacturing, healthcare, and others. Region wise, it is analyzed across North America, Europe, Asia-Pacific, and LAMEA.

The report includes the profiles of key players operating in the market. These include Amazon Web Services, Inc., DreamHost, LLC, Endurance International Group, GoDaddy Operating Company, LLC, IBM, InMotion Hosting, Liquid Web, OVH, Rackspace US, Inc., and United Internet AG.

Key Benefits

Key Topics Covered:

Chapter 1: Introduction

1.1. Report Description

1.2. Key Benefits For Stakeholders

1.3. Key Market Segments

1.4. Research Methodology

1.4.1. Secondary Research

1.4.2. Primary Research

1.4.3. Analyst Tools & Models

Chapter 2: Executive Summary

2.1. Key Findings

2.1.1. Top Impacting Factors

2.1.2. Top Investment Pockets

2.2. CXO Perspective

Chapter 3: Market Overview

3.1. Market Definition And Scope

3.2. Key Forces Shaping Global Virtual Private Server Market

3.2.1. Moderate-To-High Bargaining Power of Suppliers

3.2.2. Moderate-To-High Bargaining Power of Buyer

3.2.3. Moderate-To-High Threat of Substitutes

3.2.4. Moderate-To-High Threat of New Entrants

3.2.5. Moderate-To-High Competitive Rivalry

3.3. Value Chain Analysis

3.4. Case Studies

3.4.1. Snapcomms

3.4.2. The Hackett Group

3.5. Impact of Government Regulations On Global Virtual Private Server Market

3.6. Market Dynamics

3.6.1. Drivers

3.6.1.1. Growing In Security Concerns Among Enterprises

3.6.1.2. Enhanced Customization, Scalability, And Downtime

3.6.1.3. Increase In Adoption of Cloud-Based Services Among Enterprises

3.6.1.4. Reduction In Overall Hardware Requirement In The Data Center Infrastructure

3.6.2. Restraints

3.6.2.1. Limited Physical Resource And Bandwidth Availability

3.6.2.2. Limited Efficiency Compared To Dedicated Hosting

3.6.3. Opportunities

3.6.3.1. Integration of Machine Learning And AI With VPS

3.7. Market Evolution/ Industry Roadmap

3.8. Patent Analysis

3.8.1. By Region (2004-2018)

3.8.2. By Applicant

Chapter 4: Global Virtual Private Server Market, By Type

4.1. Overview

4.2. Managed VPS

4.2.1. Key Market Trends, Growth Factors And Opportunities

4.2.2. Market Size And Forecast, By Region

4.2.3. Market Analysis By Country

4.3. Unmanaged VPS

Chapter 5: Global Virtual Private Server Market, By Operating System

5.1. Overview

5.2. Windows

5.3. Linux

Chapter 6: Global Virtual Private Server Market, By Organization Size

6.1. Overview

6.2. Large Enterprises

6.3. Small & Medium Enterprises

Chapter 7: Global Virtual Private Server Market, By Industry Vertical

7.1. Overview

7.2. IT And Telecommunication

7.3. Retail

7.4. BFSI

7.5. Manufacturing

7.6. Healthcare

7.7. Others

Chapter 8: Global Virtual Private Server Market, By Region

8.1. Overview

8.1.1. Market Size And Forecast, By Region

8.2. North America

8.3. Europe

8.4. Asia-Pacific

8.5. LAMEA

Chapter 9: Competitive Landscape

9.1. Market Share Analysis, 2018

9.2. Competitive Dashboard

9.3. Key Developments

9.3.1. New Product Launches

9.3.2. Partnership

9.3.3. Product Development

9.3.4. Collaboration

9.3.5. Acquisition

9.3.6. Agreement

9.3.7. Business Expansion

For more information about this report visit https://www.researchandmarkets.com/r/rfkjev

Read the rest here:
Global Virtual Private Server Market (2019 to 2026) - CAGR of 16.2% Expected During the Forecast Period - ResearchAndMarkets.com - Business Wire


Ride The Tiger: Micron Is Positioned To Become A Powerhouse – Seeking Alpha

My calls on Micron (NASDAQ:MU) have been ill-timed. That's in part due to my lack of skill as a market timer, and, in part, due to the swinging nature of the stock and its underlying business. The catalyst for this stock has been changing expectations on future memory prices and suffice to say that it has not been a constant variable. However, I see Micron as the archetype of an overlooked growth stock.

Source: Intel 10-K

Time and again, investors have been burned by the wild swings in the company's stock price. Additionally, Micron's share price has struggled to pass the 60 USD level. Therefore, investors have shrugged off this stock on account of cyclicality and expectations of weakening memory prices.

As an aside, Intel (INTC) comes as the perfect example of a company that was constantly criticized, back in 2015, for being too dependent on the personal computer industry, and for having lost the mobile phone train, but the reality is that in the process it nearly doubled its market capitalization.

Intel Stock Price 5Y

Source: Seeking Alpha

Micron is growing at a fast clip; however, market analysts and pundits tend to dismiss it at the first opportunity. But that, per se, is not new, nor exclusive to Micron. What makes it interesting is the fact that Micron is right in the middle of the data megatrend. And, if data-trend jargon seems too holistic for you, just consider that what is really driving data are the applications of AI, machine learning, and IoT, among others, that are using data to reinvent old industries. Micron provides exactly what data needs: memory, more specifically, fast and reliable memory.

Micron's most consolidated markets in the computer and networking segments include clients among individuals, enterprises, organizations operating with high-performance graphics, and cloud servers. In the mobile segment, the company provides several memory and performance solutions for smartphones.

However, the best part lies in new solutions in development for upcoming markets. These include:

Most of these applications are still in their infancy with a huge way to go before being profitable, while others are maturing as we speak. Now, we shouldn't be too optimistic, there's lots of competition, and Micron will just grab a fraction of the total addressable markets. However, that's exciting enough.

With intense competition from the likes of Toshiba (OTCPK:TOSYY), Western Digital Corporation (WDC), Samsung (OTC:SSNNF), Seagate (STX), and Intel (INTC), among others, the company needs an edge to succeed. That edge has to come from two fronts: technology development and manufacturing efficiency. Let's focus on the latter.

The company has a worldwide manufacturing base. The production of semiconductors depends on maintaining a highly controlled production space that minimizes contamination. Manufacturing costs depend, among other things, on the speed of the transition to higher-density products and the sophistication of the manufacturing equipment and processes.

For investors, more than dense technicalities, what matters is the profitability and the sustainability of the operations. In that regard, Micron has maintained incredible gross, operating, and profit margins. All that despite periodical price drops in memory prices, as we have seen previously. These levels of profitability are only possible due to the increasing manufacturing efficiency and productivity, both offsetting the drops in price.

(Source: Author computations based on Micron 10-K)

Comparing companies based on these ratios is never straightforward, mainly because of differences in accounting practices. However, we can apply a forensic approach to the financial statements and derive some clues. To smooth cyclicality, let us use three-year averages to compute gross margins.
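For readers who want to reproduce the smoothing, here is a small sketch of the three-year averaging described above. The annual figures are placeholders for illustration, not actual 10-K values.

```python
# Sketch of the smoothing used in this article: a 3-year rolling average of
# gross margin. The figures are placeholders, not actual filings data.
import pandas as pd

annual = pd.DataFrame(
    {
        "revenue":       [12_399, 20_322, 30_391, 23_406],  # hypothetical, $M
        "cost_of_goods": [ 9_894, 11_886, 12_500, 12_704],  # hypothetical, $M
    },
    index=[2016, 2017, 2018, 2019],
)

annual["gross_margin"] = (annual["revenue"] - annual["cost_of_goods"]) / annual["revenue"]
annual["gross_margin_3yr_avg"] = annual["gross_margin"].rolling(window=3).mean()

print(annual[["gross_margin", "gross_margin_3yr_avg"]].round(3))
```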

3-year average gross margins for Micron, Intel, Western Digital and Seagate

(Source: Micron 10-K, Seeking Alpha's Financials data, Author's computations)

More than comparing apples to oranges, the interesting part is comparing the trendline for each company. Micron has come a long way, from being a small player with razor-thin margins, to having great margins. Now, let us see how that translated into the bottom line.

3-year average profit margins for Micron, Intel, Western Digital and Seagate

(Source: Micron 10-K, Seeking Alpha's Financials data, Author's computations)

Here we can see that the improvements in Micron's margins weren't just an accounting gimmick. It materialized all the way down into the bottom line. This is what it looks like when technology improvements meet manufacturing excellence. The company has grown in size and got more efficient at the same time.

(Source: Micron 10-K, Seeking Alpha's Financials data, Author's computations)

That's the hardest question, and usually, financials do not provide much insight on this regard. We can form a thesis that if the company was able to gather a pool of talented people and resources that allowed it to get to this point, then, as long as they keep investing, the company will carry the momentum forward.

Source: Micron's 10-K

Don't read too much into these figures. They are just indicative that the company is keeping its investment levels. However, if we add the fact that Micron is navigating a mega-trend towards more data storage and processing, we feel inclined to believe that the company will carry the momentum forward.

The company is in great shape, with a lean cost structure, as evidenced in its margins, and with a strong balance sheet.

Source: Micron's 10-K, Author's computations

Micron is in great shape to contribute with memory and storage products to the next wave of artificial intelligence and machine learning technology development. In addition to the current line-up of products and services, the company has the balance sheet to go after strategic acquisitions, as they did with Inotera in 2016.

All-in-all, Micron has great momentum in its operations, it has achieved an impressive scale, and it is right at the center of some of the most promising industries for the next decade: artificial intelligence and machine learning. Trading at around 12 times its 5-year average EPS (5-year average EPS: $4.73), the company seems very attractive for a buy-and-hold investor.

Disclosure: I am/we are long MU. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.

Additional disclosure: This text expresses the views of the author as of the date indicated and such views are subject to change without notice. The author has no duty or obligation to update the information contained herein. Further, wherever there is the potential for profit there is also the possibility of loss. Additionally, the present article is being made available for educational purposes only and should not be used for any other purpose. The information contained herein does not constitute and should not be construed as an offering of advisory services or an offer to sell or solicitation to buy any securities or related financial instruments in any jurisdiction. Some information and data contained herein concerning economic trends and performance is based on or derived from information provided by independent third-party sources. The author trusts that the sources from which such information has been obtained are reliable; however, it cannot guarantee the accuracy of such information and has not independently verified the accuracy or completeness of such information or the assumptions on which such information is based.

More here:
Ride The Tiger: Micron Is Positioned To Become A Powerhouse - Seeking Alpha


MWC Canceled, GIGABYTE Turns Its Exhibition Digital and Showcases Multi-access Edge Computing Infrastructure to Realize 5G Networks – Yahoo Finance

GIGABYTE was originally scheduled to exhibit its edge computing products, as well as data center solutions to seize the 5G opportunities, at the Mobile World Congress (MWC) held at the end of February. Although the 2020 event was canceled by the GSMA, GIGABYTE has turned its show into a digital format, [GIGABYTE MWC Online](https://www.gigabyte.com/MWC/2020), allowing its partners, customers, and users to still see its latest developments for the 5G era.

This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20200224005042/en/

GIGABYTE MWC Online 2020 (Photo: Business Wire)

MWC is the world's biggest exhibition for the telecommunications industry, aimed at highlighting how next-generation networks will form the basis of wide-reaching value creation and economic impact in the 5G era. GIGABYTE, being an enabler for digital-era businesses to transform their core values seamlessly with the technology trends of tomorrow, wants to take the initiative to showcase its server solutions ready to be used as MEC (Multi-access Edge Computing) platforms to support the new generation of 5G technologies and services, leveraging its innovative hardware design together with NFV (Network Functions Virtualization) and open-source software stacks to meet the QoS requirements of 5G.

On The Edge of 5G

On [GIGABYTE MWC Online], the MEC products and solutions are embedded into an interactive map to illustrate how they fit into the 5G scenarios of eMBB, URLLC, and mMTC. GIGABYTE offers one of the industry's most comprehensive server product lineups, which can help businesses from different vertical markets overcome the challenges of building the data center infrastructure necessary for success in the 5G era. GIGABYTE's G591-HS0, featuring support for up to 32 accelerator cards, can provide powerful compute capabilities to build a combined MEC / VR stitching / processing server, which allows high-resolution videos to be processed and stored locally on the baseband unit, making it ideal for enabling live streaming of 8K video or 360 VR at large spectator events. GIGABYTE also launched a 5G CrowdCell with LimeNET to overcome 5G's weaknesses of short range and signal drop-offs when hitting concrete walls, further completing the eMBB scenario of a next-generation spectator experience. GIGABYTE's H242-Z10 is a prime example of a compact GPU-accelerated edge server platform that can be deployed to connect an intelligent vehicle network in order to overcome performance limitations previously constrained by onboard vehicle processing, or the latency and transmission bottlenecks of cloud computing. Its ultra-small-footprint form factor can be mass deployed at scale at the far edge, which also makes it perfect for use as an mMTC server supporting an IoT network to enable a smart city.

Data Center Know-How for 5G Era

GIGABYTE will establish its vital place in the 5G ecosystem through MEC-optimized server platforms. Moreover, GIGABYTE has long been an exceptional server hardware and solution provider that helps businesses build the data center infrastructure that can accelerate their innovation in 5G. GIGABYTE's software-defined storage solution can consolidate existing media and effortlessly add new capacity that supports all main storage types (block, object or file) while simplifying management and reducing infrastructure costs; and GIGABYTE's DNN (Deep Neural Networks) Training Appliance is an on-premises AI development platform that combines high-performance GPU servers and a user-friendly GUI with powerful optimizations to effectively improve accuracy and reduce the time required for DNN training. Both server solutions help businesses of all sizes transition effortlessly into 5G, which promises endless possibilities and is the foundation of a new digitally driven industrial revolution featuring an era of Intelligent Connectivity: the convergence of AI, Big Data, and IoT powered by 5G.


5G and AI to Make a Breakthrough in 2020

According to Deloitte's market trend report last year, 2020 will be the year for 5G and Artificial Intelligence to take off; and MWC echoes on its website that the market for AI is projected to reach $70 billion by 2020. Intelligent Connectivity is a pivotal evolution of 5G, and Artificial Intelligence is poised to have a transformative effect on consumers, enterprises, and governments around the world.

GIGABYTE demonstrates on [GIGABYTE MWC Online] an eMBB scenario in which higher density, greater performance, and feature-rich computing can help accelerate applications that require increased bandwidth. As an NVIDIA EGX platform builder, GIGABYTE is able to showcase a secure, cloud-native edge computing platform for IoT, AI, 3D, XR and beyond, featuring an integrated hardware and software stack which can be used to quickly and easily deploy ready-to-use containers and AI frameworks from NGC (NVIDIA GPU Cloud). And GIGABYTE's AERO series laptop is the world's first laptop powered by Microsoft Azure AI. Through machine learning, Azure assists the AERO laptop in optimizing system settings and controlling CPU and GPU power consumption to effectively improve the user's working environment, ultimately defining what a laptop should deliver in the Intelligent Connectivity era of 5G.

Find Your 5G Edge

MEC is a key technology for 5G to come to its full implementation, and GIGABYTE wants to use [GIGABYTE MWC Online] as a portal to share its expert know-how and critical products and solutions, and showcase the data center knowledge that will help you "Find Your 5G Edge" and demonstrate its capability to "Upgrade Your Life".

By February 24th, the website will be fully accessible for everyone to explore edge computing and 5G. GIGABYTE has transformed its MWC exhibition into an online experience, including short videos in which marketing and product experts provide insights and share perspectives. Partners and customers who had hoped to connect with GIGABYTE to discuss 5G possibilities can do so by filling out a short online form, and visitors and users can participate in an online event to learn more about GIGABYTE's solutions and win a 1TB NVMe SSD.

About GIGABYTE

GIGABYTE is an engineer, visionary, and leader in the world of tech that uses its hardware expertise, patented innovations, and industry leadership to create, inspire and advance. Renowned for over 30 years of award-winning excellence, GIGABYTE is a cornerstone in the HPC community, providing businesses with server and data center expertise to accelerate their success. At the forefront of evolving technology, GIGABYTE is devoted to inventing smart solutions that enable digitalization from edge to cloud, and allow customers to capture, analyze, and transform digital information into economic data that can benefit humanity and "Upgrade Your Life". Please visit https://www.gigabyte.com/ for more information.

View source version on businesswire.com: https://www.businesswire.com/news/home/20200224005042/en/

Contacts

Shinyu Chen, brand@gigabyte.com. GIGABYTE MWC Online: https://www.gigabyte.com/MWC/2020

See original here:
MWC Canceled, GIGABYTE Turns Its Exhibition Digital and Showcases Multi-access Edge Computing Infrastructure to Realize 5G Networks - Yahoo Finance


TYAN Packs Lots of Performance in a 1U Package – Embedded Computing Design

Given the explosion of Internet of Things (IoT) devices combined with increased use of artificial intelligence (AI) and machine learning (ML), infinitely more content (data) is now created, stored, and processed at the Edge. Edge servers provide control of massive amounts of locally produced data and while rapid access to that data can be critical to small and medium-sized businesses, until recently, entry-level (affordable) servers did not provide the power they required.

Now, however, entry-level 1U servers with a very small footprint deliver a high level of computing power, ease of use, scaling, and low power consumption. Housed in a very small flat box, they include a core processor, storage, memory slots, ports and interfaces, and multiple connectivity options to handle on-site processing, storage, and IoT device management.

Creating storage at the Edge, however, has not been without its own challenges. Entry-level storage that combines sufficient performance, small footprint, and good support at a reasonable cost has taken time.

Edge machine learning, data centers, high-performance computing (HPC), and AI workloads involve large data sets on large distributed compute clusters. The recently announced TYAN Thunder CX GX38-B5550 1U compact server provides several advantages in this space, including a shorter depth when compared to competitive solutions. In most server chassis designs and the servers used in them, for example, 1U is equivalent to 1.75 inches high, 19 inches wide, and 17.7 inches deep. In comparison, the TYAN GX38-B5550 dimensions are 1.75 inches high, 16.9 inches wide, and 15 inches deep. The server is built around a Micro-ATX motherboard and also delivers manageability features to ease the burden on the user.

1. The TYAN GX38-B5550 is nearly 3 inches shorter than competitive solutions. Source: TYAN

Additional features of the server include support for two 3.5-inch internal Serial ATA (SATA) drive bays for edge computing, 4 Gigabit Ethernet LAN, one FH/HL PCIe Gen.3 x8 slot, and a single 250W power supply.

2. TYAN entry-level servers deliver performance based on Intel Xeon E-2200 processors, combined with the flexibility of a smaller footprint. Source: TYAN

At the heart of the GX38-B5550 server is the newer Intel Xeon E-2200 series processor, which delivers professional-grade performance, security, reliability, and affordability. This platform has the potential to deliver real-time analytics with large memory capacity, enhanced I/O capabilities, and the latest SSD storage technology. The top-bin 8-core Intel Xeon E-2200 processor offers around 25% more computing power than the top-bin 6-core Intel Xeon E-2100. Sixteen threads and up to 5-GHz Intel Turbo Boost Technology, combined with support for up to 128 GB of DDR4-2666 ECC memory, provide TYAN customers with substantial performance for large-file processing, storage, and virtualization.

While the Thunder CX GX38-B5550 targets IoT applications, a second product from TYAN, the Thunder CX GT24E-B5556, is more of an entry-level solution. It's a 1U server designed around a single-socket Intel Xeon E-2200 processor. As an edge-based server, it is well suited for cost-effective cloud-based gaming applications.

Both solutions described here offer enhanced performance in a compact package. The result is nearly twice as much computing power as was previously available in a similar form factor, and without the thermal issues that previously plagued small-footprint solutions. Lots more information is available on TYAN's entry-level servers.

The rest is here:
TYAN Packs Lots of Performance in a 1U Package - Embedded Computing Design


Graphcore, the AI chipmaker, raises another $150M at a $1.95B valuation – TechCrunch

The UK has a strong history when it comes to processors, but the global chip market has seen some ups and downs of late. Today comes some big news that underscores how investors are doubling down on one of the big hopefuls for the next generation of chipmaking to see it through any possible winter winds. Graphcore, the Bristol-based startup that designs processors specifically for artificial intelligence applications, announced that it has raised another $150 million in funding for R&D and to continue bringing on new customers. Its valuation is now $1.95 billion.

Graphcore has now raised over $450 million and says that it has some $300 million in cash reserves an important detail considering the doldrums that have plagued the chipmaking market in the last few months, and could become exacerbated now with the slowdown in production due to the coronavirus outbreak.

The funding is an extension of its Series D, it said, and brings the total valuation of the company to $1.95 billion. (For reference, the original Series D in December 2018 valued Graphcore at $1.7 billion.) This latest round includes investments fromBaillie Gifford, Mayfair Equity Partners and M&G Investments all new backers as well as participation from previous investors Merian Chrysalis, Ahren Innovation Capital, Amadeus Capital Partners and Sofina. Other past backers of the startup include BMW, Microsoft, Atomico and Demis Hassabis of DeepMind.

Graphcore's big claim to fame has been the development of what it calls its Intelligence Processing Unit (IPU) hardware and corresponding Poplar software, which are designed specifically for the kind of simultaneous, intensive calculations demanded of AI applications, helping innovators create next-generation machine intelligence solutions (which are designed based on how humans think, in parallel processing mode).

Graphcore describes its IPU as the first processor to be designed specifically for AI, although a number of other companies, including Nvidia, Intel and AMD, have made huge investments in this area and have ramped up their pace of development to meet market demands and, hopefully, overcome what have been limitations in the wider area of AI processing, a problem that persists and that all these chipmakers continue to work on.

"Deep learning has only really existed since 2012," Nigel Toon, founder and CEO, said recently to TechCrunch. "When we started Graphcore, what we heard from innovators was that hardware was holding them back."

This D2 round comes ahead of what it describes as strong demand for 2020, and is happening on the heels of a strong year for Graphcore, the company said, including a commercial deal with one of its previous strategic backers.

"2019 was a transformative year for Graphcore as we moved from development to a full commercial business with volume production products shipping," said Nigel Toon, founder and CEO. "We were pleased to publicly announce our close partnership with Microsoft in November 2019, jointly announcing IPU availability for external customers on the Azure Cloud, as well as for use by Microsoft internal AI initiatives. In addition, we announced availability of the DSS8440 IPU Server in partnership with Dell Technologies and the launch of the Cirrascale IPU-Bare Metal Cloud. We also announced some of our other early access customers, which include Citadel Securities, Carmot Capital, and Qwant, the European search engine company."

See Toon speaking at our recent Disrupt conference in Berlin about the prospect for chips here:

Read the original here:
Graphcore, the AI chipmaker, raises another $150M at a $1.95B valuation - TechCrunch


Should decision makers be concerned by the threat of quantum? – Information Age

Executives are wary about the possible threat of quantum, but how concerned should we be, and when should we start getting worried?

There is still room for quantum computing to advance.

The IT decision makers that we interviewed for our 2019 Post Quantum Crypto survey registered concern. Over half (55%) of them said that today it was a somewhat to extremely large threat, while others looked towards a darker horizon, with 71% saying that if it isn't today, it will definitely become one in the future.

When will tomorrow come, and how far away do quantum threats loom? Estimates differ as to when quantum computing will be available: some say 5 years, some say 10, and others say 25. The estimates go up and up. One thing is clear: the timer has started.

In late 2018, researchers at the University of Munich proved that quantum computers have an edge over classical computers, by developing a quantum circuit that can solve problems that were comparatively unattainable for a classical computer. 2019 was a banner year for quantum. IBM kicked it off by revealing Q System One, the world's first commercial quantum computer.

Later that year, Google announced that its current quantum project had begun to solve problems which were impossible for classical computers and reached quantum supremacy. So, the race to fulfil the potential of quantum computing has begun in earnest, and for all of the bountiful goods that it can provide the world, it can also pose considerable threats.

Quantum computing will be one of the defining technologies that will emerge over the next five to ten years, according to Chris Lloyd-Jones from Avanade. Read here

Quantum will break much of the encryption that underpins the modern internet. That's at least what the US National Institute of Standards and Technology says.

While classical computing speaks in bits, a language composed of 1s and 0s, quantum computing speaks in qubits. Like a normal bit, a qubit can be a 1 or a 0, or it can be in an indeterminate state, a superposition of both. It's that seemingly small difference which makes quantum, well, the quantum leap that it is.

This brings us to the 2048-bit RSA key, the minimum key length currently recommended to protect computer systems. Using classical computing, DigiCert has predicted that it would take several quadrillion years to defeat such a key. By comparison, the right quantum computer could break one in a matter of months.

The computer that can beat RSA or elliptic-curve cryptography the algorithms on which internet security relies has not yet been built. We are still on the first generation of quantum computers.

Quantum Computing will render much of today's encryption unsafe, says Dr. Andrew Shields, Cambridge Research Laboratory of Toshiba Research Europe, but Quantum Cryptography could be the solution. Read here

In January last year, the US National Academy of Sciences released a report entitled Quantum Computing: Progress and Prospects, and said that the computer that can do this must be five orders of magnitude larger and requires technological advances which have not yet been invented.

However, quantum computing, and thus quantum threats, have been proved possible in the last few years, and everyone is betting big on it. According to Gartner, 20% of all companies will be investing in quantum in the next five years. So, when do organisations need to start preparing?

Michele Mosca, co-founder of the Institute for Quantum Computing, devised a formula for organisations to determine when they have to start transitioning to quantum-safe algorithms:

D + T > Qc

D represents how long a piece of data needs to remain secret; T represents how long it will take for all systems to become quantum-safe, and Qc is how long before a quantum threat arrives.

If Qc turns out to be less than the sum of D and T, then an organisation is vulnerable. Establishing the values of D and T will be a more difficult task, but it sets out a useful frame of reference for quantum preparation.
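A minimal sketch of how an organisation might apply Mosca's inequality in practice is shown below; the three time horizons used are illustrative assumptions that each organisation must estimate for itself.

```python
# Mosca's inequality: if D + T > Qc, data encrypted today will still need
# protection after a cryptographically relevant quantum computer arrives.
def quantum_vulnerable(shelf_life_years: float,
                       migration_years: float,
                       years_until_quantum: float) -> bool:
    """D + T > Qc means the organisation is already exposed."""
    return shelf_life_years + migration_years > years_until_quantum

# Illustrative numbers only; D, T and Qc are organisation-specific estimates.
D = 10   # e.g. records that must stay confidential for a decade
T = 5    # e.g. migrating every system to quantum-safe algorithms takes five years
Qc = 12  # e.g. assume a quantum threat could arrive in twelve years

if quantum_vulnerable(D, T, Qc):
    print("Vulnerable: start migrating to quantum-safe cryptography now.")
else:
    print("Not yet exposed, but re-evaluate the estimates regularly.")
```

With these sample values, D + T is 15 years against a 12-year quantum horizon, so the organisation would already be exposed and should begin its migration immediately.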

Commercially available quantum computing might not be here yet, and that gives us time to prepare. Unfortunately, wide cryptographic changes often take a long time to take effect, and there are often decades between the call to update and the actual update.

There are still organisations around today who cling to long outdated cryptographic protocols. By the time quantum becomes an imminent threat, there will still be plenty of computers that are using obsolete cryptography. Whether quantum arrives in 25 years, 15 years, 5 years or tomorrow, the clock is ticking, and organisations should start preparing now.

See original here:
Should decision makers be concerned by the threat of quantum? - Information Age
