IBM speeds up Cloud Object Storage with LucidLink – Blocks and Files

IBM is to offer LucidLink Filespaces, a fast file protocol, as a turnkey bundle with IBM Cloud Object Storage (COS).

The companies say the bundle can cost less than standard prices from AWS, Azure and other S3-compatible object stores. Peter Thompson, LucidLink CEO, said in a statement: "Now, with IBM Cloud, we will be able to offer egress fees 60 per cent lower than our previous offering to a wider audience and pass those savings directly along to our customers."

Adam Kocoloski, IBM Fellow, VP and CTO, Cloud Data Services, said: "As companies adjust to a new way of working, the ability to securely share and access large amounts of data remotely while taking advantage of object storage in a cloud environment has become indispensable."

The IBM COS LucidLink bundle is designed for applications that require fast file-protocol access to massive datasets. These are stored most cost-effectively as object repositories rather than in more expensive NAS filer systems. However, object stores are slower to access than filers unless they are front-ended with LucidLink's Filespaces or equivalent software.

Filespaces runs in a set of local nodes that cache wanted data. The network connection between a local node and the object store translates between the file and S3 object protocols. All the file metadata is held in the local node, so file:folder lookups do not incur network-hop latencies.

User file sharing is supported. Julie O'Grady, LucidLink's marketing director, told us: "Filespaces enables teams to collaborate on the same file, no matter where they are located."

All access to the object store is carried out using parallel links to speed file reads and writes. All file reads result in partial file transmission with pre-fetching of data that is likely to be needed next. File writes are packaged by the Filespaces cache node, and then compressed, encrypted and multiplexed across parallel links to the back-end object store. The net effect of the caching, local metadata look-up and pre-fetching is that remote object access can be as fast as local filer access.
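
LucidLink has not published its internals, but the access pattern described above (block caching, ranged partial reads from the object store, and speculative pre-fetch on parallel connections) can be sketched roughly as below. This is an illustrative Python sketch only; `InMemoryObjectStore.get_range` is a hypothetical stand-in for an S3 ranged GET, not a LucidLink API.

```python
import concurrent.futures

BLOCK_SIZE = 1 << 20   # 1 MiB blocks
PREFETCH_AHEAD = 4     # blocks fetched speculatively after each read

class InMemoryObjectStore:
    """Stand-in for an S3-compatible store; get_range mimics a ranged GET."""
    def __init__(self, objects):
        self.objects = objects

    def get_range(self, key, offset, size):
        return self.objects[key][offset:offset + size]

class CachingReader:
    """Serves file reads from cached blocks, pre-fetching likely-next blocks."""
    def __init__(self, store, key):
        self.store, self.key = store, key
        self.cache = {}  # block index -> bytes
        self.pool = concurrent.futures.ThreadPoolExecutor(max_workers=PREFETCH_AHEAD)

    def _fetch(self, block):
        if block not in self.cache:
            self.cache[block] = self.store.get_range(
                self.key, block * BLOCK_SIZE, BLOCK_SIZE)
        return self.cache[block]

    def read(self, offset, size):
        first, last = offset // BLOCK_SIZE, (offset + size - 1) // BLOCK_SIZE
        data = b"".join(self._fetch(b) for b in range(first, last + 1))
        # Speculatively fetch the blocks likely to be needed next, in parallel.
        for b in range(last + 1, last + 1 + PREFETCH_AHEAD):
            self.pool.submit(self._fetch, b)
        return data[offset - first * BLOCK_SIZE:][:size]

store = InMemoryObjectStore({"video.mov": bytes(8 * BLOCK_SIZE)})
reader = CachingReader(store, "video.mov")
chunk = reader.read(0, 4096)   # served from block 0; blocks 1-4 pre-fetched
```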

In May 2019 IBM formed a partnership with file collaborator Panzura to provide a Freedom Cloud file-access front end to its object store. The focus there was on file sharing. Today's LucidLink-IBM partnership focuses on file access speed.

IBM is supporting access to all its US data centres together with its UK and Australian data centres. The bundle does not support on-premises IBM COS deployments.

The IBM COS/Filespaces bundle is available now from LucidLink. Users are charged per account at $10.00 per user per month, with a six-user minimum. It will provide users access to all of IBM's worldwide regions.

The one trick to finally cleaning out your photo gallery – Komando

The first fully digital camera was released in 1989 by Fujifilm. If you were lucky enough to own one, that means you've been taking amazing photos for over 30 years. That probably also means that you have tons of images floating around.

Even if you only started taking photos with a smartphone, that amounts to an enormous collection of digital memories. At some point, you need to sort through and organize your photos in a central storage location.

But where do you begin if you have thousands upon thousands of photos? It's a daunting task, but there's a simple trick to making it easier on yourself.

Don't try to do it all at once. If you're staring at thousands of photos or just swiping through your phone's gallery and thinking, "How in the world do I sort all this?" you're not alone. But don't think about the end goal. Take it a small chunk at a time.

That can mean sorting through one day, one week or one month of photos while youre watching TV. Or maybe you turn on your favorite podcast and sort pictures until you finish an episode.

No matter how you decide to split things up, stay consistent. Chip away a little at a time and before you know it, that mess is contained.

Need help getting the ball rolling? Here are some tips and tricks for cleaning up your photos.

You might have photos stored in places you don't realize. The most obvious would be digital storage like your smartphone or your computer's hard drive.

But what about the old camera that you threw into the junk drawer years ago? There is a good chance the memory card still has some images on it that haven't been backed up anywhere yet.

Make a list of all the places where you could still have photos stored:

The next step can be laborious, but it is necessary to save storage and money. If you have been diligent about placing your photos in marked (or dated) folders, going through them shouldn't be too much of a pain. But you should make sure not to store duplicate images.

But since you could have photos spanning decades from various devices, you will need some help in sorting through them. Luckily there is an app for that, and Remo is perfect for the job.

Available for Android and iOS devices, Remo Duplicate Photos Remover is an app that scans your phone for duplicates and deletes them for you.

If you are working on a Mac, have a look at Photos Duplicate Cleaner. For Windows-based machines, Duplicate Cleaner helps you locate and delete similar photos, and it works for other file types, too.
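
The apps above do this automatically, but the core idea of exact-duplicate detection is simple enough to sketch. The snippet below is illustrative only (not how Remo or Duplicate Cleaner work internally): it flags files whose contents hash identically; commercial tools add perceptual hashing to also catch near-duplicates.

```python
# Find groups of byte-identical image files under a folder.
import hashlib
import os
import sys
from collections import defaultdict

def find_exact_duplicates(root):
    seen = defaultdict(list)  # content hash -> list of file paths
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.lower().endswith((".jpg", ".jpeg", ".png", ".heic")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            seen[digest].append(path)
    return [paths for paths in seen.values() if len(paths) > 1]

if __name__ == "__main__":
    for group in find_exact_duplicates(sys.argv[1]):
        print("Duplicates:", *group, sep="\n  ")
```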

You have gathered up all your photos and removed duplicates. Now what? Well, the next step in organizing is to decide on the central platform that will host them. You could take a hybrid approach by storing them in the cloud and on a dedicated hard drive, or you can go with just a cloud option.

Here are some options:

If you have a hard drive on your computer with enough storage space, you can save all your images on it. But there are some risks involved, like hard drive failures or viruses. That's why a secure cloud backup is a must.

Dumping photos onto your hard drive isn't the best solution. It's better to use professional software to help organize them. Open-source digital photo management software like digiKam is great for this purpose. It runs on Linux, Windows and macOS and is a tool for importing, managing, editing and sharing photos.

In combination with cloud storage, we recommend storing your photos securely on an external hard drive. External drives are less likely to fail and can easily be locked away in a safe. Some external drives to consider:

WD (or Western Digital) has been making popular external drives for years. This USB 3.0 drive is compatible with Windows PCs, Mac, PlayStation 4 and Xbox. It comes in four varieties ranging from 1TB ($49.00) to 5TB ($99.99).

Solid-state drives (SSD) are a bit more expensive than conventional ones, but they offer faster transfer speeds. Since this SSD is compatible with USB-C and USB 3.2, it has transfer speeds of up to 1050MB/s. It has drop protection and IP55 water and dust resistance.

If you're looking for something a bit more powerful than a normal external drive, the My Book Desktop could be perfect. With up to 28TB of RAID storage and 360MB/s read speeds, the My Book Desktop external storage also features an automatic backup function.

EraDB Unveils Log Management Alternative That Cuts Cloud Hardware Costs by 80 Percent – Business Wire

SEATTLE--(BUSINESS WIRE)--EraDB, the database company that helps enterprises manage hyperscale, cloud-native workloads, today announced EraSearch, an Elasticsearch-compatible alternative for log management built on EraDB technology. EraSearch drastically reduces the complexity of ingesting, storing, and exploring large volumes of logs, and dramatically reduces the resources required to operate existing solutions. By improving on a traditional decoupled storage and compute architecture, EraSearch is able to provide lightning-fast access to logs in real time, while ensuring that durable copies of data always live within object storage, such as Amazon's Simple Storage Service (S3), Google Cloud Storage (GCS), or Microsoft Azure Blob Storage. Available now, EraSearch cuts cloud hardware costs by up to 80 percent and operational costs by up to 75 percent. This new log management offering benefits companies of all sizes, whether they are ingesting gigabytes, terabytes, or petabytes of data every day.
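
Because EraSearch presents an Elasticsearch-compatible interface, existing Elasticsearch clients and tooling should in principle work against it unchanged. A minimal sketch of what that means in practice, using the standard Elasticsearch document-index REST call; the endpoint URL and index name here are hypothetical, not published EraSearch details.

```python
import requests

ERASEARCH_URL = "http://localhost:9200"  # hypothetical endpoint

log_event = {
    "@timestamp": "2021-02-11T12:00:00Z",
    "level": "error",
    "service": "checkout",
    "message": "payment gateway timeout",
}

# POST to the standard Elasticsearch document-index endpoint (/index/_doc).
resp = requests.post(f"{ERASEARCH_URL}/logs/_doc", json=log_event)
resp.raise_for_status()
print(resp.json())
```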

Blog post: Transforming log management with EraSearch

"From a first-principles perspective, the architecture of existing solutions like Elasticsearch is fundamentally incompatible with the current and future growth of log data, particularly in a cloud-native world. Built on decades-old technology, these tools were never designed to handle data at this scale. As a result, customers are now experiencing exorbitantly high costs and crushing operational toil for what should ostensibly be a simple storage and retrieval problem," said Todd Persen, Co-Founder and CEO of EraDB. "By applying a modern, cloud-native architecture, we have solved these traditional limitations by removing them entirely and instead offer customers a low-cost, limitlessly scalable solution that can be run in any environment. Our goal is for our customers to effortlessly store and explore logs from any source, at any scale, at any level of complexity, and with EraSearch we have realized that vision."

EraDB solves hard problems | Modernizes data storage for a cloud-native world

Within every organization, data volumes are growing exponentially, going from millions to billions or trillions of high-cardinality data points generated daily. All companies deal with these growing volumes of data across departmental silos, mainframes, and legacy systems. According to IDC, the total amount of data created and consumed during 2020 reached 59 zettabytes and will continue to grow at a 26 percent CAGR through 2024 [1]. Companies rely on data-driven insights to remain competitive, but still struggle to effectively store data at scale. As a result, engineers spend an increasing amount of time fighting with data infrastructure, while those who need to analyze it are left waiting and unable to find the insights they need. In particular, logs have become business-critical: they are essential to troubleshooting systems, can be used in machine learning and predictive workflows, and are a cornerstone of auditing and compliance.

Many companies have chosen Elasticsearch for their log management workloads because of its perceived status as an affordable, open-source option. However, once these companies see their stored log volumes increase beyond a few terabytes, they discover that Elasticsearch is painfully inefficient and difficult to scale, which leads to exploding costs. For many companies these costs begin to eat significantly into their operating margins, while also requiring them to dedicate a significant portion of their headcount to manual Elasticsearch management.

Alongside EraDB Co-Founder and CEO Todd Persen, who was previously Co-Founder and CTO of InfluxData, a team of database, distributed systems, and machine learning experts has assembled to solve a core set of problems that applies across all time-series data management, most notably log management. Their first product, EraSearch, is a demonstration of the transformative change the EraDB architecture can bring to established, inefficient segments of the larger data landscape.

Founded in June 2019, the company raised $7M in January of 2020 from venture capital firms Foundation Capital, Array Ventures, Global Founders Capital, and world-class angel investors to build a product focused on making it easier to work with massive time-series datasets. The company currently has paying customers and a solid pipeline. EraDB pinpointed the log management problem as the first data problem to solve and will then apply its core design principles to other time-series problems.

EraSearch highlights:

Supporting Quotes

"At Foundation Capital, we understand that core database technology is hard to build and requires not just a re-factor, but a reboot. The market will continue to grow as every organization will need to use ridiculous amounts of data to drive insights and analytics to remain competitive," said Joanne Chen, General Partner, Foundation Capital. "Todd and the EraDB team have unparalleled experience from the first generation of time-series database tech and they are uniquely positioned to usher in the next generation of core tech. EraSearch is just the start."

"EraSearch offers organizations a compelling alternative for log management that is easy to deploy and manage, while giving them real-time views of their logs," said Mike Leone, principal analyst, Enterprise Strategy Group. "Built on the EraDB platform by a team that understands solving time-series problems, businesses now can adopt a cloud-native architecture that is scalable and performant, at a fraction of the cost of other offerings available today."

Additional Resources: EraDB blog | EraSearch

Connect with EraDB: Twitter | LinkedIn

About EraDB

EraDB is a database architecture built on the core principles of decoupled storage and compute, a true zero-schema data representation, and flexible indexing powered by machine learning, all of which allow you to significantly reduce the size, cost, and complexity of your data while still giving you lightning fast queries across vast datasets. Our first verticalized product, EraSearch, incorporates these innovations into an Elasticsearch-compatible interface that is focused on serving the enterprise log management market. Find out more at http://www.era.co and http://www.erasearch.co.

[1] IDC, Global DataSphere, 08 May 2020

Global Data Center Construction Market Report 2021-2026: Growing Demand for Cloud Applications & Rising Adoption of Hyperscale Data Centers -…

DUBLIN--(BUSINESS WIRE)--The "Data Center Construction Market - Growth, Trends, COVID-19 Impact, and Forecasts (2021 - 2026)" report has been added to ResearchAndMarkets.com's offering.

The data center construction market is expected to register a CAGR of 8.34% over the forecast period 2021-2026.

The development of advanced technologies such as the internet of things, software-defined data centers, and disaster recovery has fueled demand for data center construction globally. Further, new data center designs should be based on customer needs and business requirements, and should be scalable, manageable, space-optimized, standardized (e.g., TIA, BICSI, Uptime Institute, EN 50600, CISCA, LEED, ANSI/ASHRAE Standard 90.4-2016), secure, modular, flexible and efficient.

Growing demand for cloud applications is driving the market, as rising utilization of cloud storage fuels demand; according to Oracle's Cloud Predictions 2019, approximately 80% of all enterprise (and mission-critical) workloads will move to the cloud. As the number of files stored in the cloud has increased rapidly, the percentage of files that contain sensitive data has also grown, today standing at 21%, an increase of 17% over the past two years.

Further, the adoption of software-defined data centers in healthcare drives the market. Medical expenditure is increasing every year, and cost optimization from the customer perspective is also in focus. According to the Institute for Health Metrics and Evaluation, by 2040 healthcare spending is expected to increase to about USD 18.28 trillion, with high-income countries expected to spend around USD 9,019 per person in 2040. While healthcare organizations are still at a nascent stage in modernizing their IT infrastructure, the adoption of software-defined data centers in healthcare institutions will streamline data management and unify networking, servers, and storage with better agility.

Furthermore, state and local municipalities in the North America region are competing to attract more data center construction to their markets. For instance, Texas has passed tax incentive legislation providing complete relief from sales taxes on business property. This factor is vital to data center operations over the next 10 to 15 years for large data center users.

Further, the Covid-19 pandemic struck at a time when demand for data center capacity was expanding rapidly, and it has placed restrictions on the construction of new facilities. The market has seen definite effects; for example, data network traffic has grown with the increased usage of applications such as Zoom and Microsoft Office.

However, according to TrendForce, Q2 2020 saw 3.2 million server unit shipments. Moreover, the pandemic has created challenges for data center construction due to disruption in the supply chain, which will slow market growth.

The data center sector needs to modify its building practices to continue to deliver new projects under this pandemic. Even under the restrictions put in place during the pandemic, data center construction has continued: AWS opened a region in South Africa, and Digital Realty's Interxion announced plans for a campus in Paris.

Competitive Landscape

The data center construction market is fragmented in nature, as it consists of several major players. With technological advancements and product innovations, many companies are increasing their presence in the market by securing new contracts and by tapping new markets, which makes for intense rivalry. Key players include IBM Corporation and SAS Institute Inc., among others.

Key Topics Covered:

1 INTRODUCTION

1.1 Study Assumptions and Market Definition

1.2 Scope of the Study

2 RESEARCH METHODOLOGY

3 EXECUTIVE SUMMARY

4 MARKET INSIGHTS

4.1 Market Overview

4.2 Industry Attractiveness - Porter's Five Force Analysis

4.3 Technology Snapshot

5 MARKET DYNAMICS

5.1 Market Drivers

5.1.1 Growing Demand for Cloud Applications

5.1.2 Rising Adoption of Hyperscale Data Centers

5.2 Market Restraints

5.2.1 Increase in Power Outages and Power Consumption

6 MARKET SEGMENTATION

6.1 By Infrastructure

6.1.1 Electrical Infrastructure

6.1.1.1 UPS Systems

6.1.1.2 Other Electrical Infrastructure

6.1.2 Mechanical Infrastructure

6.1.2.1 Cooling Systems

6.1.2.2 Racks

6.1.2.3 Other Mechanical Infrastructure

6.1.3 General Construction

6.2 By Tier Type (Qualitative Trend Analysis)

6.2.1 Tier-I and -II

6.2.2 Tier-III

6.2.3 Tier-IV

6.3 By Size of Enterprise

6.3.1 Small and Medium-scale Enterprises

6.3.2 Large-scale Enterprises

6.4 By End-User

6.4.1 BFSI

6.4.2 IT and Telecommunications

6.4.3 Government and Defense

6.4.4 Healthcare

6.4.5 Other End-Users

6.5 By Geography

7 COMPETITIVE LANDSCAPE

7.1 Company Profiles

7.1.1 IBM Corporation

7.1.2 SAS Institute Inc.

7.1.3 Turner Construction Co.

7.1.4 DPR Construction Inc.

7.1.5 Fortis Construction Inc.

7.1.6 Hensel Phelps Construction Co. Inc.

7.1.7 HITT Contracting Inc.

7.1.8 JE Dunn Construction Group Inc.

7.1.9 MA Mortenson Company Inc.

7.1.10 AECOM Ltd.

7.1.11 Gilbane Building Company Inc.

7.1.12 Clune Construction Company, L.P.

7.1.13 Nabholz Corporation

7.1.14 RagingWire Data Centers Inc.

7.1.15 CyrusOne Inc.

7.1.16 Cyxtera Technologies Inc.

8 INVESTMENT ANALYSIS

9 FUTURE OF THE MARKET

For more information about this report visit https://www.researchandmarkets.com/r/13em3e

Evaluation of Private Cloud Storage Market 2021-2026: Recent Industry Developments and Growth Strategy – The Bisouv Network

The research report on the Private Cloud Storage Market added by AllTheResearch consists of growth opportunities, development trends, and forecasts to 2026. The global Private Cloud Storage Market size was valued at US$ 6,603.3 Mn in 2018 and is expected to grow at a CAGR of 16% over the forecast period ending 2026, reaching a market value of US$ XX Mn. The Global Private Cloud Storage Market report covers a brief overview of the segments and sub-segmentations, including the product types, applications, companies & regions. This report describes the overall Private Cloud Storage Market size by analyzing historical data and future forecasts.

Private Cloud Storage Market Research Objective:

Request for Sample Copy with Complete TOC and Figures & Graphs at https://www.alltheresearch.com/sample-request/412

Top players Covered in Private Cloud Storage Market Study are:

The above-mentioned companies were scrutinized to assess the competitive landscape of the global Private Cloud Storage market. The report provides company profiles of each player. Each profile includes company product portfolio, business overview, company governance, company financials, business strategies, manufacturing locations and production facilities, company sales, recent developments, strategic collaborations & partnerships, new product launches, company segments, application diversification, and company strength and weakness analysis.

Ask Your Queries to our Analyst regarding Private Cloud Storage Report at https://www.alltheresearch.com/speak-to-analyst/412

This Private Cloud Storage market report provides insights on new trade regulations, import-export analysis, industry value chain analysis, market size, consumption, production analysis, capacity analysis, regional and segment market share, product launches, product pipeline analysis, the impact of Covid-19 on the supply chain, key regions, untapped markets, patent analysis, product approvals, continuous innovations, and developments in the Market.

Based on type, the Private Cloud Storage market report is split into:

Based on application, the Private Cloud Storage market is segmented into:

Regional analyses covered in this report:

For more Customization, Connect with us at https://www.alltheresearch.com/customization/412

Report Coverage

Major Points in Table of Content of Private Cloud Storage Market

To Buy the Full Report, Connect with us at https://www.alltheresearch.com/buy-now/412

Important Questions Answered by Global Private Cloud Storage Market Report

For More Details Contact Us:

Contact Name: Rohan

Email: [emailprotected]

Phone: +1 (407) 768-2028

IBM and ExxonMobil are building quantum algorithms to solve this giant computing problem – ZDNet

Research teams from energy giant ExxonMobil and IBM have been working together to find quantum solutions to one of the most complex problems of our time: managing the tens of thousands of merchant ships crossing the oceans to deliver the goods that we use every day.

The scientists lifted the lid on the progress that they have made so far and presented the different strategies that they have been using to model maritime routing on existing quantum devices, with the ultimate goal of optimizing the management of fleets.

ExxonMobil was the first energy company to join IBM's Quantum Network in 2019, and has expressed a keen interest in using the technology to explore various applications, ranging from the simulation of new materials to solving optimization problems.

Now, it appears that part of the energy company's work was dedicated to tapping quantum capabilities to calculate journeys that minimize the distance and time traveled by merchant ships across the globe.

On a worldwide scale, the equation is immense: intractable, in fact, for classical computers. About 90% of world trade relies on maritime shipping, with more than 50,000 ships, themselves carrying up to 200,000 containers each, moving around every day to transport goods with a total value of $14 trillion.

The more the number of ships and journeys increases, the bigger the problem becomes. As IBM and ExxonMobil's teams put it in a blog post detailing their research: "Logistically speaking, this isn't the 'traveling salesperson problem.'"

While this type of exponentially growing problem can only be solved with simplifications and approximations on classical computers, the challenge is well-suited to quantum technologies. Quantum computers can effectively leverage superposition, a special dual state taken on by quantum bits, or qubits, to work through many calculations at once, meaning that even the largest problems could be resolved in much less time than is possible on a classical computer.

"We wanted to see whether quantum computers could transform how we solve such complex optimization problems and provide more accurate solutions in less computational time," said the researchers.

Although the theory behind the potential of quantum computing is well-established, it remains to be seen how quantum devices can be used in practice to solve a real-world problem such as the global routing of merchant ships. In mathematical terms, this means finding the right quantum algorithms that could be used to most effectively model the industry's routing problems on current or near-term devices.

To do so, IBM and ExxonMobil's teams started with widely-used mathematical representations of the problem, which account for factors such as the routes traveled, the potential movements between port locations and the order in which each location is visited on a particular route. There are many existing ways to formulate the equation, one of which is called the quadratic unconstrained binary optimization (QUBO) technique, and which is often used in classical computer science.

The next question was to find out whether well-known models like QUBO can be solved with quantum algorithms and, if so, which solvers work better. Using IBM's Qiskit optimization module, which was released last year to assist developers in building quantum optimization algorithms, the team tested various quantum algorithms labeled with suitably exotic names: the Variational Quantum Eigensolver (VQE), the Quantum Approximate Optimization Algorithm (QAOA), and Alternating Direction Method of Multipliers (ADMM) solvers.
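
The team's actual routing models are in their paper, but the workflow described here (formulate a QUBO, hand it to a quantum solver) looks roughly like this in Qiskit's optimization module. A minimal sketch assuming the 2021-era qiskit-optimization API and a toy objective, not the ExxonMobil/IBM maritime model.

```python
from qiskit import Aer
from qiskit.algorithms import QAOA
from qiskit.algorithms.optimizers import COBYLA
from qiskit.utils import QuantumInstance
from qiskit_optimization import QuadraticProgram
from qiskit_optimization.algorithms import MinimumEigenOptimizer

# Toy QUBO over three binary variables:
# minimize x0 + x1 + 2*x0*x1 - 2*x2
qubo = QuadraticProgram("toy_routing")
for name in ("x0", "x1", "x2"):
    qubo.binary_var(name)
qubo.minimize(linear={"x0": 1, "x1": 1, "x2": -2},
              quadratic={("x0", "x1"): 2})

# Wrap QAOA (one of the solvers named above) as a QUBO optimizer
# and run it on the local simulator.
qaoa = QAOA(optimizer=COBYLA(), reps=1,
            quantum_instance=QuantumInstance(Aer.get_backend("qasm_simulator")))
result = MinimumEigenOptimizer(qaoa).solve(qubo)
print(result)  # optimal value and variable assignment, e.g. x0=0, x1=0, x2=1
```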

After running the algorithms on a simulated quantum device, the researchers found that models like QUBO could effectively be solved by quantum algorithms, and that depending on the size of the problem, some solvers showed better results than others.

In another promising finding, the team said that the experiment showed that some degree of inexactness in solving QUBOs is tolerable. "This is a promising feature to handle the inherent noise affecting the quantum algorithms on real devices," said the researchers.

Of course, while the results suggest that quantum algorithms could provide real-world value, the research was carried out on devices that are still technically limited, and the experiments can only remain small-scale. The idea, however, is to develop working algorithms now, to be ready to harness the power of a fully fledged quantum computer when the technology develops.

"As a result of our joint research, ExxonMobil now has a greater understanding of the modelling possibilities, quantum solvers available, and potential alternatives for routing problems in any industry," said the researchers.

What applies to merchant ships, in effect, can also work in other settings. Routing problems are not inherent to the shipping industry, and the scientists confirmed that their findings could easily be transferred to any vehicle optimization problem that has time constraints, such as goods delivery, ride-sharing services or urban waste management.

In fact, ExxonMobil is not the first company to look at ways to use quantum computing techniques to solve optimization problems. Electronics manufacturer OTI Lumionics, for example, has been using QUBO representations to find the optimal simulation of next-generation OLED materials. Instead of using gate-based quantum computers to run the problem, however, the company has been developing quantum-inspired algorithms to solve calculations on classical Microsoft Azure hardware, with encouraging results.

The mathematical formulas and solution algorithms are described in detail in the research paper, and the ExxonMobil/IBM team stressed that their use is not restricted. The researchers encouraged their colleagues to reproduce their findings to advance the global field of quantum solvers.

Kangaroo Court: Quantum Computing Thinking on the Future – JD Supra

The promise of quantum computers is that certain computational tasks might be executed exponentially faster on a quantum processor than on a classical processor.

Quantum computing is a beautiful fusion of quantum physics with computer science. It incorporates some of the most stunning ideas of physics from the twentieth century into an entirely new way of thinking about computation. Quantum computers have the potential to resolve problems of high complexity and magnitude across many different industries and applications, including finance, transportation, chemicals, and cybersecurity. Solving the impossible in a few hours of computing time.

Quantum computing is often in the news: China teleported a qubit from Earth to a satellite; Shor's algorithm has put our current encryption methods at risk; quantum key distribution will make encryption safe again; Grover's algorithm will speed up data searches. But what does all this really mean? How does it all work?

Today's computers operate in a very straightforward fashion: they manipulate a limited set of data with an algorithm and give you an answer. Quantum computers are more complicated. After multiple units of data are input into qubits, the qubits are manipulated to interact with other qubits, allowing for several calculations to be done simultaneously. That's where quantum computers are a lot faster than today's machines.

Quantum computers have four fundamental capabilities that differentiate them from todays classical computers:

All computations involve inputting data, manipulating it according to certain rules, and then outputting the final answer. For classical computation, the bit is the basic unit of data. For quantum computation, this unit is the quantum bit, usually shortened to qubit.

The basic unit of quantum computing is the qubit. A classical bit is either 0 or 1. If it's 0 and we measure it, we get 0. If it's 1 and we measure it, we get 1. In both cases the bit remains unchanged. The standard example is an electrical switch that can be either on or off. The situation is totally different for qubits. Qubits are volatile. A qubit can be in one of an infinite number of states, a superposition of both 0 and 1, but when we measure it, as in the classical case, we just get one of two values, either 0 or 1. Unlike the classical case, the act of measurement changes the qubit. Qubits can also become entangled: when we make a measurement of one of an entangled pair, it affects the state of the other. What's more, qubits interact with other qubits, and these interactions are what make it possible to conduct multiple calculations at once.
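
The superposition-and-measurement behaviour described here can be seen in a few lines of Qiskit (a minimal sketch on the local simulator, assuming a 2021-era Qiskit install): a Hadamard gate puts a qubit into an equal superposition, and repeated measurement returns 0 about half the time and 1 the other half.

```python
from qiskit import Aer, QuantumCircuit, execute

qc = QuantumCircuit(1, 1)
qc.h(0)            # Hadamard: equal superposition of |0> and |1>
qc.measure(0, 0)   # measurement collapses it to a single classical bit

backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1000).result().get_counts()
print(counts)      # roughly {'0': ~500, '1': ~500}
```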

Nobody really knows quite how or why entanglement works. It even baffled Einstein, who famously described it as "spooky action at a distance." But it's key to the power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power. But thanks to entanglement, adding extra qubits to a quantum machine produces an exponential increase in its number-crunching ability.

These three things, superposition, measurement, and entanglement, are the key quantum mechanical ideas. Controlling these interactions, however, is very complicated. The volatility of qubits can cause inputs to be lost or altered, which can throw off the accuracy of results. And creating a computer of meaningful scale would require hundreds of thousands or even millions of qubits to be connected coherently. The few quantum computers that exist today can handle nowhere near that number. But the good news is we're getting very, very close.

Quantum computing and classical computing are not two distinct disciplines. Quantum computing is the more fundamental form of computing: anything that can be computed classically can be computed on a quantum computer. The qubit, not the bit, is the basic unit of computation. Computation, in its essence, really means quantum computing. A qubit can be represented by the spin of an electron or the polarization of a photon.

In 2019 Google achieved a level of quantum supremacy when it reported the use of a processor with programmable superconducting qubits to create quantum states on 54 qubits (one of which was not usable), corresponding to a computational state-space of dimension 2^53 (about 10^16). This incredible achievement was slightly short of the mission goal of creating quantum states on 72 qubits. What is so special about that number? Classical computers can simulate quantum computers if the quantum computer doesn't have too many qubits, but as the number of qubits increases we reach the point where that is no longer possible.

There are 8 possible three-bit combinations: 000, 001, 010, 011, 100, 101, 110, 111. The number 8 comes from 2^3: there are two choices for the first bit, two for the second and two for the third, and we multiply the three 2s together. If instead of bits we switch to qubits, each of these 8 three-bit strings is associated with a basis vector, so the vector space is 8-dimensional. If we have 72 qubits, the number of basis elements is 2^72. This is about 4,000,000,000,000,000,000,000. It is a large number and is considered to be the point at which classical computers cannot simulate quantum computers. Once quantum computers have more than 72 or so qubits, we truly enter the age of quantum supremacy, when quantum computers can do computations that are beyond the ability of any classical computer.

To provide a little more perspective, let's consider a machine with 300 qubits. This doesn't seem an unreasonable number in the not-too-distant future. But 2^300 is an enormous number; it's more than the number of elementary particles in the known universe. A computation using 300 qubits would be working with 2^300 basis elements.
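
The counting behind these claims is plain integer arithmetic and easy to check directly:

```python
# Sanity-checking the exponential growth of the quantum state space.
basis_72 = 2**72
basis_300 = 2**300
print(f"{basis_72:.3e}")    # ~4.7e21 basis elements at 72 qubits
print(f"{basis_300:.3e}")   # ~2.0e90 basis elements at 300 qubits
print(basis_300 > 10**80)   # True: exceeds ~10^80 particles in the known universe
```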

Some calculations required for the effective simulation of real-life scenarios are simply beyond the capability of classical computers; these are known as intractable problems. Quantum computers, with their huge computational power, are ideally suited to solving them. Indeed, some problems, like factoring, are hard on a classical computer but easy on a quantum computer. This creates a world of opportunities, across almost every aspect of modern life.

Healthcare: classical computers are limited in terms of size and complexity of molecules they can simulate and compare (an essential process of early drug development). Quantum computers will allow much larger molecules to be simulated. At the same time, researchers will be able to model and simulate interactions between drugs and all 20,000+ proteins encoded in the human genome, leading to greater advancements in pharmacology.

Finance: one potential application is algorithmic trading using complex algorithms to automatically trigger share dealings based on a wide variety of market variables. The advantages, especially for high-volume transactions, are significant. Another application is fraud detection. Like diagnostics in healthcare, fraud detection is reliant upon pattern recognition. Quantum computers could deliver a significant improvement in machine learning capabilities; dramatically reducing the time taken to train a neural network and improving the detection rate.

Logistics: Improved data analysis and modelling will enable a wide range of industries to optimize workflows associated with transport, logistics and supply-chain management. The calculation and recalculation of optimal routes could impact on applications as diverse as traffic management, fleet operations, air traffic control, freight and distribution.

It is, of course, impossible to predict the long-term impact of quantum computing with any accuracy. Quantum computing is now in its infancy, and the comparison to the first computers seems apt. The machines that have been constructed so far tend to be large and not very powerful, and they often involve superconductors that need to be cooled to extremely low temperatures. To minimize the interaction of quantum computers with the environment, they are always protected from light and heat. They are shielded against electromagnetic radiation, and they are cooled. One thing that can happen in cold places is that certain materials become superconductors (they lose all electrical resistance), and superconductors have quantum properties that can be exploited.

Many countries are experimenting with small quantum networks using optic fiber. There is the potential of connecting these via satellite and forming a worldwide quantum network. This work is of great interest to financial institutions. One early impressive result involves a Chinese satellite devoted to quantum experiments. It's named Micius, after a Chinese philosopher who did work in optics. A team in China connected to a team in Austria, the first time that intercontinental quantum key distribution (QKD) had been achieved. Once the connection was secured, the teams sent pictures to one another: the Chinese team sent the Austrians a picture of Micius, and the Austrians sent a picture of Schrodinger to the Chinese.

To actually make practical quantum computers you need to solve a number of problems, the most serious being decoherence: the problem of your qubit interacting with something from the environment that is not part of the computation. You need to set a qubit to an initial state and keep it in that state until you need to use it. Qubits' quantum state is extremely fragile. The slightest vibration or change in temperature, disturbances known as noise in quantum-speak, can cause them to tumble out of superposition before their job has been properly done. That's why researchers do their best to protect qubits from the outside world in supercooled fridges and vacuum chambers.

Alan Turing is one of the fathers of the theory of computation. In his landmark paper of 1936 he carefully thought about computation. He considered what humans did as they performed computations and broke it down to its most elemental level. He showed that a simple theoretical machine, which we now call a Turing machine, could carry out any algorithm. But remember, Turing was analyzing computation based on what humans do. With quantum computation the focus changes from how humans compute to how the universe computes. Therefore, we should think of quantum computation as not a new type of computation but as the discovery of the true nature of computation.

New EU Consortium shaping the future of Quantum Computing USA – PRNewswire

Europe has always been excellent in academic research, but over the past few decades commercializing research projects has been slow compared to the international competition. This is starting to change with quantum technologies. In one of the largest efforts in Europe and worldwide, Germany announced €2 billion of funding for quantum programs in June 2020, of which €120 million is being invested in this current round of research grants.

Today, IQM announced that a quantum project consortium including Europe's leading startups (ParityQC, IQM), industry leaders (Infineon Technologies), research centers (Forschungszentrum Jülich), supercomputing centers (Leibniz Supercomputing Centre), and academia (Freie Universität Berlin) has been awarded €12.4 million by the German Ministry of Education and Research (BMBF) (announcement in German).

The scope of the project is to accelerate commercialization through an innovative co-design concept. The project focuses on application-specific quantum processors, which have the potential to create a fast lane to quantum advantage. The digital-analog concept used to operate the processors will further lay the foundation for commercially viable quantum computers. The project will run for four years and aims to develop a 54-qubit quantum processor.

The project is intended to support the European FET Flagship project EU OpenSuperQ, announced in 2018, which is aimed at designing, building, and operating a quantum information processing system of up to 100 qubits. Deploying digital-analog quantum computing, this consortium adds a new angle to the OpenSuperQ project and widens its scope. With efforts from Munich, Berlin and Jülich, as well as ParityQC from Austria, the project builds bridges and seamlessly integrates into the European quantum landscape.

"The grant from the Federal Ministry of Education and Research of Germanyis a huge recognition of our unique co-design approach for quantum computers. Last year when we established our office in Munich, this was one of our key objectives. The concept allows us to become a system integrator for full-stack quantum computers by bringing together all the relevant players. As Europe's leading startup in quantum technologies, this gives us confidence to further invest in Germany and other European countries" said Dr. Jan Goetz, CEO of IQM Quantum Computers.

As a European technology leader, Germany is taking several steps to lead the quantum technology race. An important role of such leadership is to bring together European startups, industry, research centers and academic partners. This project will give the quantum landscape in Germany an accelerated push and will create a vibrant quantum ecosystem in the region for the future.

Additional Quotes:

"DAQC is an important project for Germany and Europe. It enables us to take a leading role in the area of quantum technologies. It also allows us to bring quantum computing into one of the prime academic supercomputing centres to more effectively work on the important integration of high-performance computing and quantum computing. We are looking forward to a successful collaboration," said Prof. DrMartinSchulz, Member of the Board of Directors, Leibniz Supercomputing Centre (LRZ).

"The path towards scalable and fully programmable quantum computing will be the parallelizability of gates and building with reduced complexity in order to ensure manageable qubit control. Our ParityQC architecture is the blueprint for a fully parallelizable quantum computer, which comes with the associated ParityOS operating system. With the team of extraordinary members of the DAQC consortium this will allow us to tackle the most pressing and complex industry-relevant optimization problems." saidMagdalena Hauser & Wolfgang Lechner, CEOs & Co-founder ParityQC

"We are looking forward to exploring and realizing a tight connection between hardware and applications, and having DAQC quantum computers as a compatible alternative within the OpenSuperQ laboratory. Collaborations like this across different states, and including both public and private partners, have the right momentum to move quantum computing in Germany forward." saidProf. Frank Wilhelm-Mauch, Director, Institute for Quantum Computing Analytics, Forschungszentrum Jlich

"At Infineon, we are looking forward to collaborating with top-class scientists and leading start-ups in the field of quantum computing in Europe. We must act now if we in Germany and Europe do not want to become solely dependent on American or Asian know-how in this future technology area. We are very glad to be part of this highly innovative project and happy to contribute with our expertise in scaling and manufacturing processes." saidDr.Sebastian Luber, Senior Director Technology & Innovation, Infineon Technologies AG

"This is a hugely exciting project. It is a chance of Europe and Germany to catch up in the development of superconducting quantum computers. I am looking forward to adventures on understanding how such machines can be certified in their precise functioning." said Prof.Jens Eisert, Professor of Quantum Physics, Freie Universitt Berlin

About IQM Quantum Computers:

IQM is the European leader in superconducting quantum computers, headquartered in Espoo, Finland. Since its inception in 2018, IQM has grown to 80+ employees and has established a subsidiary in Munich, Germany, to lead the co-design approach. IQM delivers on-premises quantum computers for research laboratories and supercomputing centers and provides complete access to its hardware. For industrial customers, IQM delivers quantum advantage through a unique application-specific co-design approach. IQM has raised €71 million from VC firms and public grants and is building Finland's first quantum computer.

For more information, visit http://www.meetiqm.com.

Registered offices:

IQM Finland Oy, Keilaranta 19, 02150 Espoo, Finland, http://www.meetiqm.com

IQM Germany GmbH, Nymphenburgerstr. 86, 80636 München, Germany

IQM: Facts and Figures

Founders:

Media Contact: Raghunath Koduvayur, Head of Marketing and Communications, [emailprotected], +358504876509

Photo - https://mma.prnewswire.com/media/1437806/IQM_Quantum_Computers_Founders.jpg Photo - https://mma.prnewswire.com/media/1437807/IQM_Quantum_computer_design.jpg Logo - https://mma.prnewswire.com/media/1121497/IQM_Logo.jpg

SOURCE IQM Finland Oy

Quantum Computers May Steal Bitcoin by Deriving Private Keys once Advanced Enough in 5-30 Years, Experts Claim – Crowdfund Insider

John Smith, who has been regularly keeping up with computer science, quantum computing, and cryptocurrency-related developments, claims that the future of crypto is quantum-resistant, meaning we must build systems that can protect themselves against the potential attack from quantum computers (QCs) when they become powerful enough to present a challenge to digital asset networks.

While discussing what the future threat to Bitcoin (BTC) from quantum computing might be, and how big of a deal it really is, Smith claims that the threat is that quantum computers will eventually be able to break Bitcoin's current digital signatures, which could render the network insecure and cause it to lose value.

He goes on to ask why there isn't already a solution as trivial as simply upgrading the signatures. He explains that this might not be possible due to the decentralized nature of Bitcoin and other large crypto-asset networks such as Ethereum (ETH).

While discussing how long until someone actually develops a quantum computer that can steal BTC by quickly deriving private keys from their associated public keys, Smith reveals that serious estimates range somewhere from 5 to over 30 years, with the median expert opinion being around 15 years.

Smith added:

"Banks/govts/etc. will soon upgrade to quantum-resistant cryptography to secure themselves going forward. Bitcoin, however, with large financial incentives for attacking it and no central authority that can upgrade *for* users, faces a unique set of challenges."

Going on to mention the main challenges, Smith notes that we can separate vulnerable BTC into three classes, including lost coins (which are estimated to be several million), non-lost coins residing in reused/taproot/otherwise-vulnerable addresses, and coins in the mempool (i.e., being transacted).

Beginning with lost coins, why are they even an issue? Because it's possible to steal a huge number all at once and then sell them in mass quantities, which could tank the entire crypto market. He added that if such an attack seems imminent, the market could preemptively tank. He also mentioned that an attacker may profit greatly by provoking either of the above and shorting BTC.

While proposing potential solutions, Smith suggests preemptively burning lost coins via a soft fork (a backwards-compatible upgrade). He clarifies that just how well this works will depend on:

He further noted:

"Another potential way around the problem of millions of lost BTC is if a benevolent party were to steal & then altruistically burn them. Not clear how realistic this is, given the financial incentives involved & who the parties likely to have this capability would be."

He added:

"Moving on: why are non-lost coins with vulnerable public keys an issue? This is self-evident. The primary threat to the wealth of BTC holders is their BTC being stolen. And as with lost coins, a related threat is that the market starts to fear such an attack is possible."

He also mentioned that another solution could be that Bitcoin adds a quantum-resistant signature and holders proactively migrate. He points out that how well this all works will depend on:

While discussing the vulnerability of coins in the mempool, Smith mentioned that it could complicate migration to quantum-resistant addresses *after* large QCs are built or it could greatly magnify the threat posed by an unanticipated black swan advance in QC.

While proposing other solutions, Smith noted:

"A commit-reveal tx scheme can be used to migrate coins without mempool security. This gets around the vulnerability of a user's old public key by adding an extra encryption/decryption step based on their new quantum-resistant key, but w/ crucial limitations."
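
The mechanics of such a scheme are easiest to see with a hash commitment. The sketch below is illustrative only, not an actual Bitcoin transaction format: the owner first publishes a hash binding the spend to a new quantum-resistant key, revealing nothing an eavesdropper could exploit, then later reveals the transaction, which anyone can check against the commitment.

```python
import hashlib

def commit(spend_tx: bytes, qr_pubkey: bytes) -> bytes:
    # Phase 1: publish only this digest; it exposes neither the old
    # vulnerable public key nor the details of the spend.
    return hashlib.sha256(spend_tx + qr_pubkey).digest()

def verify_reveal(commitment: bytes, spend_tx: bytes, qr_pubkey: bytes) -> bool:
    # Phase 2: the owner reveals the transaction; anyone can check it
    # matches the earlier commitment, so a front-runner cannot hijack it.
    return hashlib.sha256(spend_tx + qr_pubkey).digest() == commitment

tx, key = b"spend-to-quantum-resistant-address", b"new-qr-pubkey"
c = commit(tx, key)
assert verify_reveal(c, tx, key)
```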

He added:

"Considerations w/ commit-reveal migration [are that] it's not foolproof unless a user starts with their coins stored in a non-vulnerable address, because attackers can steal any vulnerable coins simply by beating the original owner to the punch."

"Considerations with commit-reveal migration are also that commit transactions introduce technical hurdles (vs. regular txs) & increase the load on the network. Neither of these is insurmountable by any means, but they suggest that this method should not be relied upon too heavily," Smith claims.

He also noted that how well the commit-reveal transaction type works will depend on:

He added:

"One potential way around the network overhead & just plain hassle of commit-reveal migration would be if a highly efficient quantum-resistant zero-knowledge proof were discovered. Current QR ZK algorithms are far too large to use in Bitcoin, but that could change. Worth noting."

While sharing other potential solutions, Smith noted that there's also the option to "tank the attack & rebuild."

He pointed out that Bitcoin's network effects are massive, so it is challenging to accurately estimate or predict what the crypto ecosystem will look like in the future, but the potential economic disruption of BTC failing may incentivize extraordinary measures to save the network.

He added:

"Bitcoin's ability to tank a quantum-computing-related market crash will depend on [whether there's] another chain capable of replacing BTC as the main crypto store of value [and whether] BTC [can] avoid a mining death spiral. Also, how far will stakeholders go to ensure the network survives & rebounds?"

Smith also mentioned that for people or institutions holding Bitcoin, some good measures may be purchasing insurance, and/or hedging BTC exposure with an asset that would be expected to increase in value in the case of an attack.

2020 Quantum Communications in Space Research Report: Quantum Communications are Expected to Solve the Problem of Secure communications First on…

Dublin, Feb. 11, 2021 (GLOBE NEWSWIRE) -- The "Quantum Communications in Space" report has been added to ResearchAndMarkets.com's offering.

The modern world relies more and more on information exchange using data transfer technologies.

Private and secure communications are fundamental for the Internet, national defence and e-commerce, thus justifying the need for a secure network with the global protection of data. Information exchange through existing data transfer channels is becoming prone to hacker attacks causing problems on an international scale, such as interference with democratic elections, etc.

In reality, the scale of the "hacking" problem is considerable: in 2019, British companies were reportedly hit by about 5,000 "ransomware" attacks that paid out more than $200 million to cyber criminals [1]. During the first half of 2020, $144.2 million had already been lost in 11 of the biggest ransomware attacks [2]. Communications privacy is therefore of great concern at present.

The reasons for the growing privacy concerns are [3]: the planned increase of secure (encrypted) data traffic rates from the current 10 to a future 100 Gbit/s; annual increases in data traffic of 20-25%; and the application of fibre optic cables not only for mainstream network lines but also for the "final mile" to the end-user. These developments are accompanied by [3]: growing software vulnerabilities; more powerful computational resources available to hackers at lower costs; possible quantum computer applications for encryption cracking; and the poor administration of computer networks.

Conventional public key cryptography relies on the computational intractability of certain mathematical functions.

Applied conventional encryption algorithms (DH, RSA, ECDSA, TLS/SSL, HTTPS, IPsec, X.509) are good in that there is currently no way to find a key (of sufficient length) within any acceptable time. Nevertheless, it is possible in principle, and there are no guarantees against the future discovery of a fast factorization algorithm for classical computers, or against the implementation of already-known algorithms on a quantum computer, either of which would make cracking conventional encryption possible. Another "hacking" strategy involves substitution of the original data. A final vulnerability comes from encryption keys being potentially stolen. Hence, the demand exists for a truly reliable and convenient encryption system.
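
The computational asymmetry this paragraph leans on is easy to demonstrate in miniature (a toy sketch with artificially small primes, nothing like real key sizes): multiplying two primes is one instruction, while recovering them by brute-force trial division takes visibly longer, and the gap widens astronomically at real 2048-bit key sizes.

```python
import time

p, q = 1000003, 1000033          # toy primes; real RSA moduli use ~1024-bit primes
n = p * q                        # "encryption direction": instant

start = time.time()
f = next(d for d in range(2, n) if n % d == 0)   # brute-force factoring
elapsed = time.time() - start
print(f, n // f, f"recovered in {elapsed:.3f}s") # noticeably slower, even this small
```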

Quantum communications are expected to solve the problem of secure communications first on international and national scales and then down to the personal level.

Quantum communication is a field of applied quantum physics closely related to quantum information processing and quantum teleportation [4]. Its most interesting application is protecting information channels against eavesdropping by means of quantum cryptography [4].

Quantum communications are considered secure because any tampering with them is detectable. Thus, quantum communications can be trusted in the knowledge that any eavesdropping would leave its mark.

By quantum communications two parties can communicate secretly by sharing a quantum encryption key encoded in the polarization of a string of photons.

This quantum key distribution (QKD) idea was proposed in the mid-1980s [5]. QKD theoretically offers a radically new, information-theoretically secure solution to the key exchange problem, ensured by the laws of quantum physics. In particular, QKD allows two distant users, who do not share a long secret key initially, to generate a common, random string of secret bits, called a secret key.

Used with one-time-pad encryption, this key has been proven secure [6] for encrypting/decrypting a message, which can then be transmitted over a standard communication channel. The information is encoded in the superposition states of physical carriers at the single-quantum level, where photons, the fastest-traveling qubits, are usually used. Any eavesdropper on the quantum channel attempting to gain information about the key will inevitably introduce disturbance to the system that can be detected by the communicating users.
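
The logic of the mid-1980s proposal (known as BB84) is simple enough to simulate classically. The sketch below is illustrative only: random bits are encoded in random bases, measured in random bases, and only the positions where the bases matched are kept as the sifted key; a real system encodes the bits in photon polarizations rather than Python integers.

```python
import secrets

N = 32
alice_bits  = [secrets.randbelow(2) for _ in range(N)]
alice_bases = [secrets.randbelow(2) for _ in range(N)]  # 0 = rectilinear, 1 = diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(N)]

# If Bob's basis matches Alice's he reads her bit; otherwise his result is random.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: publicly compare bases (not bits) and keep only matching positions.
sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
          if ab == bb]
key = [a for a, b in sifted]
assert all(a == b for a, b in sifted)  # with no eavesdropper, sifted bits agree
print("shared key bits:", key)
```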

Key Topics Covered:

1. INTRODUCTION

2. Quantum Experiments at a Space Scale (QUESS)
2.1. European root of the Chinese project
2.2. Chinese Counterpart
2.3. The QUESS Mission set-up
2.3.1. Spacecraft
2.3.2. Ground stations
2.3.3. Project budget
2.4. International cooperation
2.5. Results
2.6. Tiangong-2 Space Lab QKD

3. Future plans

4. Comparison to alternatives
4.1. Small Photon-Entangling Quantum System
4.2. Hyperentangled Photon Pairs
4.3. QEYSSat
4.4. Reflector satellites
4.5. GEO satellite communications
4.6. Airborne
4.7. Ground
4.7.1. Moscow quantum communications line
4.7.2. Telephone & optical line communications

5. CONCLUSIONS

REFERENCES

Companies Mentioned

For more information about this report visit https://www.researchandmarkets.com/r/li9vd4
