
The New School at SXSW: The Art, Design, and Social Good of Quantum Computing – The New School News

March 24, 2022

Quantum Computing sounds like a program you would find at a university focused on technology, like MIT or Stanford, not a school known for its design, music, and social science programs. But understanding that this new technology is positioned to shape and change the world, The New School has embarked on an initiative to explore the applications of quantum computing in art, design, education, business, and social justice. Recently the individuals bringing quantum computing to The New School spoke at the 2022 South by Southwest Conference and Festivals (SXSW) to discuss how this emerging technology can be integrated into creative arts and applied to advance social good.

During the panel, Lin Zhou, senior vice president and chief information officer at The New School; Sven Travis, associate professor of media and design at Parsons School of Design; and Maya Georgieva, senior director of The New School's Innovation Center, discussed the importance of having artists, designers, and social researchers participate in the early development of quantum computing.

"It's extremely rare for creatives to get access to technology in the early days of development. One of the things we're hoping for is that the evolution of quantum could happen in a different way than, for example, artificial intelligence and machine learning," said Travis. "We can go back to any number of technologies over the last couple of decades where we're getting access to it or engaging with it usually after the technology is fairly fully developed."

The computing we're accustomed to, which drives laptops, desktops, websites, and smartphones, takes in information coded as the value of either 1 or 0. In contrast, quantum computing can take in information that exists in more than one state, such as a 1 and a 0 at the same time. The combination of The New School's strength in liberal arts with this cutting-edge technology makes the new course different from those in traditional STEM university programs.

Although quantum computing is still an emerging field, the importance it will have prompted the university to be proactive in bringing this subject to students. "Whenever there is a technology breakthrough, usually the leading uses are not in the liberal arts. If you think about artificial intelligence [AI], the leading uses for AI are in financial technology, cyber security, and facial and voice recognition. Liberal arts is usually an afterthought. When those problems are figured out, then they say, 'How about music? How about design? How about fashion?'" said Zhou. "This has to stop, because the arts, music, and design impact people's daily lives. Whenever we have a new technology, liberal arts ought to be one of the front-runners as new technology is adopted."

Many liberal arts and design colleges look at computer coding as the new literacy, but Zhou shared how creatives, social researchers, and technologists should take a more holistic view toward technology. "In the past, when we talked about literacy, we usually talked about reading and writing. But for this century, it's not enough. When we talk about literacy, we actually mean that everybody should be able to create harmony with technology. Quantum, as the next emerging and breakthrough technology, has profound capability to solve problems that the classical computer cannot solve today. So, from the point of view of all the higher education institutions, we have the obligation to help society adopt the technology," said Zhou. "We know if we don't do it right with privacy, with social justice, those issues, which seem to be very simple, will backfire on us."

Part of Georgieva's mission is to engage the community with emerging technologies. "The opportunity for us is to create events where people can come together, so that students can have a real conversation about their own ideas. It's important to us to give them that space, access to tools, and opportunities to play," said Georgieva.

"Bringing an emerging technology, frontier technology, and code as a language, into a creative setting is really fascinating and opens up imaginative projects that may not necessarily take place in a lab. We want to see that impact. We want to be part of explaining what it would mean to live in a world where quantum computing and art is one expression," she adds.

Citing the university's history of innovation and commitment to social change since its founding, Zhou believes The New School has an important role to play in the development of quantum computing. "With The New School, for the past 100 years, we have produced world-class thinkers, designers, and social justice movers. We will continue to focus on leveraging quantum computing, this wonderful technology, on the social front, and leveraging the technology to improve the human condition," said Zhou.


Elderly care? Bring in the robots! – Modern Diplomacy

What is quantum computing? Why do we need quantum computing? According to Moore's law (the complexity of a microcircuit, measured for example by the number of transistors per chip, doubles every 18 months and hence quadruples every 3 years), the density of transistors per unit area on a computing chip doubles every year and a half, which poses two main problems for traditional computers. Firstly, on the computation side, high-density transistors will face problems of power consumption and thermal effects. Secondly, the reduction in size will cause the classical theory of transistors to fail, and their performance will deviate from the original design.

Both of these problems will limit the further shrinkage of transistors, thus putting an end to Moore's law. Yet even if traditional computers keep developing until the end of Moore's law, they will still be unable to cope with many problems that need to be solved. Say we calculate the ground-state energy of N coupled two-level systems: the number of unknowns will be proportional to 2^N. A specific computation that takes about 200 seconds on Google's 53-qubit quantum computer currently requires about 2.5 days of simulation on IBM's supercomputer. Qubit is the contraction of quantum bit, the term coined by Benjamin Schumacher to denote the quantum bit, i.e. the basic unit of quantum information.
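To make the 2^N scaling concrete, here is a short illustrative Python sketch (my addition, not from the article): it counts the complex amplitudes a classical state-vector simulation must store, assuming 16 bytes per amplitude as in a typical double-precision simulator. The figures are rough orders of magnitude, not benchmarks.

```python
# Illustrative: the state vector of N qubits holds 2**N complex
# amplitudes; at 16 bytes per complex128 amplitude, the memory a
# classical simulator needs grows exponentially with N.
for n in (10, 30, 53, 100):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30  # bytes -> GiB
    print(f"N = {n:3d}: {amplitudes:.2e} amplitudes, ~{gib:.2e} GiB")
```

At N = 53, the scale of Google's processor, the state vector alone would occupy on the order of a hundred petabytes, which is why direct simulation strains even supercomputers.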

As the number of qubits continues to increase, conventional computers will soon reach a bottleneck, and almost all conventional computations involving quantum mechanics face the same problem. Hence many researchers started thinking about how to use quantum properties themselves as computational resources as early as the 1970s, an idea Richard Feynman then summarised in 1982.

So what advantages do qubits have over traditional computing? The most striking are the properties of quantum superposition and quantum entanglement. Quantum superposition is a non-classical state that contrasts with everyday intuition; the standard metaphor is Schrödinger's cat, which is both alive and dead.

The superposition state, however, is a real state for qubits on microscopic or mesoscopic scales (spatial scales, viewpoints and the like that are intermediate between macroscopic and microscopic scales). Qubits can be found in the superposition of two characteristic quantum states, and this superposition is a non-classical state in which being and non-being coexist in the quantum world. In this state, the qubit is neither definitely 0 nor definitely 1; rather, each outcome is possible with a definite probability, equal in this case, like a coin before it lands on the palm of the hand.
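A minimal NumPy sketch (an illustration of the idea, not anything from the article) treats the qubit as a two-component state vector: the equal superposition is a perfectly definite state, yet measurements of it come out 0 or 1 with equal probability. The random sampling below stands in for repeated measurements, not for real hardware.

```python
import numpy as np

# |0> and |1> as basis vectors; psi is the equal superposition
# (|0> + |1>) / sqrt(2) -- a definite state, like the spinning coin.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
psi = (ket0 + ket1) / np.sqrt(2)

probs = np.abs(psi) ** 2               # Born rule: |amplitude|^2
print(probs)                           # [0.5 0.5]

# Repeated "measurements" reproduce the 50/50 statistics.
rng = np.random.default_rng(0)
outcomes = rng.choice([0, 1], size=10_000, p=probs)
print(outcomes.mean())                 # close to 0.5
```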

While in the visible world it is possible to observe a phenomenon without perceptibly influencing it by observation alone (i.e. only by looking at the said phenomenon), in atomic physics and quantum mechanics a finite and, up to a certain point, uncontrollable perturbation accompanies every observation. The uncertainty principle is the recognition of absolute chance and arbitrariness in natural phenomena. On the other hand, as will become clear later, quantum mechanics does not predict a single, well-defined result for an observation or for any observer.

The fact that qubits can undergo quantum evolution in a set of superposition states which is neither 0 nor 1 implies quantum parallelism in the relevant computation. The evolution of each qubit on its own, however, is not sufficient to construct all possible evolutions of a multi-qubit system. We must therefore also make different qubits interact, so that they become entangled, in order to construct a satisfactory algorithm for such a computation. This special kind of superposition is precisely what is called an entangled quantum state.

Take two qubits in a typical entangled state. The state of the first qubit is tied to the state of the second, and the two correlated configurations are themselves in quantum superposition; we therefore cannot speak of the state either qubit is in at that moment on its own, and hence we speak of entanglement.

There is a more practical view of entanglement in quantum computing: entangled states usually arise from the control of one qubit (the control qubit) over another (the target qubit). The relationship between the control qubit and the target qubit is similar to the aforementioned Schrödinger's cat. On this view, if the controlling part is in a state of superposition, the controlled part will be in a superposition of the different controlled situations.
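The control/target picture can be sketched in a few lines of NumPy (again my illustration, not the article's own material): a Hadamard gate puts the control qubit into superposition, and a CNOT then flips the target only in the branch where the control is 1, yielding the Bell state (|00> + |11>)/√2.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # flips the target
                 [0, 1, 0, 0],                 # only when the
                 [0, 0, 0, 1],                 # control qubit is 1
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # |00>
state = np.kron(H, I) @ state                  # control into superposition
state = CNOT @ state                           # entangle control and target
print(np.round(state, 3))                      # [0.707 0 0 0.707]
```

The final vector has weight only on |00> and |11>: neither qubit has a definite state of its own, which is exactly the situation described above.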

This entanglement process is an important element in quantum computing. We can say that superposition and entanglement synergistically weave the varied parallel evolution of quantum computing. Each measurement, however, can only return one of the possible states, and the superposition no longer exists after the first measurement. Hence, to obtain the statistical information we need about the superposition state, we have to run the computation and measure the result again and again.

Therefore, in many quantum algorithms (such as Shor's algorithm for factoring, which solves the problem of decomposing integers into prime factors, and digital quantum simulation), we need to use interference mechanisms after the computation, so that the phase information containing the answer in the superposition state is preserved by constructive interference while the remaining information is eliminated by destructive interference. In this way, the answer can be obtained with fewer measurements. Most quantum algorithms rely heavily on superposition and interference, so the relative phase is very important for quantum computing; this is what is called quantum coherence. In the hardware design of quantum computers, many considerations concern how to protect the quantum state so as to prolong the coherence lifetime.
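The smallest possible demonstration of amplitude interference (my illustration) is applying the Hadamard gate twice to |0>. After one application the qubit is in equal superposition; on the second, the two paths leading to |1> carry opposite signs and cancel (destructive interference), while the paths to |0> add (constructive interference), restoring |0> exactly. It is this sensitivity to relative phase that coherence protects.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)

once = H @ ket0
twice = H @ once
print(np.round(once, 3))    # [0.707 0.707]  equal superposition
print(np.round(twice, 3))   # [1. 0.]  the paths to |1> cancelled
```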

Quantum computers have a variety of hardware implementations, but the design considerations are similar. There are three common ones: qubit operability, measurability, and protection of quantum states. In response to these considerations, cavity quantum electrodynamics (cQED) systems have been developed; a superconducting quantum system can serve as an example of how a quantum computer is implemented. Because the resonant cavity and the qubit differ in frequency, the coupling between them tends not to exchange energy quanta but only to generate entanglement, so the frequency of the resonant cavity shifts with the state of the qubit. Hence the state of the qubit can be deduced by measuring the microwave transmission or reflection spectrum near the resonant frequency with the bit readout line.

The entanglement mechanism between adjacent qubits is provided by the capacitive coupling between cross-shaped capacitors, and the coupling effect is controlled by the frequency difference between adjacent qubits. The oscillating behaviour reflects the quantum interference effect, and its gradual disappearance marks the decay of coherence and quantum energy.

The coherence lifetime of qubits is influenced by two factors, one intrinsic and one extrinsic. The extrinsic influence comes mainly from the coupling between the qubit and the quantum-state readout circuit. A filter-like mechanism in the microwave cavity between the qubit and the readout line can protect the qubit because the cavity and the qubit have a frequency difference of about 718 MHz. The intrinsic influence comes mainly from the loss of the qubit itself and the sensitivity of its frequency to various types of noise, which can usually be suppressed by improved materials and processes and by optimisation of the geometric structure.

Quantum computing has a wide range of applications, currently spanning decryption and cryptography, quantum chemistry, quantum physics, optimisation problems and artificial intelligence. This covers almost all aspects of human society and will have a significant impact on human life once put into practice. However, the best quantum computers are not yet able to express the advantages of quantum computing. Although the number of qubits on a quantum computer has exceeded 50, the circuit depth required to run useful algorithms is far from sufficient. The main reason is that the error rate of qubits during computation is still very high, even though we can resort to quantum error correction and fault-tolerant quantum computation. Gradually improving that accuracy will greatly increase the difficulty of producing the hardware and the complexity of the algorithms. At present, the implementation of some well-known algorithms has only reached the level of conceptual demonstration, which is sufficient to show the feasibility of quantum computing, but practical application still has a long way to go.

But we should remain optimistic because, although general quantum computation still awaits better quantum computing hardware, we can still find new algorithms and applications, and hardware development can make great strides, just as traditional computers did in their early days. In line with this goal, many existing technological industries could be upgraded in the near future. Research is moving fast thanks also to significant public and private investment, and the first commercial applications will be seen in the short term.

Considering defence and intelligence issues, many governments are funding research in this area. The People's Republic of China and the United States of America have launched multi-year plans worth billions of yuan and dollars. The European Union has also established the Quantum Flagship Programme with an investment of one billion euros.



Get a lifetime of VPN protection plus 10TB of cloud storage and backup for $63 – ZDNet


While our modern digital world offers almost unlimited convenience, it's nearly outweighed by the degree of risk we face, such as having our most sensitive information compromised online or losing what's stored locally on our computers. Fortunately, we can now protect our data permanently from both situations with the Lifetime Backup & Security Subscription Bundle. You can even use coupon code DOWNLOADNOW to save an additional 30% during our Best of Digital Sale and get it for only $62.99.

We know we need to back up the files on our computer because losing them would cause chaos. Degoo makes that effortless by backing up your data automatically, encrypting it and even replicating it at the same time. Degoo's Premium Plan also gives you a generous 10TB of cloud storage, which you can access for life.

Users are very happy with what Degoo offers. It's earned an average rating of 4.4 stars out of 5 from almost 600,000 reviews on Google Play and 4.5 stars out of 5 from over 6,500 reviews on Apple's App Store.

KeepSolid VPN Unlimited is the second half of this bundle, and it's the bestselling VPN of all time for many reasons. With this deal, you get unlimited speed and bandwidth on up to five devices, along with maximum security and privacy. That includes military-grade encryption, zero logging and a kill switch. And with more than 500 servers in over 80 locations around the globe, you can also enjoy your favorite content no matter where you are.

Reviewers and users alike love KeepSolid VPN. With more than 10 million customers, it has been named "Best VPN for Laptop" by Laptop Review Pro. Additionally, VPN Special notes, "KeepSolid VPN Unlimited offers amazing services, and its advanced features make it a solid VPN service provider."

Whether working from home, training for a new career or teaching online courses of your own, chances are you've been creating more files and spending more time online over the last couple of years than ever before. And that means backup storage and VPN protection are more critical than ever.

It's hard to imagine that you can buy so much peace of mind for so little. But you can use coupon code DOWNLOADNOW today to save an additional 30% during our Best of Digital Sale and get the Lifetime Backup & Security Subscription Bundle for only $62.99.


4 Actionable Ways to Cut Your Organization’s Cloud Costs in Half – TechGenix

Enterprise cloud adoption has been soaring over the past decade. To top it off, traditional business models are still suffering from continued Covid restrictions. That's why you may be looking for new ways to operate in a cloud-native world. Cloud expenses can also get problematic if you don't monitor them. That can lead cloud bills to skyrocket. That's why companies must reduce cloud costs and optimize usage. That said, reductions must reflect realistic cloud workloads and not result in an excessive reduction.

Learn how you can gain control over your cloud costs in this article. You can usually reduce these by half and end expensive mistakes. Let's begin with understanding your cloud pricing and where all the money is going.

Cloud costs can rapidly rise when companies don't have good strategies to deal with them. That's why you want to, first, plan for and manage dozens of variables. You also want to address a complex web of cloud resources and know where your money is going. Here are the 2 main expenses that result in high cloud costs.

Companies usually fail to understand the basic differences between operating and capital expenses. Knowing the operating nature of the cloud and the capital-expense cost is critical.

Budgeting and managing cloud resources is a continuous process. This is because no visible capital expense exists in cloud services. Costs accumulate over time and depend on usage, duration, size, and workload attributes.

You have the option to scale resources based on project requirements. Yet, administrators tend to overestimate the resources they need for a given workload. This margin of safety ends up costing companies money for services that aren't running.

You may see overprovisioning when resources exist but aren't accessible. This happens when projects aren't archived. Users either leave them available just in case or forget to clean up after themselves.

Instances end up running even after they're no longer needed. One person might also spin up an instance for a short while and forget to shut it down for hours, days, or months at a time. That may happen when companies focus on deadlines more than efficiency.

That leads storage, data volumes, and costs to grow over time. Charges also accumulate every passing month through compounding. Developers, in turn, might have an idea of how much storage they have but not how often they use it. That leads to overpaying for high-availability access, which might not be useful.

Let's take a look at how exactly you can optimize your cloud costs.

When your cloud costs are always shifting and decision-making is remote, problems arise. Understanding expenses can get difficult, along with utility. That's why optimizing your cloud costs is becoming an important need for businesses that use the cloud. To help you, try applying some kind of organizational policy across teams. That'll ensure you cut your cloud costs in half.

The first thing to do is decide which applications and data need to go to the cloud. Use cloud-optimized software applications. That helps you take advantage of storage tiering and dynamic scaling. That means leaving legacy applications on local infrastructure.

Another way to optimize your cloud costs is to use cloud services that offer billing APIs. You can also use services that offer portals with visualization for budgeting.

Companies can also look at the resources and cost for applications before using them. This allows you to explore expiration dates and remove cloud resources that don't have tags.

FinOps is a new discipline that gets accountants and engineers to set usage policies. That provides a company with cost-effective decision-making. Centralized purchasing rates with various options also provide teams with cost visibility. FinOps also helps with resource accountability.

Let's now take a look at the top 4 cost-cutting methods you can use during the optimization process.

When cutting costs, you want a starting point. That's why we've compiled the top 4 cloud cost-cutting methods for you to use.

To optimize your resource usage and cut costs, you can buy committed use discounts. They lock you into one- or three-year commitments. Here, you get to decide the amount of compute, memory, and other resources you want to buy at a discounted price. You can then distribute resources between machines and at a rate you specify. A committed use discount enables you to optimize your compute costs. That's possible through analyzing your VM spending trends. You can then decide on either single or multiple contracts according to your need.
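As a back-of-the-envelope illustration, the arithmetic looks like this; the discount rates and the monthly spend below are made-up assumptions for the sketch, not quoted vendor pricing.

```python
# Hypothetical on-demand spend vs. committed-use spend.
ON_DEMAND_MONTHLY = 1_000.00              # assumed VM spend per month
DISCOUNTS = {"1-year commitment": 0.37,   # assumed discount rates
             "3-year commitment": 0.55}

for term, discount in DISCOUNTS.items():
    monthly = ON_DEMAND_MONTHLY * (1 - discount)
    saving = ON_DEMAND_MONTHLY - monthly
    print(f"{term}: ${monthly:,.2f}/month (saves ${saving:,.2f})")
```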

Google's Cloud Free Program (CFP) offers many features for effective cloud costing. That includes a 90-day Free Trial that provides $300 in free Cloud Billing credits. The idea here is to help you explore Google Cloud and Maps Platform services.

A Free Tier feature enables businesses to use certain Google Cloud products for free. It works by providing you with the full utility but with monthly usage limits. If you stay within the limits, you won't pay even after your trial ends.

The Projects feature helps you group cloud resources to understand your spending better.

You may also like the Google Cloud Billing feature. That's a dashboard that explores the trends in your spending. It predicts how much you're likely to spend in the future, allowing you to change your spending habits.

Establishing budgets for cloud services and ensuring all the teams are aware of them is a must. A budget dashboard is a staple for teams using cloud services. You can set up specific budgeting for different services and display them to users.

You can also set automated alerts for spend thresholds. That enables relevant teams or users to inspect and analyze the overrun.
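The logic behind such an alert is simple enough to sketch in a few lines of Python. This is provider-agnostic pseudologic with made-up figures, not a real billing API; in practice the spend number would come from your cloud's billing export.

```python
BUDGET = 5_000.00              # hypothetical monthly budget
THRESHOLDS = (0.5, 0.8, 1.0)   # alert at 50%, 80%, and 100%

def check_spend(month_to_date: float, notify=print) -> None:
    """Fire a notification for every threshold the spend has crossed."""
    for t in THRESHOLDS:
        if month_to_date >= BUDGET * t:
            notify(f"ALERT: ${month_to_date:,.2f} spent, past "
                   f"{t:.0%} of the ${BUDGET:,.2f} budget")

check_spend(4_100.00)   # fires the 50% and 80% alerts
```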

A cost breakdown report provides a complete view of what you spent. That includes details of on-demand costs and any usage credits. Tracking your savings can also help you become more efficient in the future.

Now that you know all the key cost-cutting measures, let's look at the governance you'll need.

Cloud governance defines how users work with cloud computing services daily. Some of its principles consist of budget, cost trends, and cost optimization policies. These rules help identify if a budget is enough. These also reduce costs by keeping you in the know about committed use discounts or credits.

Governance tools enable you to track the cost and cloud usage. They alert you when the total usage may exceed a limit. A robust governance solution will contain proactive protocols that automate tasks. That helps ensure user compliance with protocols. In turn, this saves your company both time and money.

When it comes to cloud costs and planning, every company has different needs. Your cloud strategy should optimize your workflow and increase efficiency. That said, managing cloud storage costs isn't as daunting as it seems.

You can benefit from cost-cutting methods like committed use discounts. These can save you money by helping you know your needs better. Google's Cloud Free Program and budget alerts can also help you. Implementing cost breakdown reporting ensures user and team accountability and transparency.

Don't pay more than you need to for the cloud. Instead, save a lot by following the simple steps I mentioned in this post.

Google's Cloud Free Program is a free trial that provides $300 in free credits to explore features. It also provides 20+ free products and a few operations. That's a great way for you to explore a cloud platform's capabilities without breaking the bank.

It's a cost management tool or service that provides a value-based pricing model for cloud usage. You can create an estimate with it for your cloud use cases and model a solution before building it. This is a handy tool when estimating cloud costs ahead of time.

Cloud storage is no different from virtual disk space, measured in GB. You can buy storage for on-demand usage or a fixed period. GB/TB/PB measure storage usage and data transfer. These costs may seem small at the start but can spiral out of control if you don't monitor them.

Durable Reduced Availability (DRA) storage lets you store data at a lower cost. This storage type has lower data availability than others. It's useful for longer-term storage needs for non-critical operations. You can expect DRA to have the same durability as standard storage pools. In general, it gives you more flexibility and options as you look to save on cloud storage costs.

A budget can either manage the entire Cloud Billing account or focus on a specific set of resources. That includes subaccounts, projects, products and services, and labels. This is useful to attribute costs to specific departments within your organization.

Yes, the number of projects on Google Cloud Platform (GCP) has a limit. Yet, once you reach your quota, you can request an increase by filling out a request form. In this form, you'll need to specify the number of extra projects you'll need. That said, that's only applicable to larger organizations. Remember you can always create separate accounts for each team to reduce the need to do this. It's also a great way to keep track of your cloud costs.

Visit the TechGenix website to stay up-to-date on popular technology news.

Understand Cloud Cost Management including its purpose, advantages, and best practices.

Read about the pricing wars between the big cloud vendors.

Check out 5 ways to stop your cloud costs from increasing here.

Get 5 smart strategies to reduce your cloud costs in this article.

Learn about the Google Cloud Free Program here.


Google Cloud is making a major change to its VMs – TechRadar

Google Cloud customers will now be able to suspend their virtual machines (VMs) when not in use which will help lower their cloud spending.

The software giant's cloud computing division has announced that its new Suspend/Resume feature for VMs is now generally available after launching in alpha several years ago.

The new feature works in a similar way to closing the lid of your laptop or putting your desktop PC to sleep. By suspending a Google Compute Engine VM, the state of your instance will be saved to disk so that you can pick up later right where you left off.

The best part about Suspend/Resume in Google Cloud, though, is that customers will no longer need to pay for cores or RAM while their VMs are in a suspended state. However, they will still need to pay the cloud storage costs of their instance memory, as well as other VM running costs like OS licensing, though these may be reduced.

When a Google Cloud customer suspends an instance, an ACPI S3 signal is sent to the instance's operating system just like when you close a laptop's lid or put a PC to sleep.

The company makes the case that using this type of signal allows for broad compatibility with a wide selection of OS images, so that customers don't have to use a cloud-specific OS image or install daemons. At the same time, undocumented and custom OS images that respond to the ACPI S3 signal may also work with Google Cloud's Suspend/Resume feature.

It's also worth noting that storage is dynamically provisioned when Suspend is requested and is separate from the instance's boot disk. Other cloud providers require users to ensure that they have sufficient space in their boot disk to save instance states, which may increase the costs of running VMs.

In a new blog post announcing the general availability of Suspend/Resume, Google Cloud also pointed out that the feature can be used by organizations to deal with demand spikes: they can initialize instances with their critical applications and then suspend them so that they can be resumed later. Although Compute Engine instances can be created quickly, resuming an instance is much faster than creating an entirely new instance.
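For the curious, suspending an instance programmatically goes through the Compute Engine v1 API's instances.suspend method. The sketch below uses the Google API discovery client under the assumption that application-default credentials are configured; the project, zone, and instance names are placeholders.

```python
from googleapiclient import discovery

compute = discovery.build("compute", "v1")

# Suspend runs as a zonal operation; resume() is the mirror call.
operation = (
    compute.instances()
    .suspend(project="my-project", zone="us-central1-a", instance="my-vm")
    .execute()
)
print(operation["status"])
```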


Cloud Storage Market Analysis by Size, Business Strategies, Share, Growth, Trends, Revenue, Competitive Landscape and Developments Forecast by 2029 -…

The study and estimations of an excellent Cloud Storage Market report help to figure out types of consumers, their views about the product, their buying intentions, and their ideas for the improvement of a product. With the market data of this report, emerging trends along with major drivers, challenges, and opportunities in the market for this industry can be identified and analysed. For a clear and better understanding of facts and figures, the data is represented in the form of graphs and charts. With the studies, insights, and analysis mentioned in the finest Cloud Storage market report, get a comprehensible idea about the marketplace with which business decisions can be taken quickly and easily.

A market survey performed in the Cloud Storage business report helps to unearth important information about buyer personas, target audience, current customers, market, competition, and more, e.g. demand for the product or service, potential pricing, impressions of the branding, etc. The report is prepared by using several steps such as surveys. This research contains a variety of question types, like multiple choice, rankings, and open-ended responses. It also has quantitative and short-answer questions that save time and help to more easily draw conclusions. The categories of questions asked in the market survey while generating the Cloud Storage marketing report include demographic, competitor, industry, brand, and product.

Download Sample Copy of the Report to understand the structure of the complete report @ https://www.databridgemarketresearch.com/request-a-sample/?dbmr=global-cloud-storage-market

The cloud storage market is expected to gain market growth in the forecast period of 2022 to 2029. Data Bridge Market Research analyses the market to rise to USD 1,943.6 million by 2029, growing at a CAGR of 24.41% in the above-mentioned forecast period.

Cloud Storage Market Key Trends Analysis

The important factors influencing the growth of the Cloud Storage market have been examined in this report. The driving factors that are boosting demand for cloud storage and the restraining factors that are slowing the industry's growth are addressed in depth, as well as their implications for the worldwide Cloud Storage market. In addition, the published analysis identifies and discusses in detail the trends that are driving the market and impacting its growth. Other qualitative variables, such as risks connected with operations and key problems faced by market players, are also covered in the report.

Cloud Storage Market Strategic Analysis

The market was studied using several marketing methodologies such as Porter's Five Forces analysis, player positioning analysis, SWOT analysis, market share analysis, and value chain analysis. The market dynamics and factors such as the threat of a Cloud Storage substitute, the threat of new entrants into the Cloud Storage market, buyer bargaining power, supplier bargaining power over Cloud Storage providing companies, and internal rivalry among Cloud Storage providers are analysed in Porter's Five Forces analysis to provide the report's readers with a detailed view of the current market dynamics.

This analysis assists report users in evaluating the Cloud Storage market based on various parameters such as economies of scale, switching costs, brand loyalty, existing distribution channels, capital investments, manufacturing rights & patents, government regulations, advertising impact, and consumer preference impact. This simplified data is expected to aid the industry's key decision-makers in their decision-making process. Furthermore, this study answers the crucial question of whether or not new entrants should enter the Cloud Storage industry.

Read Detailed Index of full Research Study @https://www.databridgemarketresearch.com/reports/global-cloud-storage-market

Leading Key Players Operating in the Cloud Storage Market Include:

Some of the major players operating in the cloud storage market report are Amazon Web Services, Inc., Microsoft, IBM, Oracle, MongoDB, Inc., Rohde & Schwarz, Hewlett-Packard, Dell, Atlantic.Net, VMware, Cisco Systems, Inc., DataDirect Networks, Swisslog Holding AG, Mecalux, S.A., KNAPP AG, Dematic, BEUMER GROUP, Bastian Solutions, Inc., TGW Logistics Group, Fritz Schäfer GmbH, Kardex Group, Daifuku Co., Ltd, Nilkamal, Murata Machinery, Ltd., and Verizon Terremark, among others.

Key Market Segments:

Cloud Storage Market, By Region:

New Business Strategies, Challenges & Policies are mentioned in the Table of Contents; Request TOC @ https://www.databridgemarketresearch.com/toc/?dbmr=global-cloud-storage-market

Cloud Storage Key Benefits over Global Competitors:

Some of the key questions answered in these Cloud Storage market reports:

Inquire Before Buying This Research Report @ https://www.databridgemarketresearch.com/inquire-before-buying/?dbmr=global-cloud-storage-market

About Data Bridge Market Research, Private Ltd

Data Bridge Market Research Pvt Ltd is a multinational management consulting firm with offices in India and Canada. We are an innovative and neoteric market analysis and advisory company with an unmatched durability level and advanced approaches. We are committed to uncovering the best consumer prospects and fostering useful knowledge for your company to succeed in the market.

Data Bridge Market Research is a result of sheer wisdom and practice that was conceived and built in Pune in the year 2015. The company came into existence from the healthcare department with far fewer employees, intending to cover the whole market while providing the best-in-class analysis. Later, the company widened its departments as well as expanded its reach by opening a new office in the Gurugram location in the year 2018, where a team of highly qualified personnel joined hands for the growth of the company. Even in the tough times of COVID-19, when the virus slowed down everything around the world, the dedicated team of Data Bridge Market Research worked round the clock to provide quality and support to our client base, which also speaks to the excellence up our sleeve.

Data Bridge Market Research has over 500 analysts working in different industries. We have catered to more than 40% of the Fortune 500 companies globally and have a network of more than 5,000 clients around the globe. Our coverage of industries includes

Contact Us

US: +1 888 387 2818
UK: +44 208 089 1725
Hong Kong: +852 8192 7475
Email [emailprotected]


Lori Borgman: Trying to get her head and data in the clouds – Daily Journal


I have a long history of issues with my cloud. The main problem being that I can't wrap my head around it. Some people can't get their heads out of the clouds; I can't get mine in.

I suffer from the trap of the literal mind. I have to picture things. And not just food or sitting on a shoreline.

Once every week or so my phone tells me it failed to back up because there is not enough cloud storage. Then it prompts me to buy a bigger, better cloud. Why would I buy more of something that I can't comprehend now?

They want me to buy something I can't see. What next? A bridge in Jersey? Hey, I wasn't born yesterday.

Seeing is believing.

If I looked up at the sky and saw a cloud floating by with my name on it, or even just my initials on it, I'd be good. I wouldn't even care if it were a cirrus, cumulus, stratus or nimbus, although one of those huge anvil clouds would be cool.

It would also be nice to see whose cloud is next to my cloud and if there is any cloud aggression going on. That way I could yell, "Hey! You! Get off of my cloud!" The Rolling Stones were in cyberspace before cyberspace was cool.

It's the metaphor that is the problem. Yes, I understand that my calendar, documents, photos, emails and many things are in a cloud, but a cloud is puffy. A cloud can evaporate and dissolve into nothingness. Why would I want to store my life in something wispy? A vault or a safe room, maybe; a cloud, no.

I would do better if the message on my phone said, "Your reinforced steel file cabinet in the sky is full and you need a bigger one, so pay up."

Work stored in a file cabinet is easy to imagine. A file cabinet is tangible, it holds things, lots of things, and you can even lock it.

For example, I know where all my tax records are. I know where my supporting receipts and invoices are. They're upstairs in a two-drawer file cabinet where both drawers are jammed full and completely inaccessible courtesy of a shoe rack.

I may not be able to open the file cabinet, but I know where the file cabinet is. And that's why the cloud wins. I may not know where my cloud is, but I can access its contents, which I understand are stored on a giant server called a Lexus. Or a Linux. Again, a Lexus I can picture, a Linux I cannot.

I'd be happy with an arrow on a map of the sky marking the Lexus holding my large file cabinet that says, "You are here."

That I can visualize.


Qumulo opens the door to Kubernetes with CSI Blocks and Files – Blocks and Files

Scale-out file system supplier Qumulo has made its file services available to Kubernetes-orchestrated containers via a CSI driver plug-in.

CSI, the Container Storage Interface, enables stateless containers orchestrated by Kubernetes to request and consume storage services such as volume provisioning, data reads and writes, and protection from external storage. They effectively support statefulness. Qumulo's Core product provides scale-out file services and runs on its own on-premises appliances, on third-party servers in its Server Q form, and also, in its Cloud Q incarnation, in the public clouds AWS, Azure, and GCP.

Sean Whalen, Qumulo senior cloud product marketing manager, wrote in a blog: "Now, customers innovating using Kubernetes don't have to set up a storage interface each time a cluster is set up or knocked down; the process is automatic and provides the containerized application maximum exposure to the Qumulo analytics so that customers can easily understand what's happening across their stored data."

The CSI driver is Qumulo production preview software and provides exposure to Qumulo analytics for containerized applications so that customers can understand what's happening across their stored data.

Kubernetes operates a cluster of machines, starting and stopping containers on behalf of its users. CSI allows the Kubernetes orchestrator and individual containers to connect to external (persistent) storage. Qumulo storage will automatically deploy inside a new container and supports the movement of storage from container to container and machine to machine.
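In practice, a containerized application consumes a CSI driver by claiming a volume against a StorageClass that the driver provisions. The sketch below uses the official Kubernetes Python client; the storage class name "qumulo" and the claim details are placeholders, since the real names come from the driver's installation manifests.

```python
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() inside a pod

# Request a shared file volume from a CSI-provisioned StorageClass.
pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="demo-claim"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteMany"],      # shared file access
        storage_class_name="qumulo",         # placeholder class name
        resources=client.V1ResourceRequirements(
            requests={"storage": "100Gi"}
        ),
    ),
)
client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```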

Ben Gitenstein, VP of product at Qumulo, said: "Qumulo's new CSI driver enables customers to store unstructured data once but serve it to an infinite number of both native applications and container-based microservices, all without moving data, copying it to disparate systems, or changing their workloads. Customers who store their data on Qumulo can now focus their time on building modern applications, not on moving or managing their data."

Qumulo is not alone here. CSI driver support is table stakes for external storage suppliers. Dell's PowerScale/Isilon already supports CSI, as do HPE's Primera and Alletra products, IBM's FlashSystem, NetApp's ONTAP software, Pure Storage, and Weka with its scale-out, parallel filesystem software.

Beyond CSI, external storage software can be made into a container itself. Examples are Pure's Portworx, MayaData's OpenEBS Mayastor product, Ondat (the rebranded StorageOS), and Robin.io's cloud-native storage. These storage containers execute inside a server's environment and link to the server's own physical storage or to external storage.

StorageOS, for example, aggregates the local disk storage in a cluster of servers (nodes) into one or more virtual block storage pools. Storage in the pool is carved out into virtual volumes, and app containers on the nodes mount and access these virtual volumes via the storage container.

When executing in the public clouds, they would use the CSPs' storage services. Either on-premises or in the public clouds, Kubernetes will be used to orchestrate and manage the storage containers as well as the application containers for DevOps users.

A storage container runs like any other app container, with no dependencies on proprietary kernels, hardware, storage protocols or other layered services; customers are freed from lock-in to these things. In theory, a storage container should respond more quickly to app container requests for storage services, as the link is direct rather than hopping across network links to an external storage system. The storage container should also scale out beyond the limit of, for example, a dual-controller array.

Storage consultant Chris Evans has said: "I doubt any storage array could cope with creating and destroying hundreds of volumes per hour (or more), whereas on (Ondat) StorageOS, those constructs are mainly in software on the same node as the application, so can be created/destroyed in milliseconds."

It seems possible that there will be a phase 2 in Qumulo's support of containerization, with its Core software eventually becoming cloud-native itself.


Cloud Native Backup as a Service (BaaS) How Do We Get It Right? – thenewstack.io

Subbiah Sundaram

Subbiah Sundaram spearheads product management at HYCU. He has been instrumental in enabling the company to deliver HYCU Protégé along with best-in-class multicloud solutions for both on-premises and public cloud environments. Prior to joining HYCU, Subbiah held senior executive positions at BMC, CA, DataGravity, EMC, NetApp and Veritas, and has extensive experience in product development, planning and strategy. He holds an M.S. in computer engineering from the University of Iowa and an MBA from the Kellogg School of Management at Northwestern University.

Do you have data applications that are distributed across multiple clouds? Is your multicloud strategy requiring you to rethink your approach to data protection? Are you currently in the process of evaluating Backup as a Service (BaaS) providers? Will you be investing in multicloud data apps in 2022?

If you answered yes to any of these questions, then you need to read on.

As IT data centers migrate to a hybrid cloud infrastructure, companies are having to take a hard look in the mirror and identify how they approach, manage and modernize their multicloud data protection strategy. And with so much evolution taking place within the technology space, the need to restructure your digital transformation strategy with a BaaS solution should not be considered a luxury, but a necessity.

Managing data protection in a multicloud environment is uniquely different than in a traditional, on-premises data center model. This makes the importance of choosing the most effective BaaS solution over manual deployment options paramount. BaaS offers the flexibility and agility that makes the journey to the cloud easier. To support this, the solution needs to support all elements within the infrastructure. That means being able to support both on-prem clouds and public clouds. That includes VMs, applications, buckets, containers and Kubernetes. That way, organizations will be ready to accommodate any and all backup scenarios that the business may require.

The bottom line is that cloud native BaaS is critical to any successful digital transformation strategy. Yet, questions still remain.

There are a number of ways to get BaaS right to help ensure the most effective data protection with the least risk of surprises. Let's take a look at the nine steps I recommend for ensuring your organization is getting BaaS right.

Deploying backup and recovery to the cloud can be risky. As your demands change, so do the capabilities and shortcomings of each cloud you are using.

Any true cloud native BaaS solution should be both simple and agile to use, with the ability to turn off and on as your needs change. Isn't that the reason you moved to the cloud in the first place?

As your data protection infrastructure needs continue to evolve, how will you adapt? You have two options:

One, you can waste valuable time, talent and resources by having your in-house team constantly updating your backup infrastructure. Or two, you can offload the task to someone else by adding a Backup as a Service (BaaS) solution to your digital stack.

Having the data protection capabilities on every cloud without the sizing and resizing headaches is a necessity. Any BaaS solution should be able to eliminate the nuances and complexities of sizing exercises and adapt to your changing needs both effortlessly and seamlessly.

When it comes to setting up your backups, you can take a few different approaches.

One, you can endure the time-consuming task of manually setting up agents/connectors, backup configurations, backup jobs and backup targets. Or two, implement automated, policy-based backups by using BaaS.

By moving to the cloud, your goal was to eliminate or significantly reduce the burden on your IT department, correct?

In an ideal world, BaaS should provide one-click backups based on flexible policies. We like to call it the "set and forget" approach.

Multicloud environments are profoundly different than static on-premises environments. Protecting the critical applications and data that live there is as well.

Application consistency and the ability to discover new applications automatically should be at the forefront of any BaaS solution under consideration or already implemented. Manual tasks such as configuring backups or assigning backup policies should be a thing of the past.

Make sure your backup solution provides both automated and default assignments.

How important is business continuity and resiliency across your organization?

Having to perform VM-level backups and manual data recovery tasks is a tedious and cumbersome process, causing logistical roadblocks and disruptions company-wide. Ideally, end users should be able to recover their data themselves and take the pressure off the busy admins.

The demand is high, and companies are needing their data restored NOW! Having an easy-to-use, fully automated BaaS solution will speed up the restoration process and slow down the pressure level.

Additionally, your BaaS solution should offer the granularity to recover ALL your applications, databases, files and folders.

If you are running multiple infrastructures for development, testing, analytics, forensics and so on, having an efficient BaaS solution to create application-consistent (cloned copies) of your production environment is critical.

Not only does the cloning functionality reduce time, but it delivers real value by adding the extra flexibility and level of granularity needed to clone applications, VMs, Kubernetes clusters or containers.

The ability to migrate workloads from one on-prem infrastructure to another, from on-prem to public cloud, or from one public cloud to another (cross-cloud migration) should be an easy one-click process. If not, and you are still relying on big service engagements to move workloads, then you've lost the speed and agility advantages of the cloud.

What about disaster recovery (DR)? Do you have DR functionality for all your workloads or just for mission-critical, tier-one workloads? Are you being limited due to cost constraints and/or budget restrictions?

The only way to deliver cost-efficient DR is to do it intelligently. A smart DR solution will store your backed-up copy but only use compute resources in the public cloud when you are in a DR situation. Traditional DR software, by contrast, is designed to duplicate a full environment on the DR site.

Make sure your BaaS solution offers the speed, agility and intelligence functionalities for both data migration and disaster recovery (DR).

What if you could control costs, limit budget constraints and meet the growing organizational needs of your company without having to expand or burden your IT team?

It's a no-brainer, right? When looking for a backup and recovery solution, it must support self-service. This allows end users to restore their own files without having to rely on IT. Think about how ATMs transformed financial services by increasing customer convenience and reducing overhead costs for banks. Your backup solution should deliver that same advantage.

Additionally, we must consider the importance of multitenancy. Having a BaaS solution that supports multitenancy out of the box helps support organizational scaling.

It always comes down to the power of the mighty dollar. When you are running multicloud environments, assessing the cost efficiency of data protection is critical.

It is important to only pay for what you need. When evaluating a BaaS solution, make sure the pricing scales with your usage. A smart BaaS solution recognizes the characteristics and functionalities of each cloud. This allows you to optimize your backup strategy while minimizing costs. Make sure your BaaS solution is intelligent enough to use the right kind of storage in the right way. Cloud vendors tend to offer a range of storage options to meet a variety of needs. You will want the ability to leverage available cloud storage economics.
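The shape of the trade-off is easy to see with a little arithmetic. The per-GB prices below are made-up assumptions chosen only to illustrate the gap between tiers, not real vendor rates.

```python
DATA_GB = 10_000   # 10 TB of backup data

# Hypothetical per-GB-month prices for three storage tiers.
TIERS = {"hot (frequent access)": 0.020,
         "cool (infrequent access)": 0.010,
         "archive (rare access)": 0.002}

for tier, price_per_gb in TIERS.items():
    print(f"{tier}: ${DATA_GB * price_per_gb:,.2f}/month")
```

Data restored rarely but retained for a long time belongs in the cheaper tiers, which is exactly the kind of placement a smart BaaS solution should make automatically.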

So, when evaluating a BaaS solution, make sure it:

Without sounding like a cliché, Backup as a Service is an approach that just makes sense. Done correctly, it can provide the critical data protection that organizations cannot live without while minimizing cost and maintenance roadblocks.



Save up to £200 on HP Envy x360 and get free Adobe subscription – Tech Advisor

If you're in the market for a new laptop, HP is likely to be one of the main brands you're considering. With a long and illustrious history of making top-class computers, it's easy to see why.

With that in mind, now is a great time to buy a new HP laptop in the UK. The company's spring sale has discounts on many of its leading devices, including the Envy x360 convertible. An Intel Core i5 model with 8GB of RAM and a large 512GB SSD is now down to just £799.99, a £199 saving on the usual asking price.

Get this Envy x360 deal on the HP website now

If saving almost 20% on the RRP wasn't enough, HP is also offering free 12-month subscriptions to some of the leading Adobe Creative Cloud apps. That's a full year of Premiere Pro or the Photography plan, which includes Lightroom, Lightroom Classic, Photoshop and 20GB of cloud storage. This is an additional saving of up to £239, representing superb value for money.

All these apps will run smoothly on the i5 Envy x360, but the same can be said for the AMD Ryzen 7 configuration: it's down to just under £850 with a £150 discount. The Ryzen 5 variant is the cheapest of the lot, with a £69 saving dropping the price under £730.

But this excellent offer also applies to HP's high-end Spectre x360, as well as the traditional Envy 13, Envy 15 and Envy 17 clamshell laptops. Check out the full range on the HP website.

All orders are available with free UK delivery, but we don't know how long this deal will last. If you need a new laptop and usually pay for Photoshop, Lightroom or Premiere Pro, now is the perfect time to take the plunge.

Get the HP and Adobe app bundle with any of these laptops
