
Secure 2TBs of cloud storage space for life for under $50 – TNW

Just take a quick look at the state of your computer desktop. If you're one of those users with random pics, videos, attachments and other assorted file detritus littered across your work surface from corner to corner, you should probably consider a serious digital de-cluttering.

Rather than undertaking the arduous process of deciding whether to keep or toss each individual item, go the simple route and just push it all to the cloud with 2TB of Zoolz Cloud Storage space. Right now, you can get that space for life for just $49.99 (over 90 percent off) from TNW Deals.

Zoolz was named the No. 1 Best Business Cloud Storage Service by TopTenReviews because it's a simple, efficient means of clearing out files you don't need to revisit very often. Just bundle up all those old files and zap 'em onto Zoolz's encrypted cold storage servers. Your content will be fully protected and fully accessible within a few hours if you ever need to haul that info back out of the digital deep freeze.

Beyond simple file caretaking, Zoolz goes one better by offering a host of added features with your subscription, including automatic backup scheduling, bandwidth throttling, icon overlays, file retention and more. You can even hook two computers up to your storage plan to keep both systems tidy and running smoothly.

Normally, 2TB of lifetime Zoolz storage would run over $3,600, so get in on the limited time offer to get Zoolz for just $49.99.

Get this deal

Read next: Pokemon Go players are furiously protesting Niantic's new exclusive raids

See more here:
Secure 2TBs of cloud storage space for life for under $50 - TNW

Read More..

Results: do you pay for extra cloud storage? How much do you need? – Phone Arena

We live in the era of mobile computing, or even the post-PC era if you will. We access our data from a lot of different devices, and at the same time we need to micromanage the limited storage our phones provide. Of course, the cloud is a great way to do that: simple, easy, and integrated with most popular apps. It makes sharing easier, helps you store your endless stream of photos and videos, and a lot more. The cloud is cool, sure... but not free.

Yeah, Dropbox, OneDrive, Drive, and even iCloud, as well as others, will start you off with some room for free, ranging from 5 GB to 15 GB, or more if you snatch a promotion. But more often than not, you might find that's just not enough. Then you click the upgrade-storage option and it's time to choose: How much space do you need? How much money are you willing to pay monthly for it? We asked; here's how you answered!

PhoneArena poll: 2,220 votes

Follow this link:
Results: do you pay for extra cloud storage? How much do you need? - Phone Arena

Read More..

XenData Debuts CX-10 Hybrid Cloud Storage Appliance – TV Technology

Includes 10TB of on-premises RAID and manages Microsoft Azure blob storage August 16, 2017

WALNUT CREEK, CALIF.: XenData is providing customers with a storage system that manages files both on-premises and in the cloud with the launch of its CX-10 Appliance. The hybrid file appliance manages a single system across two storage tiers: 10TB of on-premises RAID and unlimited Microsoft Azure blob storage.

The CX-10 features configurable RAID retention policies that determine what files are written to the cloud, local RAID or both, allowing users to set policy rules for different types of files and folders. RAID retention policies keep frequently accessed files on local storage. When a project is completed, retention policies may be updated to replace the project files held on disk with sparse files, freeing up local disk storage while maintaining immediate access to the content from the cloud.
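Pattern-based placement rules of this kind are easy to picture in code. The sketch below is ours, not XenData's (the rule format, names and tiers are illustrative assumptions); it only shows how per-folder and per-extension rules could route a file to local RAID, the cloud, or both:

```python
from pathlib import PurePosixPath

# Hypothetical retention rules (ours, not XenData's): each rule maps a
# filename pattern to a destination tier; first match wins.
RULES = [
    {"pattern": "*.mov", "tier": "cloud+raid"},  # keep hot media on both tiers
    {"pattern": "*.log", "tier": "cloud"},       # archive logs to cloud only
]
DEFAULT_TIER = "raid"  # everything else stays on local RAID

def placement(path: str) -> str:
    """Return the storage tier a file would be written to under RULES."""
    p = PurePosixPath(path)
    for rule in RULES:
        if p.match(rule["pattern"]):
            return rule["tier"]
    return DEFAULT_TIER

assert placement("/projects/promo/edit.mov") == "cloud+raid"
assert placement("/var/ingest/transfer.log") == "cloud"
assert placement("/projects/promo/notes.txt") == "raid"
```

Completing a project would then amount to changing the matching rule's tier to "cloud" and letting the appliance replace the local copies with sparse stubs.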

The CX-10 runs XenData's Cloud File Gateway software on Windows 10. The system is presented as a logical drive letter that can be accessed locally, or as one or more Windows network shares. It also includes certified integration options for a range of media applications and supports partial file restore. Additional features include scalability, file sharing and security, and internet bandwidth optimization.

The 1U rack mount unit comes with two mirrored enterprise class 10TB disks for the on-premises storage and a solid-state boot volume and two USB 3.0 ports. The base model has two 1 GbE network ports, with options available for additional 10 GbE ports.

XenData is offering the CX-10 for $6,950, including 12 months of onsite support and subscription for the XenData Cloud File Gateway. XenData will showcase the CX-10 at IBC 2017.

Read this article:
XenData Debuts CX-10 Hybrid Cloud Storage Appliance - TV Technology

Read More..

Datrium Splits Provisioning to Boost Private Cloud Storage – SDxCentral

Datrium added new scaling functionality to its DVX 3.0 private cloud infrastructure platform, designed to better match storage capacity to user needs in private cloud deployments.

The Split Provisioning feature separates the scaling of host storage speed from persistent capacity. The software supports scaling from one to 128 compute nodes, with input/output (IO) bandwidth up to 200 Gb/s, and up to 10 data nodes with capacity up to 1.7 petabytes in a single system.

Datrium's previous product supported just a single data node.

The compute nodes run all workloads on local commodity flash, which results in lower latency. Each compute node writes persistent data to network-attached, managed data nodes. Because the compute nodes are stateless, data availability is not impacted should some of them fail.

Datrium DVX version 3.0 split provisioning architecture. (PRNewsfoto/Datrium)

"Datrium's premise has always been that their open convergence method delivers better overall scalability, along with independent scalability of performance and capacity, compared to hyperconverged architectures; in essence, private clouds with virtually limitless performance and scale with the ease of public clouds," said Arun Taneja, founder and consulting analyst at Taneja Group, in a statement.

Datrium's open converged infrastructure (OCI) differs from hyperconverged infrastructure (HCI) systems, which can maintain persistent data on every server, according to the company. HCI systems are also limited to at most three simultaneous server failures before a data outage.

These limitations require HCI users to run smaller clusters of between eight and 16 nodes, according to Gartner. To support larger data storage needs, companies would need to deploy multiple clusters, which requires increased configuration and management.

"Performance is on compute nodes with flash, and persistence is erasure coded across a cost-optimized object store pool (data nodes), as nature intended," said Datrium CEO Brian Biles. "So it doesn't suffer HCI's erasure coding performance problems for hot data."

Pricing for such capabilities is set at $12,000 per compute node.

Datrium last month began offering new storage support for various open source platforms. This added Red Hat Enterprise Linux (RHEL) servers and kernel-based virtual machines (KVM) to its core software platform, which already supported VMware servers and vSphere virtual machines.

The additional support garnered applause from analysts, who noted benefits for enterprises being able to manage multiple environments from a single source.

Mike Matchett, senior analyst at Taneja Group, said the Datrium approach is a good option for enterprises to consider in order to accelerate and open up existing compute clusters.

"It represents a safe, forward-looking investment even if you are focused on just VMware today," Matchett said. "Given that storage refresh cycles are traditionally three to five years, it's a big risk these days to not build some agility and adaptability into a current refresh."

Matchett did note the firm would have to show it can scale to the needs of larger enterprise customers.

"Datrium will need to prove out large-scale scalability and protection claims, but significant performance improvement with server-side flash is almost assured," Matchett said.

Datrium, which launched in 2012, late last year scored $55 million in Series C funding, boosting its capital haul to $110 million since its founding.

See the original post:
Datrium Splits Provisioning to Boost Private Cloud Storage - SDxCentral

Read More..

Migrating from server to cloud storage – THE BUSINESS TIMES

FOR the past several years, having a cloud computing strategy has become de rigueur for most companies, ranging from large enterprises with a global presence to tiny startups struggling to establish themselves. Cloud computing is in the process of becoming integral to enterprise IT set-ups, just as ERP (enterprise resource planning) and other productivity tools are.

Even though the term "cloud computing" is used in an overarching sense to capture any activity associated with the "cloud", in actuality there are several important differentiations within cloud computing. While two companies may both be using cloud services, there may be no similarity or relation between the kinds of services they use the cloud for.

To understand how this works, it helps to start with an overarching definition of what constitutes cloud computing.

To put it simply, cloud computing is delivering services like storage, networking, software and server capacity over the Internet. Instead of companies buying their own servers and computers, they rent the services from specialised service providers which provide them through a secure connection over the Internet.

Speed, the ability to scale elastically, performance and productivity are all gains that could be realised if a corporation switches its operations to the cloud.

A company, or even an individual, may already be using the cloud without being aware of it. Storing a document, a photo or a song on an online storage service like Dropbox is already "cloud computing".

In the Asia-Pacific region, businesses are constantly looking for ways to deliver digital capabilities and offerings to serve ever-growing numbers of smartphone- and tablet-toting customers, as computing goes increasingly mobile. Adopting a mobile-first, cloud-first approach helps companies differentiate themselves from competitors. Cloud computing enables businesses to build their digital offerings rapidly and deploy them quickly.

A case study highlighted by market research company Forrester showcased how important cloud services are to modern companies. An Indian e-commerce startup chose to host its enterprise resource planning (ERP) application on a public cloud infrastructure provider in Singapore. The company needed speed and flexibility in managing, among other things, its logistics, online orders and digital payments, and cloud services provided that agility. The company eventually benefited from cloud services by getting its new products to market faster and managing its warehousing, logistics, online orders and digital payments far more efficiently.

US$236 billion market

In a market report, Forrester says the global cloud security market is estimated to reach US$3.5 billion within the next four years. Meanwhile, in less than half a decade, the company predicts the public cloud services market will reach US$236 billion, growing 30 per cent year-on-year over the next three years.

Cloud usage by governments and businesses in the Asia-Pacific region is forecast to grow by 3.3 per cent in 2017 and 5.7 per cent in 2018, according to Forrester. Compared to the US and Europe, software and technology consulting services have grown little in the Asia-Pacific, as countries in the region are still getting up to speed in assembling the hardware infrastructure required to run complex cloud servers.

Despite the fledgling progress in the Asia-Pacific, cloud usage growth in the region is projected to be strong and constant throughout 2018 and beyond. India's tech spending will maintain the highest growth rate in the Asia-Pacific, according to Forrester. A strong economy, combined with government-led initiatives, will boost investment in the software, services and outsourcing segments.

Closer to home, Singapore is investing heavily in infrastructure to support its Smart Nation initiative. The country is chasing productivity gains by deploying robotics and automation at industry level, helping the software and tech outsourcing services to grow to support these initiatives.

In constant currency terms, software and services spending will rise by 7 to 8 per cent in the Asia-Pacific region, with business process apps seen as the largest and fastest-growing software category, according to Forrester. Hardware spending will expand by 4-6 per cent, but telecom and hardware maintenance spending will see modest or even flat growth as spending is focused on other portions of the business, like cloud infrastructure.

Cloud computing is growing as businesses transform their operations to become nimbler and more responsive while still keeping costs down. It should no longer be a secondary consideration for a business looking to deepen customer experience (CX) bonds - it should drive them instead.

To enable this digital transition, however, a lot more is required than just buying a roomful of servers, plonking them somewhere and expecting them to work. Companies and organisations require a robust underlying IT infrastructure, according to an IDC research report. Having strong foundations provides the "necessary flexible and agile yet resilient and secure services structure, that supports new and existing workloads seamlessly", the report said.

In the Asia-Pacific, IDC has made several predictions with regard to the future of cloud deployment across companies, and how the landscape of cloud providers will shape up to be, compared to the present. Some of them are:

What this means: Increased data consumption and the need to integrate multiple datasets will drive companies to optimise and streamline their APIs so as to modernise their applications and more crucially, increase security across their offerings.

What this means: This is an important point to note. Companies in the near future will put less emphasis on building their own aforementioned giant server rooms, and instead focus on a "cloud-first" approach, working with their IaaS/PaaS providers to expand globally without sacrificing scalability. IDC notes that Asia-Pacific companies have yet to catch on to this trend, with just over 51 per cent of budgets currently going to traditional IT deployment, but IDC expects this to change within the next two years as competition heats up and companies vie for a differentiating factor.

What this means: Asia-Pacific organisations will see greater cloud adoption, with spending on cloud set to triple.

The future of cloud

Due to the Asia-Pacific's heterogeneous nature, countries are adopting cloud computing at different paces, says IDC's report. Regulations, lack of infrastructure and a pressing need for relevant IT skills all hamper widescale cloud deployment and adoption, according to IDC.

In the near future, large countries like China and India are projected to spend the largest amounts on tech in the region. China's tech market will continue its shift from hardware to software and services, says the Forrester research. China's tech market spending is forecast to grow at 7 per cent in yuan terms; adoption of cloud computing will reduce the need for capital expenditure, and business technology will ramp up significantly as China's traditional industries learn from media giants like Tencent, Alibaba and Baidu. Chinese firms will increasingly rely on big data and the Internet of Things (IoT), and technology service partners will continue to reap the benefits of increased tech adoption, says Forrester.

Cloud computing should feature heavily in any enterprise's goal of international expansion. Reaching new markets demands the flexibility and agility to react to changing environments, and cloud computing is a viable way to achieve that. Companies can leverage its security and scalability to keep their businesses humming in these uncertain times, where the only constant is change.

Read more:
Migrating from server to cloud storage - THE BUSINESS TIMES

Read More..

Machine learning tackles quantum error correction – Phys.Org

The neural decoder architecture. Credit: Torlai et al. ©2017 American Physical Society

(Phys.org) Physicists have applied the ability of machine learning algorithms to learn from experience to one of the biggest challenges currently facing quantum computing: quantum error correction, which is used to design noise-tolerant quantum computing protocols. In a new study, they have demonstrated that a type of neural network called a Boltzmann machine can be trained to model the errors in a quantum computing protocol and then devise and implement the best method for correcting the errors.

The physicists, Giacomo Torlai and Roger G. Melko at the University of Waterloo and the Perimeter Institute for Theoretical Physics, have published a paper on the new machine learning algorithm in a recent issue of Physical Review Letters.

"The idea behind neural decoding is to circumvent the process of constructing a decoding algorithm for a specific code realization (given some approximations on the noise), and let a neural network learn how to perform the recovery directly from raw data, obtained by simple measurements on the code," Torlai told Phys.org. "With the recent advances in quantum technologies and a wave of quantum devices becoming available in the near term, neural decoders will be able to accommodate the different architectures, as well as different noise sources."

As the researchers explain, a Boltzmann machine is one of the simplest kinds of stochastic artificial neural networks, and it can be used to analyze a wide variety of data. Neural networks typically extract features and patterns from raw data, which in this case is a data set containing the possible errors that can afflict quantum states.

Once the new algorithm, which the physicists call a neural decoder, is trained on this data, it is able to construct an accurate model of the probability distribution of the errors. With this information, the neural decoder can generate the appropriate error chains that can then be used to recover the correct quantum states.
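The authors' Boltzmann-machine decoder is too involved for a short example, but the underlying idea (learn the error distribution from raw sampled data, then apply the most likely correction) can be sketched with a toy lookup-table decoder for the 3-qubit bit-flip repetition code. Everything below (the code choice, the noise model, the flip probability) is our illustrative assumption, not the paper's model:

```python
import random
from collections import Counter, defaultdict

P_FLIP = 0.1  # assumed independent bit-flip probability per qubit

def sample_error(rng):
    """Draw a random bit-flip error pattern on 3 qubits."""
    return tuple(1 if rng.random() < P_FLIP else 0 for _ in range(3))

def syndrome(err):
    """Parity checks Z0Z1 and Z1Z2: all the decoder is allowed to see."""
    return (err[0] ^ err[1], err[1] ^ err[2])

def train(n_samples, seed=0):
    """Learn a decoder purely from (syndrome, error) samples."""
    rng = random.Random(seed)
    counts = defaultdict(Counter)
    for _ in range(n_samples):
        e = sample_error(rng)
        counts[syndrome(e)][e] += 1
    # map each syndrome to its most frequently observed error
    return {s: c.most_common(1)[0][0] for s, c in counts.items()}

def decode(decoder, err):
    """Apply the learned correction for the observed syndrome."""
    correction = decoder[syndrome(err)]
    return tuple(a ^ b for a, b in zip(err, correction))

decoder = train(50_000)
# any single bit flip is corrected back to the all-zero codeword
assert decode(decoder, (0, 1, 0)) == (0, 0, 0)
```

A Boltzmann machine replaces this exhaustive frequency table with a compact learned model of the same distribution, which is what lets the approach scale to codes where enumerating syndromes is infeasible.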

The researchers tested the neural decoder on quantum topological codes that are commonly used in quantum computing, and demonstrated that the algorithm is relatively simple to implement. Another advantage of the new algorithm is that it does not depend on the specific geometry, structure, or dimension of the data, which allows it to be generalized to a wide variety of problems.

In the future, the physicists plan to explore different ways to improve the algorithm's performance, such as by stacking multiple Boltzmann machines on top of one another to build a network with a deeper structure. The researchers also plan to apply the neural decoder to more complex, realistic codes.

"So far, neural decoders have been tested on simple codes typically used for benchmarks," Torlai said. "A first direction would be to perform error correction on codes for which an efficient decoder is yet to be found, for instance Low Density Parity Check codes. On the long term I believe neural decoding will play an important role when dealing with larger quantum systems (hundreds of qubits). The ability to compress high-dimensional objects into low-dimensional representations, from which stems the success of machine learning, will allow to faithfully capture the complex distribution relating the errors arising in the system with the measurements outcomes."

Explore further: Blind quantum computing for everyone

More information: Giacomo Torlai and Roger G. Melko. "Neural Decoder for Topological Codes." Physical Review Letters. DOI: 10.1103/PhysRevLett.119.030501. Also at arXiv:1610.04238 [quant-ph]

© 2017 Phys.org

See original here:
Machine learning tackles quantum error correction - Phys.Org

Read More..

Quantum Internet Is 13 Years Away. Wait, What’s Quantum Internet? – WIRED

A year ago this week, Chinese physicists launched the world's first quantum satellite. Unlike the dishes that deliver your Howard Stern and cricket tournaments, this 1,400-pound behemoth doesn't beam radio waves. Instead, the physicists designed it to send and receive bits of information encoded in delicate photons of infrared light. It's a test of a budding technology known as quantum communications, which experts say could be far more secure than any existing info relay system.

They've kept the satellite busy. This summer, the group has published several papers in Science and Nature in which they sent so-called entangled photons between the satellite, nicknamed Micius after an ancient Chinese philosopher, and multiple ground stations. If quantum communications were like mailing a letter, entangled photons are kind of like the envelope: They carry the message and keep it secure. Jian-Wei Pan of the University of Science and Technology of China, who leads the research on the satellite, has said that he wants to launch more quantum satellites in the next five years. By 2030, he's hoping that quantum communications will span multiple countries. In 13 years, you can expect quantum internet.

Which means what, exactly? In the simplest terms, it will involve multiple parties pinging information at each other in the form of quantum signals, but experts haven't really figured out what it will do beyond that. "Quantum internet is a vague term," says physicist Thomas Jennewein of the University of Waterloo. "People, including myself, like to use it. However, there's no real definition of what it means."

That's because so much of the technology is still in its infancy. Physicists still can't control and manipulate quantum signals very well. Pan's quantum satellite may have been able to send and receive signals, but it can't really store quantum information; the best quantum memories can only preserve information for less than an hour. And researchers still don't know what material makes the best quantum memory.

They also aren't sure how they'd transmit signals efficiently between the nodes of the future quantum web. Blanketing Earth in quantum satellites is expensive; Pan's cost $100 million. Ground-based transmission via optical fiber isn't perfect either: Quantum signals die out after about 60 miles of transmission, and they can't be amplified like an electronic signal. So researchers are developing special devices known as quantum repeaters that can relay signals over long distances.

That research will take time. Even if Pan gets his international network up and running by 2030, it's not like it'll be handling your social media feed by then. And maybe we wouldn't want it to, either. Just because something is quantum doesn't mean it's automatically better, says physicist Kai-Mei Fu of the University of Washington. "In many cases, it doesn't make a lot of sense to communicate quantum mechanically," she says. Quantum signals have weird properties like superposition, where a particle's location is a probability distribution and it has no precise position. Most communication between humans would still be far easier to express by encoding regular old 1s and 0s in blips of electricity.

So what's the point of it? In the near future, the quantum internet could be a specialized branch of the regular internet. Research groups all over the world are currently developing chips that might allow a classical computer to connect to a quantum network. People would use classical computing most of the time and hook up to the quantum network only for specific tasks.

For example, says physicist Renato Renner of ETH Zurich, you might connect a classical personal computer to a quantum network to send a message using quantum cryptography, arguably the most mature quantum technology. In quantum cryptography, a sender uses a cryptographic key encoded in a quantum signal to encrypt a message. According to the laws of quantum mechanics, if someone tried to intercept the key, they would destroy it.
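Why interception is detectable can be illustrated with a toy, purely classical simulation in the spirit of the BB84 key-distribution protocol. This sketch is ours, not from the article, and it glosses over the real photonics: "measuring in the wrong basis" is modeled as getting a uniformly random bit. An intercept-and-resend eavesdropper then disturbs roughly a quarter of the sifted key, which the legitimate parties can spot by comparing a sample of their bits:

```python
import random

def run_bb84(n, eavesdrop, seed=0):
    """Return the error rate in the sifted key for a toy BB84 run."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]

    def measure(bit, prep_basis, meas_basis):
        # wrong-basis measurement yields a uniformly random outcome
        return bit if prep_basis == meas_basis else rng.randint(0, 1)

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        basis = a_basis
        if eavesdrop:
            eve_basis = rng.randint(0, 1)
            bit = measure(bit, basis, eve_basis)  # Eve measures the photon...
            basis = eve_basis                     # ...and resends in her basis
        bob_bits.append(measure(bit, basis, b_basis))

    # sifting: keep only positions where Alice's and Bob's bases matched
    sifted = [(a, b) for a, b, ab, bb in
              zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]
    return sum(a != b for a, b in sifted) / len(sifted)

# no eavesdropper: the sifted bits agree perfectly
assert run_bb84(2000, eavesdrop=False) == 0.0
# intercept-resend eavesdropper: roughly 25% errors in the sifted key
assert run_bb84(2000, eavesdrop=True) > 0.15
```

The point of the toy model is that the eavesdropper cannot copy the signal without measuring it, and measuring leaves statistical fingerprints.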

The quantum internet could also be useful for potential quantum computing schemes, says Fu. Companies like Google and IBM are developing quantum computers to execute specific algorithms faster than any existing computer. Instead of selling people personal quantum computers, they've proposed putting their quantum computers in the cloud, where users would log in via the internet. While running their computations, users might want to transmit quantum-encrypted information between their personal computer and the cloud-based quantum computer. "Users might not want to send their information classically, where it could be eavesdropped," Fu says.

But it'll take a while, if ever, before a quantum network gets as big or as versatile as our current internet. To get to the point where billions of quantum devices are connected to the same network, where any connected device can talk to any other device, we'd be lucky to see it in our lifetime, Jennewein says.

The incremental progress doesn't bother Renner. He's just excited that these experiments inspire physicists to think about quantum mechanics in new ways. "All these developments will certainly help our understanding of physics," he says. "As a physicist, I want to stress that we are not only application-driven, but also driven by our search for understanding." As consumers, though, we'll be waiting for our new gadgets.

Originally posted here:
Quantum Internet Is 13 Years Away. Wait, What's Quantum Internet? - WIRED

Read More..

Hackers hit dermatology practice through cloud vendor – Health Data Management

A cyber attack that affected a cloud-hosting and service provider resulted in access to patient data of Surgical Dermatology Group in Alabama, a specialty practice that has offices in Birmingham, Montgomery and Huntsville.

Hackers were able to penetrate TekLinks, which provides cloud services to Surgical Dermatology; the cloud provider notified the practice about the intrusion in early June.

"TekLinks has assured us that all unauthorized access was terminated on May 1, 2017, and that monitoring by TekLinks from April 22 through May 1 showed no further malicious activity during that time period," the practice told patients in a notification letter.

Also See: Dermatology practice struck by ransomware attack

Information that was compromised included patient names, addresses, home and mobile telephone numbers, email addresses, Social Security numbers, medical record numbers, physician names, health plans, and charges and payments for services rendered.

Driver's license numbers and financial information were not affected. Surgical Dermatology Group is offering affected individuals one year of credit monitoring and identity theft protection services.

The practice declined to provide additional details about the incident, including the number of affected individuals. If the information of more than 500 individuals was compromised, the practice will be reporting further details of the breach on the website of the HHS Office for Civil Rights.

The rest is here:
Hackers hit dermatology practice through cloud vendor - Health Data Management

Read More..

HostHatch launches new Cloud Servers – 5x faster than the giants, including AWS & DigitalOcean – PR Web (press release)

Benchmark: Up to 5 times faster

Tampa, FL (PRWEB) August 15, 2017

HostHatch, a Cloud SSD VPS provider based in Tampa, FL with operations across Asia, US and Europe, recently announced the general availability of their new KVM-powered Cloud Servers. Using some of the fastest NVMe SSDs in the world, they were able to deliver performance up to 5 times faster than others like AWS and DigitalOcean.

"It sounds like a bold marketing claim, like companies saying 'we are the best in the world', but this is not that. We worked hard for months, running lots of different optimizations and benchmarks on our NVMe-based servers, and were able to create a product that really delivers," said Emil Jönsson, CEO at HostHatch. "It delivers up to five times better performance than the market giants. On our website, we provide transparent proof of the benchmarks we ran so customers can run their own to verify our claim," Jönsson continued.

The new Cloud Servers are already available in Amsterdam, Los Angeles and Stockholm with more locations planned.

Additionally, HostHatch announced the general availability of its new cloud control panel, which comes fully equipped with a simple, easy-to-use interface.

For more information, head over to https://hosthatch.com

Share article on social media or email:

See the original post here:
HostHatch launches new Cloud Servers - 5x faster than the giants, including AWS & DigitalOcean - PR Web (press release)

Read More..

Oracle Exadata Cloud lands on bare-metal servers – Computer Business Review


Big Red promises complete compatibility to ensure a smooth move to the cloud.

Oracle has made its Exadata Cloud available on its next-generation bare-metal compute and storage services.

The announcement means that customers will be able to self-provision multiple bare-metal servers (each able, the company says, to support over four million IOPS) and block storage that scales linearly at 60 IOPS per GB, all running on the same low-latency Virtual Cloud Networks.
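Because the quoted scaling is linear, sizing a volume is simple arithmetic. This tiny helper is ours (not an Oracle API); it just makes the 60-IOPS-per-GB figure concrete:

```python
# Back-of-envelope check of the stated linear scaling: 60 IOPS per GB
IOPS_PER_GB = 60

def block_storage_iops(size_gb: int) -> int:
    """IOPS a block volume of `size_gb` would get under the quoted rule."""
    return size_gb * IOPS_PER_GB

assert block_storage_iops(100) == 6_000      # 100 GB volume
assert block_storage_iops(1_024) == 61_440   # ~1 TB volume
```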

The Oracle Exadata offering, an on-premises and public cloud database platform, has the company singing its praises, with Big Red saying: "These integrated and fully programmable cloud services enhance all stages of application development and deployment through faster connectivity, provisioning, processing, and database access with unmatched technology and industry-leading price performance."

Big Red points to high-demand applications, such as those using real-time targeting and analytics, as perfect use cases for the Exadata Cloud.

"Oracle's next-generation cloud infrastructure is optimized for enterprise workloads and now supports Oracle Exadata, the most powerful database platform," said Kash Iftikhar, vice president of product management at Oracle.

"With the power of Oracle Exadata, customers using our infrastructure are able to bring to the cloud applications never previously possible, without the cost of re-architecture, and achieve incredible performance throughout the stack. From front-end application servers to database and storage, we are optimizing our customers' most critical applications."

One of the big benefits of the product is that it offers complete compatibility with Oracle Databases deployed on-premises. Given that Oracle is keen to move its customers to the cloud, a compatible on-premises-to-cloud offering should mean that a migration will go smoothly.

With Oracle OpenWorld just around the corner, Big Red is likely to continue its aggressive shift towards a more cloud-dominated portfolio, with technologies that'll make cloud migration easier.

Original post:
Oracle Exadata Cloud lands on bare-metal servers - Computer Business Review

Read More..