
Allscripts Healthcare Solutions (NASDAQ:MDRX) Upgraded to Hold by ValuEngine – Slater Sentinel

Allscripts Healthcare Solutions (NASDAQ:MDRX) was upgraded by stock analysts at ValuEngine from a sell rating to a hold rating in a research report issued to clients and investors on Wednesday, December 11th, ValuEngine reports.

A number of other equities analysts have also recently weighed in on MDRX. Cantor Fitzgerald reissued a neutral rating on shares of Allscripts Healthcare Solutions in a research report on Tuesday, November 5th. Jefferies Financial Group began coverage on Allscripts Healthcare Solutions in a research note on Friday, August 23rd. They issued a buy rating and a $13.00 price target on the stock. TheStreet raised shares of Allscripts Healthcare Solutions from a D+ rating to a C- rating in a report on Friday, September 20th. Deutsche Bank initiated coverage on shares of Allscripts Healthcare Solutions in a research report on Thursday, September 26th. They set a hold rating and an $11.00 price target on the stock. Finally, Zacks Investment Research upgraded Allscripts Healthcare Solutions from a hold rating to a buy rating and set an $11.00 price target for the company in a report on Tuesday, December 3rd. Two equities research analysts have rated the stock with a sell rating, five have assigned a hold rating and three have given a buy rating to the company. The stock has a consensus rating of Hold and a consensus price target of $11.63.

MDRX stock traded up $0.12 during midday trading on Wednesday, hitting $9.87. The stock had a trading volume of 80,372 shares, compared to its average volume of 2,466,840. Allscripts Healthcare Solutions has a 12-month low of $8.54 and a 12-month high of $12.40. The company has a debt-to-equity ratio of 0.51, a current ratio of 0.66 and a quick ratio of 0.66. The company's 50-day moving average price is $10.41 and its 200-day moving average price is $10.45. The firm has a market cap of $1.59 billion, a PE ratio of 17.95, a price-to-earnings-growth ratio of 1.90 and a beta of 1.40.

A number of hedge funds have recently added to or reduced their stakes in MDRX. Vanguard Group Inc. lifted its stake in Allscripts Healthcare Solutions by 2.2% during the second quarter. Vanguard Group Inc. now owns 16,523,536 shares of the software maker's stock valued at $192,169,000 after buying an additional 363,005 shares in the last quarter. Fisher Asset Management LLC raised its stake in Allscripts Healthcare Solutions by 3.4% during the 3rd quarter. Fisher Asset Management LLC now owns 4,647,875 shares of the software maker's stock valued at $51,034,000 after purchasing an additional 154,114 shares during the period. Tamarack Advisers LP lifted its holdings in shares of Allscripts Healthcare Solutions by 7.6% in the third quarter. Tamarack Advisers LP now owns 4,225,000 shares of the software maker's stock valued at $46,391,000 after purchasing an additional 300,000 shares in the last quarter. Northern Trust Corp lifted its holdings in shares of Allscripts Healthcare Solutions by 9.7% in the second quarter. Northern Trust Corp now owns 2,906,636 shares of the software maker's stock valued at $33,804,000 after purchasing an additional 256,892 shares in the last quarter. Finally, Paradigm Capital Management Inc. NY boosted its position in shares of Allscripts Healthcare Solutions by 1.9% in the second quarter. Paradigm Capital Management Inc. NY now owns 1,903,700 shares of the software maker's stock worth $22,140,000 after buying an additional 35,400 shares during the period.

About Allscripts Healthcare Solutions

Allscripts Healthcare Solutions, Inc. provides information technology solutions and services to healthcare organizations in the United States, Canada, and internationally. It offers electronic health records (EHR), connectivity, private cloud hosting, outsourcing, analytics, patient engagement, clinical decision support, and population health management solutions.

Featured Story: How accurate is the Rule of 72?
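As a quick aside on that teaser: the Rule of 72 estimates the years needed to double an investment at a given annual return as 72 divided by the rate. A minimal Python sketch (not from the article; just the standard compound-interest math) compares the rule against the exact doubling time, ln(2) / ln(1 + rate/100):

```python
import math

# Rule of 72 vs. exact doubling time under annual compounding.
for rate in (2, 6, 8, 12):
    approx = 72 / rate
    exact = math.log(2) / math.log(1 + rate / 100)
    print(f"{rate}%: rule of 72 = {approx:.2f} yrs, exact = {exact:.2f} yrs")
```

At rates near 8% the rule is accurate to within a few days; it drifts further off at very low or very high rates.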

To view ValuEngine's full report, visit ValuEngine's official website.



The 3 biggest storage advances of the 2010s – ZDNet

I've been looking at storage technology for over 40 years, beginning with choosing mass storage for the original Apple ][ I bought in 1978. $800 for a 140KB floppy, or $50 for a Panasonic cassette deck? Yep, I bought the cassette deck, which taught me the meaning of random access storage.

Every decade since, the pace of change in storage has accelerated, and the 'teens were no exception. Driving that change were three key technologies.

The first time I wrote about flash memory for ZDNet, one reader complained that he was expecting to hear about Adobe Flash, the now obsolete graphics standard. No one makes that mistake today!

Flash was invented in the 1980s by Toshiba, whose storage division was spun off last year as Kioxia. Flash was slow to take off because, as a semiconductor, it took decades to build the volumes that allowed costs to drop. In 1994, I paid $400 for a 10MB CompactFlash card - $40/MB - for my favorite notebook of all time, the HP OmniBook 300.

Flash enabled the iPhone, and replaced 8mm magnetic video tapes - I still have a drawer full - and pretty much killed 35mm film cameras. As the industry invested billions in new fabs, the price has continued to decline, making flash the dominant solid state storage in the world today.

It wasn't until circa 2005 that flash became as cheap as DRAM, and that's when designers woke up to its potential in the data center, where FusionIO's fast but costly PCIe SSDs were popular. But the disastrous floods in Thailand in 2011, which destroyed almost half of the world's hard drive production capacity, forced a spike in HDD prices that suddenly made flash SSDs look relatively affordable.

While Apple led the charge to SSDs, offering them in the first MacBook Air back in 2008, it was the spike in HDD prices that made many people realize that despite their high price per GB, SSDs really improved their system performance. With their widespread adoption in even entry-level machines, the market for client-side HDDs collapsed, with volumes dropping for the last five years.

But fear not, HDDs, like tape, will continue to have a home in data centers for decades to come. They are still significantly cheaper than flash SSDs per GB, which is what will keep the cloud vendors buying them.

The biggest storage story of the teens was the advent of cloud storage, along with cloud infrastructure. Cloud has dramatically reduced the business and technology friction of acquiring and managing IT infrastructure.

Cloud has had knock-on effects in several areas:

- Cloud has taken the wind out of the once-robust storage array market. EMC, once the behemoth of data storage, sold itself to Dell. The remaining independents, especially startups, tend to focus on either direct sales to the cloud giants, or selling cloud-like infrastructure to enterprises.
- Cloud has made storage advances largely proprietary and hidden. When you are operating at 10,000x the scale of major enterprises, employ armies of PhDs, and control your entire stack, there is no reason to share tech discoveries.
- Likewise, suppliers make investments, such as in 100GB Blu-ray discs, that seem unjustified by consumer demand. I call this the shadow IT market.

Cloud has made data infrastructure a utility. Just as we don't know where the power in our homes comes from, neither do we know where much of our computing takes place, especially for mobile devices. Cloud won't replace on-premises systems totally, but it is highly competitive.

As vast as these changes have been, the '20s will dwarf them. More on that later. Happy New Year!

Comments welcome!


Amazon’s second act, Microsoft’s revival and red-hot IPOs highlighted the decade of the cloud – CNBC

Microsoft CEO Satya Nadella and Salesforce CEO Marc Benioff in 2014.

Source: Microsoft

There were a lot of tech trends in the 2010s, from mobile computing and web-delivered content to the technology-powered gig economy. But no story was more powerful and pervasive than the emergence of the cloud.

Dropbox and Slack became household names during the past decade, Salesforce gained enterprise ubiquity, and Microsoft and Adobe revitalized their businesses by shifting from packaged software to cloud-based subscriptions, lifting their stock prices to record highs.

Formerly a side project, Amazon Web Services now generates $35 billion in annual revenue by allowing clients to offload their storage and computing needs to a third party, while ServiceNow, whose technology helps IT managers improve productivity, joined the S&P 500 last month after its market cap topped $50 billion.

In the past, companies, schools and government agencies operated their own data centers and bought expensive licenses to use software on their equipment, adding in hefty maintenance and update fees. The cloud changed all that, turning the applications that employees use every day, as well as all the underlying databases, servers and communications equipment, into services that can be delivered remotely to a host of devices over powerful networks. Customer service was upgraded, with a focus on user feedback, to keep clients from quitting their subscriptions and moving to rivals.

For investors, the paradigm shift presented an opportunity to put money into older companies positioned to make the transition, as well as a whole new crop of start-ups poised to take market share from the legacy providers. Slack, Twilio, Zoom and Okta were all founded in 2008 or later and are each now valued at over $10 billion on the public market. A bunch more are in the $5 billion range, and more still are filling up the IPO pipeline for 2020 and beyond.

Brad Gerstner, founder of Altimeter Capital, counts Salesforce as one of his top holdings and was a venture investor in Okta and Twilio. In a TV interview this month alongside Okta co-founder Frederic Kerrest, Gerstner told CNBC that the big bet has paid off.

"It really comes down to something that we talked about nearly a decade ago," said Gerstner, whose firm oversees more than $5 billion in assets, referring to his initial conversations with Okta. "We have a once in probably our lifetime rearchitecture of the entire enterprise stack into the cloud."

According to Synergy Research, 2019 revenue from enterprise software-as-a-service (SaaS) will exceed $100 billion, up from less than $4 billion in 2009. Adding up all layers of the stack, from the underlying infrastructure to the applications, research firm Gartner says cloud revenue will end the year at $214.3 billion, jumping to $331.2 billion by 2022.

In a $3.7 trillion global IT market with low single-digit expansion, annual cloud growth of greater than 15% is leading investors to bid up the cloud standouts in both the public and private markets.

If you're looking for the poster child of the cloud evolution, you may find it on the outskirts of Seattle.

In 2014, facing sluggish growth and disappointing investor returns, Microsoft turned to Satya Nadella to succeed Steve Ballmer as CEO, the first change at the top in 14 years. Nadella, who had previously run Microsoft's cloud and enterprise group, told employees on day one of his tenure, "Our job is to ensure that Microsoft thrives in a mobile and cloud-first world."

Weeks later Nadella announced that Office apps were coming to Apple iPads, giving customers more flexibility and showing that it was a new day at Microsoft. Office 365, the cloud version of Microsoft's flagship product, was launched in 2011, but the Apple integration was critical in bringing Word, Excel, PowerPoint and SharePoint to people who were choosing a competitor's hardware.

By 2017, commercial revenue for Office 365 had exceeded Office license revenue.

"I think the biggest event of the decade was Microsoft launching Office 365," said Todd McKinnon, CEO and co-founder of Okta who previously spent five years at Salesforce. "It was very clear that the largest software company in the world is saying, 'Cloud is good, Cloud will work, Cloud is sanctioned.' It changed the mindset of the IT industry."

An Okta conference hosts Facebook VP of Platform Partnerships Sean Ryan, Slack VP of Product April Underwood, Okta CEO Todd McKinnon, Box CEO Aaron Levie, Zoom CEO Eric Yuan, and moderator Brad Stone.

Source: Harriet Taylor

McKinnon saw the movement firsthand. His company provides identity management software so businesses can securely control all of the cloud applications that employees are using.

"Companies of every size and every industry that we'd been having conversations with for years came back to us and said, 'This is real, this is happening, we need a real identity story,'" McKinnon said.

Meanwhile, Microsoft was also building Azure, its cloud infrastructure service that would eventually become the clear No. 2 to AWS, attracting as customers large retailers, health-care providers, banks and the U.S. Department of Defense along the way. Microsoft doesn't disclose Azure revenue, but it does report growth, which reached 59% in the third quarter.

Since the end of 2009, Microsoft's stock has jumped 417%, beating the S&P 500's 189% gain. This year it became the third company to reach a $1 trillion market capitalization.

Amazon isn't far behind at $889 billion, as of Monday's close. Much of Amazon's 1,233% stock surge over the last decade can be attributed to AWS, which in the latest quarter accounted for 71% of its parent company's operating income and 13% of revenue. Analysts at Jefferies said in a November report that AWS could be worth about 40% of the company's market cap, and the unit has gotten so big that it's now reportedly attracting antitrust scrutiny.

Salesforce, the company most synonymous with SaaS, has also taken advantage of investments made by the infrastructure players. In 2016, Salesforce said it would use AWS to expand its Sales Cloud and Service Cloud internationally and has since announced plans to use some services from Google and Microsoft's cloud.

While Salesforce is the biggest company that was born in the cloud, Adobe is the largest software maker to transition the majority of its business to the new model. Investors have rewarded the company, pushing the stock up ninefold since the beginning of the decade.

In 2009, subscriptions represented 3% of revenue. Two years later, Adobe introduced Creative Cloud, ushering in monthly and annual plans for access to apps like Photoshop, along with cloud storage. Now, subscriptions account for about 90% of sales, and the company is growing at rates not seen since 1991.

"What we were able to do in terms of moving to this new way of delivering software was unshackle our product teams from the burdens of delivering products every 12 or 18 months and they could deliver at the pace at which they could innovate," CEO Shantanu Narayen said at Adobe's financial analyst meeting in November. "We were able to attract new customers to the platform, we were able to price these products globally differently."

Autodesk was founded in 1982, just like Adobe. It's undertaken a similar endeavor, moving its popular design and architecture software to the cloud. Carl Bass, Autodesk's CEO from 2006 to 2017, said in 2013 that the company "can get pretty close to subscriptions being the vast majority of our business."

He was proven right. In the most recent quarter, subscriptions accounted for 85% of sales, pushing total revenue up 28% from a year earlier. The stock has gained 620% since the end of 2009.

As Microsoft, Adobe and Autodesk were revamping their businesses, new venture-backed SaaS vendors were popping up by the month, unbundling the old software suites with targeted applications and solutions. The attrition rate has been high, but there are notable successes.

Videoconferencing company Zoom, which went public this year, reported revenue growth of 85% in the most recent quarter to $166.6 million. Twilio, a provider of communications infrastructure that went public in 2016, generated growth of 75% to $295.1 million in the third quarter. Newly public companies Elastic, Smartsheet and Coupa each reported growth in excess of 50%.

They're among the top performers in the BVP Nasdaq Emerging Cloud Index, a group of public companies that get most of their revenue from cloud products and services. Venture capital firm Bessemer Venture Partners launched the index in 2013 to bring more attention to cloud companies and provide metrics so private cloud companies could better understand public markets.

The index has risen 458% since it was formed, topping the Nasdaq's 146% jump over that stretch. In September, asset manager WisdomTree launched the WisdomTree Cloud Computing Fund, making it possible for people to bet on the group.

"We'd get tweets every week of, 'How can I trade this? How can I trade this?'" said Byron Deeter, who invests in cloud at Bessemer and sits on Twilio's board.

Rob Bernshteyn, CEO of Coupa, has been tracking cloud software since its infancy. While working at Siebel Systems in the early 2000s, he met Salesforce co-founder Marc Benioff and was skeptical of whether the company could provide cloud-based technology for many different purposes without extensive customization, even though Salesforce was already winning deals against Siebel.

"It wasn't really definitively clear to me that it could really work," Bernshteyn said in an interview at Coupa's Silicon Valley headquarters, where the server closets are filled with beanbags that employees use as chairs.

Over time, Bernshteyn said Salesforce fixed its technical issues. He considered joining the company but went to a younger cloud software provider called SuccessFactors, which was later acquired by SAP.

Bernshteyn left in 2009, in the middle of the financial crisis, and joined a small start-up that was helping companies track their spending to make sure they weren't being fleeced by vendors. That company, Coupa, is now worth over $9 billion and generating revenue of over $100 million a quarter.

But not all cloud stocks have delivered for investors.

Dropbox is 15% below its IPO price from 2018. Growth at the one-time venture darling has slowed amid competition from Google and Microsoft in the cloud storage and collaboration market.

Business intelligence software company Domo is up just 10% from its IPO in mid-2018 and way below where it was valued in the private markets before the offering. Yext, whose service helps businesses keep information like their addresses and hours up to date on Google and Amazon Alexa, is up 32% since its debut in 2017, underperforming the major indexes.

At 22% and 30% sales growth, respectively, Domo and Yext are expanding at a slower pace than many of their cloud counterparts, while still racking up big losses. It's a tough recipe for investors.

Yext CEO Howard Lerman is bullish on the broader sector. "Obviously at some point over the next decade, spending on cloud software will surpass licensed software," he said.

For venture investors, there's also plenty of money still to be made, assuming the public markets are on board. Deeter of Bessemer Venture Partners said there are 66 private cloud companies worth more than $1 billion.

"That's your future IPO pipeline," he said. "You're going to see this cloud index explode."

WATCH: Coupa CEO says there is a $50 billion addressable market in cloud expense management


Big Data Professionals Give 11 Predictions for Cloud’s Evolution in 2020 – Database Trends and Applications

The cloud was on everyone's mind this past year, with many questions arising about everything from how to secure cloud environments to what type of cloud is best for the organization.

Cloud computing has revealed countless new dimensions to IT. There are public clouds, private clouds, distributed clouds, and hybrid, multi-cloud architectures.

A true hybrid cloud allows workloads large and small, critical and casual, to be seamlessly transitioned between on-premises private cloud infrastructure and any public cloud an organization employs, based on whatever criteria a customer architects. The current crop of new technologies has this space exploding with possibilities.

Here, executives of leading companies offer 11 predictions for what's ahead in 2020 for cloud.

The Cloud Disillusionment blossoms because the meter is always running: Companies that rushed to the cloud finish their first phase of projects and realize that they have the same applications they had running before that do not take advantage of new data sources to make them supercharged with AI. In fact, their operating expenses actually have increased because the savings in human operators were completely overwhelmed by the cost of the cloud compute resources for applications that are always on. Ouch. These resources were capitalized before on-premise but now hit the P&L. - Monte Zweben, CEO, Splice Machine

Multi-cloud strategies increase the demand for application management tool adoption: Multi-cloud strategies are here to stay. Companies are increasingly adopting more than one platform, either for financial leverage or to create a time-to-market or feature race between the platforms. To remain competitive, public cloud providers must offer unique features or capabilities differentiating them from competitors. This has created an upsurge in new and more complex technologies, increasing the need for application performance management tool adoption. 2020 will bring an ever-increasing demand for APM tools and services. - David Wagner, senior manager, product marketing application management, SolarWinds

The Rise of the Hybrid Cloud Infrastructure -- Putting the Right Data in the Right Place: Today when people refer to the cloud, they usually mean the public cloud. In 2020, the term cloud might become more nuanced as private clouds rise in popularity and organizations increasingly pursue a hybrid cloud storage strategy. Organizations with large-scale storage needs, such as those in healthcare, scientific research, and media and entertainment, face unique challenges in managing capacity-intensive workloads that can reach tens of petabytes. Private clouds address these challenges by providing the scale and flexibility benefits of public clouds along with the performance, access, security and control advantages of on-premises storage. In 2020, we'll see more organizations taking advantage of private clouds in a hybrid cloud infrastructure, storing frequently used data on-prem while continuing to utilize the public cloud for disaster recovery. - Jon Toor, CMO, Cloudian

Best-of-Breed cloud is coming under the name of Hybrid: Public cloud vendors have extortionately high prices. The public cloud makes sense for small- and medium-sized businesses. Those businesses don't have the scope to amortize their engineering spend. Public clouds don't make sense for technology companies. Companies like Bank of America have gone on record as saving $2 billion per year by not using the public cloud. A best-of-breed architecture envisions building blocks within the technical stack, then selects not from a single cloud vendor, but from the variety of service providers. Assumptions that a given cloud provider has the lowest or best prices, or that the cost of networking between clouds is prohibitive, become less and less true. - Brian Bulkowski, CTO at Yellowbrick Data

Organizations will grapple with scaling multi-cloud, hybrid, edge/fog and more: In 2020, in-memory computing will disrupt both NoSQL and traditional database technologies, and streaming analytics will emerge as the preferred approach for data integration. Low-latency in-memory platforms for streaming will define a new paradigm for performance in this space, further disrupting traditional approaches. Multi-cloud will also emerge as the preferred strategy to build and integrate applications. In response, enterprises will increasingly need to support and scale multi-cloud, hybrid cloud and edge/fog, and turn to new approaches to achieve real-time machine learning at enterprise scale. - John DesJardins, VP of solution architecture & CTO, Hazelcast

More enterprises will have production cloud data lakes: With the maturation of the technology stack overall and more ML frameworks becoming mainstream, the cloud data lake trend, which began a few years ago, will continue to accelerate. We'll see more enterprises with production data lakes in the cloud running meaningful workloads for the business. This trend will pose more pressure on the data privacy and governance teams to make sure data is being used the right way. - Okera CTO and co-founder, Amandeep Khurana

The biggest advantage presented by modern cloud technology is the ability for small to mid-size companies to level the playing field: Thanks to the cloud, organizations no longer require the assets previously needed to implement enterprise solutions and technology: large budgets, massive server farms, and a workforce dedicated to maintenance. Typically, when organizations want to implement new tech, they analyze the associated infrastructure cost to determine what is fiscally possible. Instead, organizations that want to harness the benefits provided by the cloud should start by defining strategic objectives and recognize that the cloud is going to provide access to solutions and new technology at a fraction of the on-premises cost. Don't let infrastructure costs be the impeding factor to implementing new tech. What the cloud now does is lower the bar of access to, and drive adoption of, new technology. This is why the cloud growth line has been exponential, not linear. So, in 2020 and beyond we can expect cloud to be a huge asset that will allow small to mid-size businesses to get access to the same solutions, information, and data that were previously available only to large enterprises. - Himanshu Palsule, chief product & technology officer, Epicor

Cloud data warehouses turn out to be a Big Data detour: Given the tremendous cost and complexity associated with traditional on-premises data warehouses, it wasn't surprising that a new generation of cloud-native enterprise data warehouses emerged. But savvy enterprises have figured out that cloud data warehouses are just a better implementation of a legacy architecture, and so they're avoiding the detour and moving directly to a next-generation architecture built around cloud data lakes. In this new architecture data doesn't get moved or copied, there is no data warehouse, and no associated ETL, cubes, or other workarounds. We predict 75% of the Global 2000 will be in production or in pilot with a cloud data lake in 2020, using multiple best-of-breed engines for different use cases across data science, data pipelines, BI, and interactive/ad-hoc analysis. - Dremio's CEO Tomer Shiran

IT will begin to take a more methodical approach to achieving cloud native status: Running cloud native applications is an end goal for many organizations, but the process of getting there can be overwhelming, especially because many companies believe they have to refactor everything at once. More IT departments will realize they don't need to take an all-or-nothing approach, and a process founded on baby steps is the best way to achieve cloud native goals. In other words, we'll start to see more IT teams forklift applications into the cloud and then implement a steady, methodical approach to refactoring them. - Chris Patterson, senior director of product management, Navisite

Major Cloud Providers Will Find a Bullseye on Their Backs: As more and more organizations move their critical systems and data to the cloud for efficiency, scalability, and cost reduction, cloud provider infrastructure will increasingly become a high-payoff target. A target that, if compromised, could have devastating effects on the economy and national security. In 2020, we believe state adversaries will redouble their efforts to attack cloud systems. Whether the defenses in place will withstand the attacks remains to be seen. - Greg Conti, senior security strategist, IronNet Cybersecurity

A Meteoric Rise: Cloud Security Adoption to Accelerate in 2020: The coming year will usher in an even greater adoption of cloud security, with a material change in attitude and organizations fully embracing the cloud. As organizations increasingly access enterprise applications like Box, Salesforce, etc., it's no longer practical for them to VPN back to the stack to remain secure while accessing these services in the cloud. With this move to the cloud come countless security risks. Not only will we see more companies jump on the bandwagon and shift their applications and operations to the cloud, but we will also see the security stack move to the cloud and more resources dedicated to securing the cloud, such as cloud councils. - Kowsik Guruswamy, CTO, Menlo Security


Google Drive vs OneDrive: Which is better? – ValueWalk

There is no shortage of cloud storage services. You can choose from Google Drive, OneDrive, Apple iCloud, Amazon Drive, Dropbox, and many others. But the two most popular services are Google Drive and Microsoft's OneDrive. Both Google and Microsoft have deeply integrated their cloud offerings with other services to give you a better user experience. If you can't decide which one to opt for, this Google Drive vs OneDrive comparison should help you decide.

Google Drive and OneDrive have become platform-agnostic. You can use them on Android, iOS, Windows, Mac, and other platforms without any issues. They let you access your files across devices. Both services also have a bunch of collaboration tools to let you share files and collaborate with others.

Both Google and Microsoft have a free plan with a limited amount of cloud storage. The free plans are good enough for most users who use the cloud only to store or back up photos and documents. Google Drive offers 15GB of free storage, which is significantly higher than OneDrive's 5GB. But the Google Drive storage is shared across all of Google's services, including Gmail. That means you could run out of storage faster than you expect.

If you need more storage, Google charges $2 per month or $20 per year for 100GB of cloud storage. If you want 200GB, it's going to cost $3 per month or $30 per year. Google's 2TB plan costs $10 per month or $100 per year. The 10TB plan costs $100 per month, and the 20TB plan will set you back $200 per month. It's worth pointing out that you can extend Google's storage to other people in your Google family group.

OneDrive is relatively more expensive, mainly because Microsoft uses a different pricing strategy. The Redmond-based software giant has bundled OneDrive into the Microsoft Office subscription. Office 365 Home costs $100 per year. It gives you access to Word, PowerPoint, Outlook, Excel, Access, and Publisher for PC, along with 1TB of OneDrive storage.

You can also share the Office 365 Home plan with five of your family members, each of whom will get their own 1TB of cloud storage. For those who want a personal plan, Microsoft has Office 365 Personal for $70 per year. It gives you 1TB of cloud storage along with the Office tools.

If you only want OneDrive storage without the Office tools, you can get 100GB for $2 per month. 1TB of OneDrive storage costs $7 per month or $70 per year, and 6TB will set you back $10 per month or $100 per year.

Both cloud services let you manage files from a web browser. The user interface is intuitive and easy to navigate, and the file management system is similar to that of a desktop file manager. Both offer a variety of viewing options, such as thumbnail and list views, and give you quick access to your recent files.

The search function is much better in Google Drive. It shows search results live as you type each letter, and it has an advanced search option you can toggle on to filter results by date, keyword, file type, and more. The search function in OneDrive is still in its infancy: you won't see any results until you hit the Enter key.
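The difference between live search and submit-style search is easy to picture in code. The sketch below is a minimal illustration of the concept, not either service's actual implementation; the file names and the substring-matching rule are assumptions for the demo.

```python
def live_search(files, query):
    """Return files whose names contain the query, case-insensitively.

    A "live" search UI calls this after every keystroke, so the result
    list narrows as the user types; a submit-style search only calls it
    once, after Enter is pressed.
    """
    q = query.lower()
    return [name for name in files if q in name.lower()]

files = ["Budget 2019.xlsx", "budget-draft.docx", "Holiday photos.zip"]

# Each keystroke of "bud" re-runs the filter, narrowing the results:
for partial in ("b", "bu", "bud"):
    print(partial, "->", live_search(files, partial))
```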

The file sharing system is similar on the two services. You can share a direct link, or enter a person's email address to give them access to a file. Both services let you set permissions for anyone accessing the files you share. In Google Drive, free users can let others View, Comment, or Edit; the advanced permission settings are available only to paid users.

Microsoft's OneDrive comes with block-level copying technology, which breaks files into smaller packages for uploading and saving to the cloud. If you make a change in your file, only the packages that have been modified are re-uploaded to the cloud. It speeds up the uploading process.
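A rough sketch of the block-level idea: fingerprint each fixed-size block of a file, then compare fingerprints against the previous upload to find which blocks actually need re-uploading. The block size and function names here are illustrative assumptions, not OneDrive's real parameters.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # illustrative 4 MB block size

def block_hashes(data: bytes, size: int = CHUNK_SIZE):
    """Split a file into fixed-size blocks and fingerprint each block."""
    return [hashlib.sha256(data[i:i + size]).hexdigest()
            for i in range(0, len(data), size)]

def changed_blocks(old, new):
    """Indices of blocks whose fingerprints differ; only these re-upload."""
    return [i for i, (a, b) in enumerate(zip(old, new)) if a != b]

# Demo with 1-byte "blocks": changing a single byte dirties a single
# block, so only that block would be sent back to the cloud.
before = block_hashes(b"abcd", size=1)
after = block_hashes(b"abXd", size=1)
print(changed_blocks(before, after))  # [2]
```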

Both services send your files to the cloud via HTTPS encryption. Microsoft and Google encrypt files using their own keys. It makes it incredibly hard for hackers to decrypt files you have stored with Google or Microsoft even if they break into the servers. But it also means that if someone gains access to your email and password, they can access all your files.

Since Microsoft and Google hold the encryption keys to your files, they can decrypt them or give law enforcement agencies access to them. Of course, you can use end-to-end encryption (E2EE) tools or third-party services to encrypt all your Google Drive or OneDrive files yourself.

Google scans the files you upload to Google Drive to mine data that it could use for targeted advertising. It doesn't use that data for malicious purposes. Microsoft doesn't do that. Recently, the Redmond-based software giant introduced a feature called Personal Vault that adds an extra layer of security to your sensitive files.

Free OneDrive users can add only three files to their Personal Vault. Office 365 subscribers can add as many files as they want, up to the storage limit of their plan. The Personal Vault locks automatically after 20 minutes of inactivity. You can unlock it within OneDrive using a PIN, fingerprint scan, facial scan, an authenticator app, or an authentication code sent via SMS or email.
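The auto-lock behavior can be modeled as a simple inactivity timer. This is a toy sketch of the concept with hypothetical class and method names; it is not how OneDrive actually implements Personal Vault.

```python
import time

LOCK_AFTER = 20 * 60  # seconds: the vault locks after 20 minutes idle

class VaultSession:
    """Toy auto-locking session: locks once too much idle time passes."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._last_activity = clock()
        self._locked = False

    def touch(self):
        """Record user activity, resetting the inactivity timer."""
        if not self._locked:
            self._last_activity = self._clock()

    def is_locked(self):
        if self._clock() - self._last_activity >= LOCK_AFTER:
            self._locked = True
        return self._locked

# Drive the session with a fake clock instead of waiting 20 real minutes.
now = [0.0]
vault = VaultSession(clock=lambda: now[0])
now[0] = 19 * 60
print(vault.is_locked())  # False: only 19 minutes idle
now[0] = 21 * 60
print(vault.is_locked())  # True: past the 20-minute limit
```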

If you want maximum cloud storage for the price, Google Drive is the way to go. Its plans are cheaper, and it integrates well with other Google services. Microsoft's OneDrive is for people who use Office tools such as Word, Excel, PowerPoint, and Access. The Personal Vault also gives OneDrive an edge over Google Drive in terms of security. Try out the free plans of both to decide which one better fits your needs.

Google Drive vs OneDrive: Which is better? - ValueWalk

Read More..

Big Data Predictions: What 2020 Will Bring – Datanami


With just over a week left on the 2019 calendar, it's now time for predictions. We'll run several stories featuring the 2020 predictions of industry experts and observers in the field. It all starts today with what is arguably the most critical aspect of the big data question: the data itself.

There's no denying that Hadoop had a rough year in 2019. But is it completely dead? Haoyuan "HY" Li, the founder and CTO of Alluxio, says that Hadoop storage, in the form of the Hadoop Distributed File System (HDFS), is dead, but Hadoop compute, in the form of Apache Spark, lives strong.

"There is a lot of talk about Hadoop being dead," Li says. "But the Hadoop ecosystem has rising stars. Compute frameworks like Spark and Presto extract more value from data and have been adopted into the broader compute ecosystem. Hadoop storage (HDFS) is dead because of its complexity and cost, and because compute fundamentally cannot scale elastically if it stays tied to HDFS. For real-time insights, users need immediate and elastic compute capacity that's available in the cloud. Data in HDFS will move to the most optimal and cost-efficient system, be it cloud storage or on-prem object storage. HDFS will die but Hadoop compute will live on and live strong."

As HDFS data lake deployments slow, Cloudian is ready to swoop in and capture the data into its object store, says Jon Toor, CMO of Cloudian.

"In 2020, we will see a growing number of organizations capitalizing on object storage to create structured/tagged data from unstructured data, allowing metadata to be used to make sense of the tsunami of data generated by AI and ML workloads," Toor writes.

The end of one thing, like Hadoop, will give rise to the beginning of another, according to ThoughtSpot CEO Sudheesh Nair.


"Over the last 10 years or so, we've seen the rise, plateau, and the beginning of the end for Hadoop," Nair says. "This isn't because Big Data is dead. It's exactly the opposite. Every organization in the world is becoming a Big Data company. It's a requirement to operate in today's business landscape. Data has become so voluminous, and the need for agility with this data so great, however, that organizations are either building their own data lakes or warehouses, or going directly to the cloud. As that trend accelerates in 2020, we'll see Hadoop continue to decline."

When data gets big enough, it exerts a gravitational-like force, which makes it difficult to move, while also serving to attract even more data. Understanding data gravity will help organizations overcome barriers to digital transformation, says Chris Sharp, CTO of Digital Realty.

"Data is being generated at a rate that many enterprises can't keep up with," Sharp says. "Adding to this complexity, enterprises are dealing with data both useful and not useful from multiple locations that is hard to move and utilize effectively. This presents enterprises with a data gravity problem that will prevent digital transformation initiatives from moving forward. In 2020, we'll see enterprises tackle data gravity by bringing their applications closer to data sources rather than transporting resources to a central location. By localizing data traffic, analytics and management, enterprises will more effectively control their data and scale digital business."

All things being equal, it's better to have more data than less. But companies can move the needle just by using available technology to make better use of the data they already have, argues Beaumont Vance, the director of AI, data science, and emerging technology at TD Ameritrade.

"As companies are creating new data pools and are discovering better techniques to understand findings, we will see the true value of AI delivered like never before," Vance says. "At this point, companies are using less than 20% of all internal data, but through new AI capabilities, the remaining 80% of untapped data will be usable and easier to understand. Previous questions which were unanswerable will have obvious findings to help drive massive change across industries and societies."

Big data is tough to manage. What if you could do AI with small data? You can, according to Arka Dhar, the CEO of Zinier.

"Going forward, we'll no longer require massive big data sets to train AI algorithms," Dhar says. "In the past, data scientists have always needed large amounts of data to perform accurate inferences with AI models. Advances in AI are allowing us to achieve similar results with far less data."


How you store your data dictates what you can do with it. You can do more with data stored in memory than on disk, and in 2020, we'll see organizations storing more data on memory-based systems, says Abe Kleinfeld, the CEO of GridGain.

"In 2020, the adoption of in-memory technologies will continue to soar as digital transformation drives companies toward real-time data analysis and decision-making at massive scale," Kleinfeld says. "Let's say you're collecting real-time data from sensors on a fleet of airplanes to monitor performance, and you want to develop a predictive maintenance capability for individual engines. Now you must compare anomalous readings in the real-time data stream with the historical data for a particular engine stored in the data lake. Currently, the only cost-effective way to do this is with an in-memory data integration hub, based on an in-memory computing platform like Apache Ignite that integrates Apache Spark, Apache Kafka, and data lake stores like Hadoop. 2020 promises to be a pivotal year in the adoption of in-memory computing as data integration hubs continue to expand in enterprises."

Big data can make your wildest business dreams come true. Or it can turn into a total nightmare. The choice is yours, say Eric Raab and Kabir Choudry, vice presidents at Information Builders.

"Those that have invested in the solutions to manage, analyze, and properly action their data will have a clearer view of their business and the path to success than has ever been available to them," Raab and Choudry write. "Those that have not will be left with a mountain of information that they cannot truly understand or responsibly act upon, leaving them to make ill-informed decisions or deal with data paralysis."

Let's face it: managing big data is hard. That doesn't change in 2020, which will bring a renewed focus on data orchestration, data discovery, data preparation, and model management, says Todd Wright, head of data management and data privacy solutions at SAS.


"According to the World Economic Forum, it is predicted that by 2020 the amount of data we produce will reach a staggering 44 zettabytes," Wright says. "The promise of big data never came from simply having more data from more sources, but from being able to develop analytical models to gain better insights on this data. With all the work being done to advance analytics, AI and ML, it is all for naught if organizations do not have a data management program in place that can access, integrate, cleanse and govern all this data."

Organizations are filling up NVMe drives as fast as they can to help accelerate the storage and analysis of data, particularly involving IoT. But doing this alone is not enough to ensure success, says Nader Salessi, the CEO and founder of NGD Systems.

"NVMe has provided a measure of relief and proven to remove existing storage protocol bottlenecks for platforms churning out terabytes and petabytes of data on a regular basis," Salessi writes. "Even though NVMe is substantially faster, it is not fast enough by itself when petabytes of data are required to be analyzed and processed in real time. This is where computational storage comes in and solves the problem of data management and movement."

Data integration has never been easy. With the ongoing data explosion and expansion of AI and ML use cases, it gets even harder. One architectural concept showing promise is the data fabric, according to the folks at Denodo.

"Through real-time access to fresh data from structured, semi-structured and unstructured data sets, data fabric will enable organizations to focus more on ML and AI in the coming year," Denodo says. "With the advancement in smart technologies and IoT devices, a dynamic data fabric provides quick, secure and reliable access to vast data through a logical data warehouse architecture, thus facilitating AI-driven technologies and revolutionizing businesses."

Seeing how disparate data sets are connected using semantic AI and enterprise knowledge graphs (EKGs) provides another approach to tackling the data silo problem, says Saurav Chakravorty, the principal data scientist at Brillio.

"An organization's valuable information and knowledge is often spread across multiple documents and data silos, creating big headaches for a business," Chakravorty says. "EKGs will allow organizations to do away with semantic incoherency in a fragmented knowledge landscape. Semantic AI and EKGs complement each other and can bring great value overall to enterprise investments in data lakes and big data."

2020 holds the potential to be a breakout year for storage-class memory, argues Charles Fan, the CEO and co-founder of MemVerge.

"With an increasing demand from data center applications, paired with the increased speed of processing, there will be a huge push towards a memory-centric data center," Fan says. "Computing innovations are happening at a rapid pace, with more and more computation tech, from x86 to GPUs to ARM. This will continue to open up new topology between CPU and memory units. While architecture currently tends to be more disaggregated between the computing layer and the storage layer, I believe we are headed towards a memory-centric data center very soon."

We are rapidly moving toward a converged storage and processing architecture for edge deployments, says Bob Moul, CEO of machine data intelligence platform Circonus.

"Gartner predicts there will be approximately 20 billion IoT-connected devices by 2020," Moul says. "As IoT networks swell and become more advanced, the resources and tools that manage them must do the same. Companies will need to adopt scalable storage solutions to accommodate the explosion of data that promises to outpace current technology's ability to contain, process and provide valuable insights."

Dark data will finally see the light of day in 2020, according to Rob Perry, the vice president of product marketing at ASG Technologies.


"Every organization has islands of data, collected but no longer (or perhaps never) used for business purposes," Perry says. "While the cost of storing data has decreased dramatically, the risk premium of storing it has increased dramatically. This dark data could contain personal information that must be disclosed and protected. It could include information subject to Data Subject Access Requests and possibly required deletion, but if you don't know it's there, you can't meet the requirements of the law. Yet this data could also hold the insight that opens up new opportunities that drive business growth. Keeping it in the dark increases risk and possibly masks opportunity. Organizations will put a new focus on shining a light on their dark data."

Open source databases will have a good year in 2020, predicts Karthik Ranganathan, founder and CTO at Yugabyte.

"Open source databases, which claimed zero percent of the market ten years ago, now make up more than 7%," Ranganathan says. "It's clear that the market is shifting, and in 2020 there will be an increase in commitment to true open source. This goes against the recent trend of database and data infrastructure companies abandoning open source licenses for some or all of their core projects. However, as technology rapidly advances, it will be in the best interest of database providers to switch to a 100% open source model, since freemium models take a significantly longer period of time for the software to mature to the same level as a true open source offering."

However, 2019 saw a pullback from pure open source business models at companies like Confluent, Redis, and MongoDB. Instead of open source software, the market will be responsive to open services, says Dhruba Borthakur, the co-founder and CTO of Rockset.

"Since the public cloud has completely changed the way software is delivered and monetized, I predict that the time for open sourcing new, disruptive data technologies will be over as of 2020," Borthakur says. "Existing open-source software will continue to run its course, but there is no incentive for builders or users to choose open source over open services for new data offerings. Ironically, it was ease of adoption that drove the open-source wave, and it is ease of adoption of open services that will precipitate the demise of open source, particularly in areas like data management. Just as the last decade was the era of open-source infrastructure, the next decade belongs to open services in the cloud."

Related Items:

2019: A Big Data Year in Review Part One

2019: A Big Data Year in Review Part Two

Big Data Predictions: What 2020 Will Bring - Datanami


VC Investments In Enterprise Tech And AI – Forbes

According to Toptal, the venture capital sector has grown by 12.1% annually since the financial crisis. The same source tells us that the amount of capital raised per year has grown by 100% over the decade.

Hundreds of venture capitalists back startups and entrepreneurs with billions of dollars each year. Many businesses rely on these VC investments, and entire economies depend on them.

When choosing which projects to back, most investors look for innovation, expertise, and profitable opportunities. I'm going to take a look at some of the top-tier venture capitalists and their investments in the field of enterprise tech and AI.

Companies with a focus on AI collected over $9.3 billion in the US during 2018. The number of venture capital investments keeps growing on a global scale, opening up new opportunities for startups and entrepreneurs who are looking for their golden ticket to the enterprise tech and AI space.

As stated on Kurtosys, venture capital deals ranged between $10 million and $25 million in the US ten years ago. Today, there is a trend of $50 million plus deals getting a greater share of total investment.

Top tier macro venture capitalists in the startup ecosystem include Benchmark, Index Ventures, Felicis Ventures, and Union Square Ventures.

Even micro and local venture capital firms such as Northstar Ventures and Base Ventures are hitting these large numbers. On the local micro VC side, there is Aybuben Ventures, the first pan-Armenian venture capital fund focused on Armenian tech entrepreneurs.

With a fund of over $50 million, Aybuben Ventures is not limited to people in Armenia only. "On the contrary, the fund is open to Armenians all over the world who are engaged in enterprise tech business and development. Armenians live all over the world, and they are proud of their culture and don't want to lose their identity. Potentially this creates a huge global pool of entrepreneurs, professionals, capital, companies and knowledge which can be leveraged and scaled in any of the world's economies. That said, we welcome interest in our foundation, from any organization and without regard to nationality," said Alexander Smbatyan, one of the founding partners of Aybuben Ventures.

Overall, the venture capital space keeps growing, providing technology startups with sufficient funding for growth and expansion. "There is an innate disposition to develop companies that make extensive use of technologies such as artificial intelligence, machine learning, biotechnology and more," Smbatyan added, as one of the reasons why it is worth investing in the enterprise tech and AI space.

VC Investments In Enterprise Tech And AI - Forbes


VMware to use MinIO object storage in Kubernetes embrace – Blocks and Files

VMware looks likely to provision storage to Kubernetes Pods using MinIO open source object storage, if its own slide is to be believed.

But first some background. VMware is embracing Kubernetes containers, an alternative, more granular form of server virtualization than its own vSphere virtual machines.

VMware is the dominant form of server virtualisation on-premises, and is also available in the cloud with, for example, VMware Cloud Foundation. But cloud-native workloads use containers, not virtual machines, and that is a fundamental threat to VMware.

When applications are containerised, their code exists in many small pieces, called containers. These deliver micro-services to each other and have standard interfaces, permitting the code inside a container to change without prejudicing its interaction with other containers. When an application is run, the component containers have to be loaded into a server's memory in the right sequence.

Orchestrator code has to be used for this and the Google-originated Kubernetes has become the most popular orchestrator. In its terminology the set of containers that make up an application is called a Pod and Kubernetes is used to provide storage to a Pod.
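To make the terminology concrete, here is the shape of a minimal Pod definition with storage attached through a PersistentVolumeClaim, expressed as a Python dict. The field layout follows the Kubernetes Pod spec, but the names "demo-app" and "data-claim" are made up for the example.

```python
import json

# Minimal Pod spec: one container, mounting storage that Kubernetes
# supplies via a PersistentVolumeClaim. Names here are illustrative.
pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "demo-app"},
    "spec": {
        "containers": [{
            "name": "web",
            "image": "nginx:1.17",
            # The container sees the storage at a mount path...
            "volumeMounts": [{"name": "data", "mountPath": "/data"}],
        }],
        # ...while the Pod requests the storage by claim name, leaving
        # the choice of actual backing device up to the cluster.
        "volumes": [{
            "name": "data",
            "persistentVolumeClaim": {"claimName": "data-claim"},
        }],
    },
}

print(json.dumps(pod, indent=2))
```

This indirection is the point: the application asks for storage abstractly, and the cluster (or, under Project Pacific, vSphere) decides what actually backs the claim.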

VMware is working on an internal Project Pacific development to add a Kubernetes control plane to vSphere. This will enable vSphere admins to manage containerised apps and it will embed Kubernetes concepts in vSphere so that VM-based apps are also orchestrated using Kubernetes.

A VMware blogger, Jared Rosoff, senior director of product management for workload management in vSphere, wrote in August: "The key insight we had at VMware was that Kubernetes could be much more than just a container platform; it could be the platform for ALL workloads."

He added: "This brings the great Kubernetes developer experience to the rest of our datacenter. It means developers can get the benefits of Kubernetes not just for their cloud native applications, but for ALL of their applications. It makes it easy for them to deploy and manage modern applications that span multiple technology stacks."

A developer interacts with Project Pacific as if it were Kubernetes. On the other hand a VMware admin sees Project Pacific as vSphere. It gives vSphere the ability to manage complete Kubernetes Pods as well as individual virtual machines that make up applications in existing vSphere environments.

A briefing slide showed MinIO's positioning in VMware's Kubernetes Pods.

MinIO is used in this way, MinIO CEO AB Periasamy said, because the software is S3-compliant, fast, widely used across enterprises, and also extensively used by containerised applications.

He said: "We are the native storage when it comes to Kubernetes for VMware. VMware is betting on Kubernetes."

There have been 288.8 million Docker pulls (downloads) of MinIO instances. Sixty-two per cent of all MinIO instances are containerised with Docker, and 27 per cent of all MinIO instances are managed using Kubernetes, meaning 43 per cent of the containerised instances. MinIO is deployed in 84 Fortune 100 enterprises. Blocks & Files understands Apple also has a multi-PB instance of MinIO running.

As VMware users adopt Kubernetes Pods through vSphere, MinIO should be pulled along in its wake.

VMware to use MinIO object storage in Kubernetes embrace - Blocks and Files


SolarWinds Gears New Backup for Office 365 Solution Toward MSPs – ITPro Today

SolarWinds last week announced a cloud-based backup solution for Office 365 that it says protects data managed from the same web-based dashboard typically used for servers and workstations.

That approach, which allows managed service providers (MSPs) to see the status of their backups across all customers and locations, sets Backup for Office 365 apart from other Office 365 cloud-based backup solutions from vendors such as Veeam, Acronis, Backupify and CodeTwo. Those solutions may require a separate console or separate product for Office 365.

"Technicians dont need different credentials, have to switch interfaces or look in two places to troubleshoot issues or recover files," said Alex Quilter, SolarWinds' vice president of product management for security.

Backup for Office 365 also includes virtually unlimited cloud storage in SolarWinds' private cloud to protect Microsoft Exchange, OneDrive and SharePoint. SolarWinds maintains more than 30 data centers worldwide to meet data locality requirements.

"We think its important to allow backups to reside in a country of the companys choice, so that company data wont cross borders without their permission," Quilter said.

The goal, Quilter said, is to help reduce the potential impact of malicious external attacks or internal user error. Related data retention, recoverability and the ability to demonstrate regulatory compliance are becoming even more critical, he added, as more businesses adopt software-as-a-service (SaaS) applications and shift resources to the cloud.

There is a good reason why companies have so many choices when it comes to protecting Office 365 files. According to a recently released report, Microsoft Office represented 73% of the most commonly exploited applications worldwide during the third quarter of 2019. Another report from earlier this year found that 60% of sensitive cloud data is stored in Office documents, and 75% of that data is not backed up.

"Office 365 email and documents shared and stored in SharePoint, OneDrive and Teams are the new business-critical data," said the 451 Research report. " Many business continuity/disaster recovery (BC/DR) plans start with protecting key databases and other mission-critical applications, but the unstructured data generated by SaaS-based products is starting to grow faster than traditional database information and can have just as critical of an impact when lost or destroyed."

SolarWinds Gears New Backup for Office 365 Solution Toward MSPs - ITPro Today


These shockingly low prices on Blink security systems and Amazon Fire TVs only last one day – BGR

Celebrating the holidays is definitely made better when you know that you got a great price on your gifts, even if they were for yourself. There's hardly a time during this season when you should walk away from a purchase thinking you paid too much. If you're in the market for the gift of home security or home entertainment for someone on your list, we have deals that will blow you away. With the latest one-day sale from Woot!, you'll be so amazed at how low these prices have gone, you'll think they're a holiday miracle.

Just for today, you'll be able to nab either a Blink XT Home Security Camera System or an Amazon Fire TV with 4K and an Alexa Voice Remote for crazy low prices. The Blink system starts at only $59.99 and the Amazon Fire TV costs a mere $24.99! Our favorite deals site has done it again with one of its fantastic daily sales. Both of these are brand new and are just two of the amazing deals available today on Woot!

When you pick up the Blink XT Home Security System, you'll be getting a wireless camera meant to keep your home safe. It can be placed either inside or outside, and the system can be expanded to up to 10 cameras. It features a built-in motion detector that sends an alert to your smartphone when it's triggered. A short clip will also be recorded to the cloud, allowing you to see exactly what caused it. Speaking of the cloud, there aren't any monthly fees for cloud storage when you have this security system.

With the Amazon Fire TV, you'll be able to experience lifelike picture quality in 4K Ultra HD and HDR. With the Alexa-enabled voice remote, you can just talk and tell it what you want to watch next. You can even check local movie times or order a pizza using this device. You'll have access to all of your shows and streaming apps, and Amazon Prime members get unlimited access to thousands of movies and TV episodes. It also gives you access to live sporting events and shows.


Make sure you don't miss these deals, as they'll only last today. Keep checking throughout the week and every day on Woot!, as they're continuing to blow out items like sunglasses, apparel, cookware, laptops and tablets all holiday season long.

Follow @BGRDeals on Twitter to keep up with the latest and greatest deals we find around the web. Prices subject to change without notice and any coupons mentioned above may be available in limited supply. BGR may receive a commission on orders placed through this article.

These shockingly low prices on Blink security systems and Amazon Fire TVs only last one day - BGR
