
Amazon Releases A New Tool To Improve Machine Learning Processes – Forbes

One of Amazon's most recent announcements was the release of its new tool, Amazon Rekognition Custom Labels. This advanced tool promises to improve machine learning workflows on a whole new scale, allowing for better data analysis and object recognition.

Amazon Rekognition will help users train their machine learning models more easily, allowing them to learn a set of objects from limited data. In other words, this capability will make machines more intelligent and capable of recognizing items with far smaller data sets than ever before.

The Amazon Inc. logo is displayed above the reception counter at the company's campus in Hyderabad, India, on Friday, Sept. 6, 2019. Amazon's only company-owned campus outside the U.S. opened at the end of August on the other side of the globe, thousands of miles from its Seattle headquarters. The 15-storey building towers over the landscape in Hyderabad's technology and financial district, signaling the giant online retailer's ambitions to expand in one of the world's fastest-growing retail markets. Photographer: Dhiraj Singh/Bloomberg

The Benefits of Machine Learning with Amazon Rekognition

Machine learning is the scientific study and application of algorithms that allow computers to learn new information and functionality without needing direct instructions. In other words, machine learning can be understood as the capability of computers to learn on their own.

Until now, machine learning models have required large data sets in order to learn something new. For instance, if you wanted a device to recognize a chair as a chair, you would have to provide hundreds, if not thousands, of pieces of visual evidence of what a chair looks like.

However, with Amazon's new recognition tool, machine learning models will be able to work with very limited data sets and still effectively learn to distinguish new objects and items.

Computers will now be able to recognize a group of objects based on as few as ten images, which is a significant improvement compared to previous requirements. Amazon is slowly but surely stepping onto a fresh and untrodden path of machine learning development.
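For readers curious about the mechanics, the workflow is exposed through the AWS SDK. Below is a minimal sketch using Python's boto3; the project name, bucket, and manifest path are placeholder assumptions, and in practice the model version must finish training and be started with start_project_version (which incurs billing) before detection calls will succeed.

```python
import boto3

rek = boto3.client("rekognition", region_name="us-east-1")

# Create a Custom Labels project -- a container for trained model versions.
project = rek.create_project(ProjectName="chair-detector")  # hypothetical name

# Train a model version from a small labeled dataset in S3. Custom Labels
# is designed to work with tens of images per label rather than thousands.
version = rek.create_project_version(
    ProjectArn=project["ProjectArn"],
    VersionName="v1",
    TrainingData={"Assets": [{"GroundTruthManifest": {"S3Object": {
        "Bucket": "my-training-bucket", "Name": "train/manifest.json"}}}]},
    TestingData={"AutoCreate": True},  # let the service split off a test set
    OutputConfig={"S3Bucket": "my-training-bucket", "S3KeyPrefix": "output/"},
)

# After the version has been started with start_project_version and reports
# a RUNNING status, new images can be classified against the custom labels.
result = rek.detect_custom_labels(
    ProjectVersionArn=version["ProjectVersionArn"],
    Image={"S3Object": {"Bucket": "my-training-bucket", "Name": "new/photo.jpg"}},
    MinConfidence=80,
)
for label in result["CustomLabels"]:
    print(label["Name"], label["Confidence"])
```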

Why Amazon Rekognition Matters

Having limited data to work with used to be a challenge in machine learning. Today, new models will be able to learn efficiently without large sets of data, all thanks to Amazon's recently announced tool.

"Instead of having to train a model from scratch, which requires specialized machine learning expertise and millions of high-quality labeled images, customers can now use Amazon Rekognition Custom Labels to achieve state-of-the-art performance for their unique image analysis needs," Amazon announced in its blog post.

The new Amazon Rekognition capability debuted on December 3 and is expected to bring significant changes to machine learning throughout 2020. The release of the new tool took place at the AWS re:Invent conference held in Las Vegas.

Read the original here:

Amazon Releases A New Tool To Improve Machine Learning Processes - Forbes

Read More..

Machine Learning Answers: If BlackBerry Stock Drops 10% A Week, What's The Chance It'll Recoup Its Losses In A Month? – Forbes

BlackBerry Limited Chairman & CEO John Chen, right, watches as company employees take pictures with their phones after Chen rang the opening bell to mark his company's stock transfer from Nasdaq to the New York Stock Exchange, Monday, Oct. 16, 2017. (AP Photo/Richard Drew)

The markets have largely remained divided on BlackBerry stock. While the company's revenues have declined sharply over the last few years, driven by its exit from the smartphone business and the decline of its lucrative BlackBerry services business, it has been making multiple bets on high-growth areas ranging from cybersecurity to automotive software, although they have yet to pay off. This uncertainty relating to BlackBerry's future has caused the stock to remain very volatile.

Considering the significant price movements, we began with a simple question that investors could be asking about BlackBerry's stock: given a certain drop or rise, say a 10% drop in a week, what should we expect for the next week? Is it very likely that the stock will recover the next week? What about the next month or a quarter? You can test a variety of scenarios on the Trefis Machine Learning Engine to calculate, if the BlackBerry stock dropped, what's the chance it'll rise.

For example, if BlackBerry stock drops 10% or more in a week (5 trading days), there is a 27% chance it'll recover 10% or more over the next month (about 20 trading days). On the other hand, after a 5% drop over a week (5 trading days), the Trefis machine learning engine says the chances of an additional 5% drop over the next month are about 36%. This is quite significant, and helpful to know for someone trying to recover from a loss. Knowing what to expect for almost any scenario is powerful. It can help you avoid rash moves.
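Trefis does not publish the internals of its engine, but the figure quoted above is, at heart, an empirical conditional probability that anyone can estimate from a daily closing-price series. A rough sketch under that assumption (the CSV file and column name are hypothetical):

```python
import pandas as pd

def recovery_odds(prices: pd.Series, drop=-0.10, lookback=5,
                  gain=0.10, horizon=20) -> float:
    """Empirical P(return >= `gain` over the next `horizon` trading days,
    given a return <= `drop` over the previous `lookback` trading days)."""
    past = prices.pct_change(lookback)              # trailing 5-day return
    future = prices.shift(-horizon) / prices - 1.0  # forward 20-day return
    after_drops = future[past <= drop].dropna()     # outcome after each event
    return (after_drops >= gain).mean() if len(after_drops) else float("nan")

# Hypothetical usage with a CSV of daily closes:
# prices = pd.read_csv("bb.csv", index_col=0, parse_dates=True)["close"]
# print(recovery_odds(prices))  # the article's figure would be ~0.27
```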

Below, we also discuss a few scenarios and answer common investor questions:

Question 1: Does a rise in BlackBerry stock become more likely after a drop?

Answer:

The chances of a 5% rise in BlackBerry stock over the next month:

= 37% after BlackBerry stock drops by 5% in a week

versus,

= 41% after BlackBerry stock rises by 5% in a week

Question 2: What about the other way around, does a drop in BlackBerry stock become more likely after a rise?

Answer:

Consider two cases:

Case 1: BlackBerry stock drops by 5% in a week

Case 2: BlackBerry stock rises by 5% in a week

It turns out that the chances of a 5% drop after Case 1 or Case 2 are actually quite similar, both pretty close to 35%.

Question 3: Does patience pay?

Answer:

According to data and the Trefis machine learning engine's calculations, only to an extent.

Given a drop of 5% in BlackBerry stock over a week (5 trading days), while there is only about a 24% chance the stock will gain 5% over the subsequent week, there is a 45% chance this will happen in 6 months, and a 41% chance it'll gain 5% over a year (about 250 trading days).

The table below shows the trend:

[Table: Trefis]

Question 4: What about the possibility of a drop after a rise if you wait for a while?

Answer:

After seeing a rise of 5% over 5 days, the chances of a 5% drop in BlackBerry stock are about 44% over the subsequent quarter of waiting (60 trading days). This chance increases to about 53% when the waiting period is a year (250 trading days).


Read more:

Machine Learning Answers: If BlackBerry Stock Drops 10% A Week, What's The Chance It'll Recoup Its Losses In A Month? - Forbes

Read More..

Machine Learning Answers: If Seagate Stock Drops 10% A Week, What's The Chance It'll Recoup Its Losses In A Month? – Forbes

Seagate Technology's hard disk drive assembly plant in Singapore, Monday, Feb. 5, 2007. Photographer: Jonathan Drake/Bloomberg News

Seagate (NASDAQ: STX) stock has seen significant volatility over the last few years. While demand for data storage is expanding, given the growth of cloud computing and of other technologies such as artificial intelligence and machine learning, the company's focus on hard-disk drive technology, which is cost-effective but slower and less power-efficient than newer solid-state drives, has likely weighed on its valuation.

Considering the significant price movements, we began with a simple question that investors could be asking about Seagate's stock: given a certain drop or rise, say a 10% drop in a week, what should we expect for the next week? Is it very likely that the stock will recover the next week? What about the next month or a quarter? You can test a variety of scenarios on the Trefis Machine Learning Engine to calculate, if the Seagate stock dropped, what's the chance it'll rise.

For example, if Seagate stock drops 10% or more in a week (5 trading days), there is a 27% chance it'll recover 10% or more over the next month (about 20 trading days). On the other hand, after a 5% drop over a week (5 trading days), the Trefis machine learning engine says the chances of an additional 5% drop over the next month are about 31%. This is quite significant, and helpful to know for someone trying to recover from a loss. Knowing what to expect for almost any scenario is powerful. It can help you avoid rash moves.

Below, we also discuss a few scenarios and answer common investor questions:

Question 1: Does a rise in Seagate stock become more likely after a drop?

Answer:

The chances of a 5% rise in Seagate stock over the next month:

= 38% after Seagate stock drops by 5% in a week

versus,

= 45% after Seagate stock rises by 5% in a week

Question 2: What about the other way around, does a drop in Seagate stock become more likely after a rise?

Answer:

The chances of a 5% drop in Seagate stock over the next month:

= 31% after Seagate stock drops by 5% in a week

versus,

= 24% after Seagate stock rises by 5% in a week

Question 3: Does patience pay?

Answer:

According to data and the Trefis machine learning engine's calculations, absolutely!

Given a drop of 5% in Seagate stock over a week (5 trading days), while there is a 38% chance the stock will gain 5% over the subsequent week, there is a more than 58% chance this will happen in 6 months, and a 68% chance it'll gain 5% over a year (about 250 trading days).

Question 4: What about the possibility of a drop after a rise if you wait for a while?

Answer:

After seeing a rise of 5% over 5 days, the chances of a 5% drop in Seagate stock are about 30% over the subsequent quarter of waiting (60 trading days). However, this chance drops slightly to about 27% when the waiting period is a year (250 trading days).

The table below shows the trend:

[Table: Trefis]


Read the original:

Machine Learning Answers: If Seagate Stock Drops 10% A Week, What's The Chance It'll Recoup Its Losses In A Month? - Forbes

Read More..

Machine Learning Answers: If Twitter Stock Drops 10% A Week, What's The Chance It'll Recoup Its Losses In A Month? – Forbes

The Twitter logo appears on a phone post on the floor of the New York Stock Exchange, Thursday, Oct. 27, 2016. (AP Photo/Richard Drew)

Twitter stock has seen significant volatility over the last few years. While the stock is benefiting from an expanding international user base and improving monetization, slowing growth rates and concerns about its valuation have hurt the stock. Considering the recent price movements, we began with a simple question that investors could be asking about Twitter's stock: given a certain drop or rise, say a 10% drop in a week, what should we expect for the next week? Is it very likely that the stock will recover the next week? What about the next month or a quarter? You can test a variety of scenarios on the Trefis Machine Learning Engine to calculate, if the Twitter stock dropped, what's the chance it'll rise.

For example, after a 5% drop over a week (5 trading days), the Trefis machine learning engine says the chances of an additional 5% drop over the next month are about 31%. This is quite significant, and helpful to know for someone trying to recover from a loss. Knowing what to expect for almost any scenario is powerful. It can help you avoid rash moves.

Below, we also discuss a few scenarios and answer common investor questions:

Question 1: Does a rise in Twitter stock become more likely after a drop?

Answer:

Not really.

Specifically, chances of a 5% rise in Twitter stock over the next month:

= 34% after Twitter stock drops by 5% in a week.

versus,

= 36.5% after Twitter stock rises by 5% in a week.

Question 2: What about the other way around, does a drop in Twitter stock become more likely after a rise?

Answer:

Yes, slightly more likely. Specifically, the chances of a 5% decline in Twitter stock over the next month:

= 30.7% after Twitter stock drops by 5% in a week

versus,

= 34.5% after Twitter stock rises by 5% in a week

Question 3: Does patience pay?

Answer:

According to data and the Trefis machine learning engine's calculations, largely yes!

Given a drop of 5% in Twitter stock over a week (5 trading days), while there is only about a 23% chance the stock will gain 5% over the subsequent week, there is a more than 40% chance this will happen in 3 months.

The table below shows the trend:

[Table: Trefis]

Question 4: What about the possibility of a drop after a rise if you wait for a while?

Answer:

After seeing a rise of 5% over 5 days, the chances of a 5% drop in Twitter stock are about 45% over the subsequent quarter of waiting (60 trading days). However, this chance drops slightly to about 42.5% when the waiting period is a year (250 trading days).

The table below shows the trend:

[Table: Trefis]


Read the rest here:

Machine Learning Answers: If Twitter Stock Drops 10% A Week, What's The Chance It'll Recoup Its Losses In A Month? - Forbes

Read More..

AI and machine learning platforms will start to challenge conventional thinking – CRN.in

As we draw closer to 2020, Rick Rider, Senior Director of Product Management at Infor, shares his predictions on AI.

Moving to Intellectual Digital Assistants. To meet growing enterprise user expectations, AI Digital Assistants will evolve into Intellectual Digital Assistants. Users are no longer satisfied with just telling Digital Assistants what to do and having them automatically execute certain tasks or basic configurations. 2020 will be the year when these digital assistants, using AI and machine learning (ML), start to understand the context of what users are doing, recommend potential next steps (based on completed actions), identify mistakes and auto-correct inputs, and start to engage with users in dynamic, on-the-fly conversations.

AI helps define a new normal. In 2020, AI and machine learning platforms will start to challenge conventional thinking when it comes to enterprise business processes and expected outcomes. In other words, these systems will redefine our default assumptions about what is normal. This will make business process re-engineering and resource training more efficient. When examining supply chain processes, for example, AI platforms have observed that default values related to expected delivery dates and payment dates are typically used only 4 percent of the time. Users almost always plug in their own values. Therefore, AI and machine learning systems will start enabling us to disregard default values, as we understand them today, and act more quickly through trust in our data. We will no longer be beholden to predefined rules, defaults, or assumptions.

Operationalizing AI. Industry-specific templates will make AI easier to use and deploy in 2020. In manufacturing, AI and machine learning systems will take advantage of templated processes to help enterprises better manage their parts inventories, improve demand forecasting and supply chain efficiency, and improve quality control and time-to-delivery. In healthcare, organizations will leverage AI and machine learning to better integrate data that's segregated in application silos, exchange information with partners across the care continuum, and better use that data to respond to regulatory and compliance requirements. And, in retail, companies will use AI and ML to better predict demand patterns and shipment dates, based on defined rules, and improve their short- and long-term planning processes.

Read more:

AI and machine learning platforms will start to challenge conventional thinking - CRN.in

Read More..

Can Google Ever Catch Amazon And Microsoft In The Cloud? – Benzinga

Can Alphabet Inc.'s (NASDAQ: GOOGL) Google Cloud compete with the two industry giants? And if it can't, will it shift its focus elsewhere?

A report in The Information suggested Google is pushing to have its cloud computing services unit beat at least one of the big two in the space, Amazon.com Inc.'s (NASDAQ: AMZN) Amazon Web Services and Microsoft Corporation's (NASDAQ: MSFT) Azure cloud computing services, by 2023.

If it doesn't, the division could lose funding, the article, based on an unnamed source's account of a 2018 meeting, suggests. It's a suggestion Google denies.

"Reports of these conversations from 2018 are simply not accurate, Google said in a statement it sent to The Information after the piece ran.

Cloud services, where businesses and organizations essentially rent the hosting company's computing infrastructure capacity rather than investing in their own, are becoming huge business in a world increasingly reliant on enormous amounts of data, and one where systems are becoming increasingly interconnected.

Observers and analysts tend to think that while Google has been willing to kill off some services (remember Google Plus?), it seems unlikely it would move away from the increasingly essential and potentially very lucrative cloud services business, especially as the world transitions to 5G.

The worldwide cloud services market grew by nearly 40% in the third quarter of 2019.

"They will need to be in cloud," said Tigress Financial analyst Ivan Feinseth. "I dont think they get out of the cloud hosting business theyre doing very well in it."

Google has been touting the growth in its cloud business for several months, with Google executives noting, for example, on a July earnings call that the Google Cloud platform is one of Alphabet's fastest-growing businesses and the third-largest revenue driver for the company.

Data tracking firm Canalys reported in October that Amazon remains the dominant player in the space even as it is seeing its cloud growth slow a bit.

AWS has about 32% of the market share in cloud, while Microsoft's Azure is at about 17%. Google is a distant third at 7%.

But Google's cloud growth was about 70% in the September quarter to just under $2 billion, according to Canalys, edging it into the list of companies legitimately vying for front-runner status.
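Those Canalys figures can be sanity-checked with back-of-envelope arithmetic; this is an illustrative sketch, not Canalys's own calculation:

```python
# If Google's ~$2bn quarter is ~7% of the market, the implied Q3 2019
# totals for the market and the two leaders follow directly. All figures
# are rough, matching the approximate shares quoted above.
google_rev_bn, google_share = 2.0, 0.07
market_bn = google_rev_bn / google_share                # ~28.6
print(f"implied market:    ${market_bn:.1f}bn")
print(f"implied AWS (32%): ${market_bn * 0.32:.1f}bn")  # ~9.1
print(f"implied Azure (17%): ${market_bn * 0.17:.1f}bn")  # ~4.9
```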

Google is investing heavily in that growth, which doesn't suggest it is considering eventually pulling the plug, Feinseth said.

The analyst said in a November note to investors that cloud infrastructure and machine learning are likely to be the future drivers of the company's growth.

"Google continues to invest in the buildout of data centers, along with the hiring of salespeople and engineers, to support its cloud services platform," he said.

Canalys said that in addition to building new cloud data centers, Google's made "major investment in internal sales and partner resources."

The huge promise of, and need for, the cloud computing sector was highlighted most recently in a story that didn't involve Google, as Amazon lost out to Microsoft on a $10-billion, 10-year contract to run cloud computing for the Pentagon, a contract known as Jedi.

While Amazon's AWS, an early pioneer in cloud, has been the industry leader, Microsoft's win on the Pentagon contract instantly put it in the same category.

Amazon disagrees, alleging in a formal protest that the government gave the contract to Microsoft because President Donald Trump wanted "to screw Amazon" because he doesn't like CEO Jeff Bezos. Amazon's still the sales leader in the space.

Related Links:

Microsoft In The Clouds With Q1 Earnings Report On Continued Strong Azure Performance

Microsoft Could Win Next Phase Of Cloud Battle, Wedbush Says



See the original post here:
Can Google Ever Catch Amazon And Microsoft In The Cloud? - Benzinga

Read More..

GDPR and the Cloud – Helpful DPC Guidance for Organisations – Lexology

Are you a controller of personal data under the General Data Protection Regulation ("GDPR") who uses a cloud services provider ("CSP"), or are you a CSP who acts as a processor to a controller customer who has engaged you to provide it with cloud computing services ("CCS")?

If you answered yes to either question, you are required to be aware of the data protection risks associated with the provision and receipt of CCS and to comply with GDPR obligations appropriate to your status as controller or processor of personal data. Helpfully, the Data Protection Commission ("DPC") has issued a CCS guidance note dated October 2019, "Guidance for Organisations Engaging Cloud Service Providers", which is a welcome addition to the range of advice issued by the DPC and provides useful clarification for both customers and suppliers of CCS.

CCS OBLIGATIONS UNDER GDPR

Controllers have an obligation under GDPR to process personal data in a way that ensures appropriate security (as per the data protection principles of integrity, confidentiality and security). The DPC highlights that organisations must ask whether they have appropriate technical and organisational measures in place and ensure their processors do too. The DPC has separately issued guidance for controllers of personal data on data security, which is a reference guide to assessing whether appropriate security measures exist or are required to be implemented. As the DPC states in the CCS guidance, "the use of any cloud services as part of [data controllers'] business is an important area in which organisations need to ensure there is adequate security for the personal data they process".

CLOUD COMPUTING UNDER GDPR

The DPC notes that "people often mean different things when they talk of processing data `in the cloud'", which is undoubtedly true. The CCS guidance is not intended as a detailed guide to cloud computing or different types of CCS and thus generally describes cloud computing, for both controllers and processors, as "usually involves" an external CSP doing some or all of the processing or storage of personal data "on servers and/or in a data centre" under that CSP's control. The DPC notes that CSPs' will "in many cases" be acting as data processors and reminds CSPs to be aware of their obligations as processors, which are less onerous than those that apply to controllers. Whether a CSP is a data processor or controller is a question of fact, which can be a difficult analysis.

TYPES OF CLOUD COMPUTING

The DPC identifies three CCS models, which may involve the provision of a physical infrastructure, operating system, and/or processing software: Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS).

The DPC also discusses the distinction between private, public and hybrid model CCS. It points out the possibility of a chain of CCS applying, where a CSP, acting as a sub-processor, provides CCS to another CSP, which has the ultimate contractual relationship with the customer, generally the data controller. The DPC also points to the complicated scenario arising where CSPs "are also data controllers, or `joint controllers'". Again, the question is always one of fact. Overall, the DPC's references to CCS service models and architecture models accord with the most common industry categorisations.

CCS ASSOCIATED RISKS

The recent CCS boom has offered businesses of all sizes a range of new and favourable storage options. The DPC states it is essential for businesses looking into CCS (or those already engaged with a CSP) to ensure adequate security of personal data being stored in the cloud. Issues may arise where controllers relinquish control of data to their CSP, where there is insufficient information around the service and its safeguards, or where the CSP is unable to adequately support the controller's obligations and/or data subjects' rights. The CCS guidance mainly focuses on CCS risks and recommended steps to remove and reduce such risks.

THE MEANING OF "CONTROL"

The CCS guidance is clear that a data controller "must remain in control of the personal data it collects when it subcontracts the processing to a cloud provider". This is a key obligation, which cannot be waived or contracted out of. If the data controller cannot demonstrate control, it may potentially be in breach of GDPR. The DPC states that control requires:

SECURE CLOUD COMPUTING

Under GDPR, a controller may only engage a processor if the latter provides sufficient guarantees that it will implement appropriate technical and organisational measures. Controllers and processors are responsible for ensuring that such measures are commensurate to the risk. In practice, this is a key area of customer difficulty with the procurement of CCS, which in essence is output measured. Customers do not have, and generally the CSP will not or cannot allow its customers, visibility of what goes on under the hood, whether in real time or on an ad-hoc basis (e.g. by way of inspection or audit). The DPC states that "a controller must therefore be satisfied that personal data will be secure if it is outsourced to a cloud provider". The reference to outsourcing is interesting: the cloud industry long disputed that CCS was a form of outsourcing, outsourcing being a well understood commercial sector in terms of risk management and commercial arrangements. The industry has largely succeeded in creating a commercial and contractual model, as well as a financial model, unique to itself.

The DPC states that, with reference to security, controllers must be satisfied in two main areas, namely that the CSP:

CSP ASSURANCES

The DPC states that controllers must seek assurances from potential CSPs on key issues, including:

Controllers must be satisfied with such assurances both in advance of entering the contract with the CSP and throughout the arrangement. This may be achieved by:

As mentioned above, customer inspection or audit is a difficult topic in the CCS sphere. In practice, more sophisticated CSPs will commission third-party audit-style reports which can be made available to customers. Overall, it is difficult for the customer of a CSP to obtain much if any change to the established supplier financial, technical and contractual model. This is especially true with reference to the large service providers. In certain market sectors, CSPs are more willing to engage in some degree of dialogue, or have pre-prepared responses to the type of requirements listed above, the financial services sector being a prime example. That is arguably as much due to sector-specific regulatory requirements as to the market leverage of the customer base. For customers lacking leverage, or regulatory requirements to reference, contracting with market-leading CSPs is challenging. This includes the public sector, where individual agencies in Ireland are in most cases of modest enough size and thus represent modest enough spend. These specific guidance statements are perhaps the most difficult part of the CCS guidance for data processors to comply with. The more important, but broad, statements in relation to data controllers remaining in control are perhaps not so difficult, if only because CSP contracts deliberately do not express or imply CSP control, a condition CSPs strongly argue against as a matter of fact.

TRANSPARENCY REQUIREMENTS

Under GDPR, the CSP as a processor may avail of approved codes of conduct or certification mechanisms to help demonstrate the compliance of elements of its processing. This allows a controller to assess whether the arrangement is appropriate to the processing operations being contracted. A high level of transparency is required between a controller and data subjects when that controller is processing those data subjects' personal data through a CSP. The CSP must be able to account for its processing operations. The DPC states that a controller must be satisfied as to the CSP's:

LOCATION, LOCATION, LOCATION

Personal data held in the EEA benefits from a common standard of EU protection. Such protection may extend to data transferred outside of the EEA by relying on one of the following mechanisms under GDPR:

CONTRACT PARTICULARS

The DPC states that a number of key points must be covered in the contract between a controller and its CSP, including details of how the CSP will:

The contract must also outline the subject-matter, scope, nature, context, purpose and duration of the processing, and how types and categories of personal data are dealt with at commencement, transfer, routine processing and 'end-of-life' (including return or deletion).

CONCLUSION

Overall, the DPC's guidance offers welcome clarity to those seeking to engage or renew their commitments to a CSP in the age of GDPR. In doing so, organisations should keep the DPC's main message in mind and ask whether they (or their CSP) have the appropriate technical and organisational measures in place. We recently published an article on public sector procurement of CCS, which can be read in conjunction with this GDPR-related article here.

Go here to read the rest:
GDPR and the Cloud - Helpful DPC Guidance for Organisations - Lexology

Read More..

HPE goes on the warpath, seeks to scalp AWS over vendor lock-in – The Register

Interview Migrating your data to the machine that is Amazon Web Services is a little like booking into the Hotel California, the title track from the Eagles' late-1970s album of the same name, the rub being that customers, like guests at the hotel, can check out any time they like but never truly leave.

This is the view of HPE CEO Antonio Neri, who told The Register that, contrary to early claims from the likes of AWS, the cloud is not democratising IT, nor is it an open environment; it is just the latest form of vendor lock-in.

At the Discover event in Munich, Neri (who started out at HPE in a Netherlands call centre and rose through the ranks to replace Meg Whitman as the boss of HPE on 1 February 2018) noted the General Availability of AWS Outposts in early December: an on-premises rack running native AWS or VMware environments that hooks up to AWS's public cloud.

"The first message there is I think [AWS] has finally recognised the world is hybrid," he said. "The world is hybrid and apps and data live everywhere, and so for them to continue to drive the growth they need to bring the cloud to the data."

According to HPE's estimates, three-quarters of data will be created outside of the data centre and outside of the cloud, at the "edge" - everywhere else except the bit barn or the cloud.

"It is cheaper and physically easier to move the cloud to where the data is, not the data to where the cloud is. What the public cloud really wants is your data, [providers] don't care about your workloads, and so once you check your data in the public cloud you are kind of locked in, it is like checking into Hotel California - you check in and you never check out."

Data, he said, has a "gravitational force and once that data gravitates to a place it's very hard to move it. I think the customer is realising now that the cost of [taking] that data out of the cloud is the biggest part of expense."

He said: "Maybe five years ago or 10 years ago the cost of [public cloud] compute was very attractive to [customers] because they don't have to deal with labor, power, cooling, de-appreciation, all the things that you normally go through when you deploy infrastructure, and that was appealing. Plus the simplicity, right, I don't have to deal with all the software, the maintenance aspect of this. I don't have the skill sets to move to the cloud - the cloud should be all about speed and agility, enabling faster delivery of services, all that was interesting and good."

Neri claimed AWS, via Outposts, is giving customers a "tentacle" to extend its public cloud into their own data centre.

"The reality, they've [AWS] already told you, [is it] will cost you more, at least 30 per cent more than moving your data into the public cloud. So it is a way to attract you back to the public cloud. What we want is an open approach, a true, multi-cloud approach where you have choices, you have the flexibility to move data and apps to where it makes more sense, whether it is for security purposes or for experience purposes or cost purposes."

Lots of easier workloads have moved to the cloud (email, dev work) but the vast majority of traditional enterprise-scale workloads remain on-premises. IDC reckons that as of 2023, 70 per cent of these workloads will remain on site.


Neri is an engineer by profession but as CEO he seamlessly slips into salesman talk. "We believe we have a better approach, an open, cloud native approach. Intelligent and secure. And this is why we announced our unique partnership with a company that is led by my friend, John Chambers, called Pensando, and in order to deliver that edge cloud experience you need to implement different architectures."

In October, HPE led a Series C investment round of $145m in Pensando. Pensando sells programmable software-defined cloud, computing, networking, storage and security services at the edge. The cash came from HPE's Pathfinder programme, which invests in startups. As part of the move, HPE CTO Mark Potter joined Pensando's board.

HPE has bet the farm on edge computing, and is directing 75 per cent of its R&D spending over the next few years to develop further innovations that let clients capture, analyse and use real time data at the edge rather than sending it all down the pipes to a cloud, which can be prohibitively expensive and relatively slow.

Data continues to "explode", the CEO told us.

"What a customer realises now is that it's not as simple to manage multiple estates which are siloed. I have a public cloud here, I have on-prem there. I have a traditional set of applications which are not going anywhere. How do I bring that unification experience and have control of that estate? [IT managers have] lost control - first because the line of business swiped the credit card and went to the public cloud. Now as they start doing these trials and moving data to the public cloud, they say 'Oh, this is way more expensive'. And so they realise that the world is hybrid and more and more of those workloads are moving to the edge."

CIOs are trying to figure out if they can be the service provider to their organisation and take over control of "expedience, cost, SLA, compliance and security," Neri said.

HPE's big push in this area has been GreenLake, infrastructure sold as a service, configured and managed by the vendor and paid for via a mix of subscription, pay-per-use and consumption-based models. A private cloud-as-a-service, if you will.

By 2022, HPE wants to sell its entire portfolio as a service, though its hardware and software will still be able to be bought in the classic way.

As of June last year, just 5 per cent of HPE's turnover was transacted as-a-service, so it's got a lot of mileage to make up.

With this in mind, HPE recently confirmed availability of GreenLake Central, a self-service portal and ops console intended to let heads of IT working for end user businesses monitor costs, security, compliance and the use of tech resources in terms of clouds (AWS, Azure and - soon - Google Cloud Platform) and what they have on-premises. As we pointed out at the time, the tool has limitations.

Neri told us that through this tool, HPE is passing "back control" to the department, and then "giving them the choice to either be in the running [things] side of IT or be in the innovation side of IT."

Customers want choices, "not locking in", he said. "If you think about the past, customers used to have two vendors on premises, sometimes three. Now we can give them multiple cloud options we still give them an experience, just as a service."

HPE and multiple other vendors have talked about customers trying to escape AWS but very few have gone public with their frustrations and the cost of doing so. Neri said customers are asking for help and the advice is take a "very app and data centric approach". He said the splitting of HP let HPE change the way it uses tech, with a plan to be more nimble.

"We're shutting apps all the time. We are consolidating a number of apps because we are consolidating the way we run the company. So the first question [for customers] is how many apps you can get rid of? How many apps can you run in SaaS? And then ultimately, of the apps that are remaining: how many apps can you put in the cloud?

"Then, based on the data and the experience you want to provide... for customers it can be a hybrid model, but we give them the full experience so customers are asking us for help from an advisory perspective first of all, and then re-platforming of certain apps in a cloud native approach."

This plays to HPE's container launch that happened in November, as do most of the things Neri said during our 30-minute discussion with him. He isn't alone in this respect. Most CEOs try to discuss issues that conveniently dovetail with the latest and greatest thing they are selling.

AWS is by far the largest public cloud seller in the industry, but how about Microsoft and Google: do they lack an open approach in the same way HPE claims AWS has?

"Google is trying to become hybrid but they have some technology challenges in terms of scale because, you know, the way they architected the cloud was for Google, and if you need a small cloud, like say in a closet, Google does not scale at the rack level or the server level," Neri tells us.

"I think Microsoft definitely has a much more clear strategy on hybrid and that's why Hewlett Packard Enterprise is the biggest provider of Microsoft Azure today because we have done deep engineering integration of Azure," said Neri.

This is true. Sadly for HPE, Microsoft didn't keep up its end of the bargain.

When the alliance over Azure was struck in 2015, after HPE ditched its own public cloud, Microsoft pledged to use HPE servers to bulk out its own data centres, but then opted for cheaper kit.

AWS, Neri says, has been "very clear since the beginning - it's the public cloud, the public cloud, until a year ago when it came up with the concept of Outposts, but that I think is nothing more than just a way to still bring you in the public cloud, particularly on the pricing".

Vendor lock-in isn't a new term, nor one that pertains only to cloud vendors; it also describes a range of technology companies that started life in the last century.

Neri concedes that in mission-critical systems, HPE has not always been so open.

"But even then we run on an open-source Linux solution, so I think the reality is you have to define what lock-in means. In my view of HPE, that is a definition of optimising a stack for a specific workload, which is to provide the best possible experience for the lowest possible cost, from silicon all the way to the management layer of that infrastructure, all the way to the PaaS, potentially. And that can be a full stack of HPE. I don't call that lock-in because we are trying to do the right thing."

HPE seems to have a much clearer strategy since its split with HP Inc, and the sale of the Enterprise Services and Software sides of the house. But this isn't necessarily feeding through into the financial results yet.

In fiscal '19, ended 31 October, total revenues were down 6 per cent. Neri prefers to highlight that revenues were down just 2 per cent in constant currency (because 60 per cent of HPE's business is "transacted outside the US") and excluding the fall in sales of servers to tier one cloud builders.

Sales were down for each of the divisions - Hybrid IT, Pointnext, and Intelligent Edge - and Financial Services shrank too. Operating margins were, however, up hundreds of basis points, rising to 13.2 per cent in Hybrid IT, a historic high. Of course HPE Next has reduced overheads, including through redundancies, and this, in addition to a mix of higher-margin products, has helped improve profit.

HPE today remains a hardware company, as evidenced by the profile of its revenues, but the future looks software and services shaped.

For 2020, HPE has told investors to expect "sustainable, profitable growth". And judging by the direction of travel with regard to acquisitions (MapR, Blue Data, Cloud Cruiser and Red Pixie), will there be more investments in services and software businesses?

"Correct, absolutely spot on... as I think about inorganic, which is always on the premises of return on invested capital, and bringing in IP and talent, definitely it's more software and more services oriented."

So one-time CEO Leo Apotheker maybe wasn't so wrong in his plans for HP all those years ago when he plotted to reduce the company's reliance on boxes. Maybe he lacked finesse in execution.

The Reg has asked AWS for comment.


Follow this link:
HPE goes on the warpath, seeks to scalp AWS over vendor lock-in - The Register

Read More..

XRP is Down 95% from Its 2018 Peak; What's Next for the Embattled Crypto? – newsBTC

Despite what the news may imply, Bitcoin (BTC) is up on the year, having posted an over 70% gain. XRP, Ethereum, and other altcoins, on the other hand, have suffered, plunging under the weight of a volatile Bitcoin.

Some cryptocurrencies have been hurt more than others. The hardest-hit altcoin in the top 10 has to be XRP, which analysts recently noted has collapsed by 95% since the euphoria of the 2018 peak.

Just look at the chart above, which shows that the cryptocurrency really hasn't had the best past 24 months.

While 95% is already a painful loss for a cryptocurrency, let alone an asset in general, some say that it may get worse from here. Here's more on why.

According to prominent cryptocurrency trader and commentator Jacob Canfield, there may still be a ways to go before it makes sense to even try investing in the cryptocurrency for the long haul. He even went as far as to say that one of his achievements over the past few years was telling his friends and family not to buy XRP in 2018 and 2019, due to the technical outlook on the charts.

Canfield added that to even consider recommending a long, he would wait until the cryptocurrency trades at $0.10 to $0.15, some 50% to 25% lower than current prices.
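Those two thresholds are mutually consistent, as a quick back-of-envelope check shows (illustrative arithmetic only, not Canfield's own data):

```python
# $0.10 being 50% below, and $0.15 being 25% below, imply the same
# current price of about $0.20.
print(0.10 / (1 - 0.50))  # 0.20
print(0.15 / (1 - 0.25))  # 0.20
# A 95% drawdown to ~$0.20 in turn implies a 2018 peak near $4.00.
print(0.20 / (1 - 0.95))  # ~4.0
```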

That's not to mention that XRP-related sentiment recently took a hit.

Speaking to Forbes, Joe DiPasquale, CEO of cryptocurrency hedge fund manager BitBull Capital, said that the aforementioned altcoin is likely to suffer more, despite XRP already trading at its lowest price in over two years. The prominent Bitcoin investor remarked that:

XRP has been historically very sensitive to adoption-related news pertaining to banking and money services partners, since they represent the biggest clientele for Ripple's services. We expect the cryptocurrency to slide further due to this news.

Not all is lost for Ripple, though.

According to a report from Fortune, the company has just completed a massive funding round that values it at $10 billion, making Ripple one of the largest blockchain companies on the market at the moment. The $200 million investment that gave it the $10 billion valuation was sourced from global investment firm Tetragon, Japanese pro-XRP finance-centric conglomerate SBI, and Route 66 Ventures.

Apparently, the funds, while not needed to fund operations, will give the company "balance sheet flexibility" as it looks to hire upwards of 150 new employees in 2020 and introduce new overseas offices to facilitate said employee additions.

This Series C raise should help Ripple increase the adoption of solutions using XRP, potentially increasing demand for the cryptocurrency and, therefore, price with time.

Read the original post:
XRP is Down 95% from Its 2018 Peak; What's Next for the Embattled Crypto? - newsBTC

Read More..

BitGo Will Comply with FATF – Cryptocurrency Regulation – Altcoin Buzz

Custodial platform BitGo is going to launch a streamlined, opt-in solution to comply with the Travel Rule regulation suggested by the FATF. It will start functioning in April 2020.

BitGo has announced it is ready to protect its customers from the potential impact of the Travel Rule. Thus, it released a specialized solution.

"This solution will minimize workflow changes and allow exchanges and other clients to continue leveraging BitGo's robust, cross-blockchain API platform by simply adding well-defined parameters when sending funds. We are targeting April 2020 to complete the implementation. This will put us ahead of the June 2020 deadline and give our clients time as well," the company advises in its blog.

Currently, BitGo is processing 20% of all Bitcoin transactions.

The Financial Action Task Force (FATF) is an intergovernmental organization that develops global standards against money laundering and terrorism financing.

It was established in 1989 by the decision of the G7 countries. Its members include 35 countries and 2 organizations.

It also assesses compliance with national AML/CFT legislation. On June 21, it issued its recommendation known as the Travel Rule. According to it, all transactions exceeding $1,000 must contain details of both the sender and the beneficiary. The new rule requires VASPs (virtual asset service providers) to collect information about their clients; the category covers crypto exchanges as well as wallet providers. The goal is to stop money laundering and other illegal activity in digital asset transfers.
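BitGo has not published the technical shape of its opt-in solution, but the rule itself is simple to express in code. A hypothetical VASP-side check might look like the following sketch; the class and field names are illustrative, not BitGo's API:

```python
from dataclasses import dataclass
from typing import Optional

TRAVEL_RULE_THRESHOLD_USD = 1_000  # FATF threshold for required identity data

@dataclass
class TravelRuleInfo:
    originator_name: str        # sender identity details
    originator_account: str
    beneficiary_name: str       # recipient identity details
    beneficiary_account: str

def validate_transfer(amount_usd: float, info: Optional[TravelRuleInfo]) -> None:
    """Reject transfers above the threshold that lack Travel Rule data."""
    if amount_usd > TRAVEL_RULE_THRESHOLD_USD and info is None:
        raise ValueError("Travel Rule data required for transfers over $1,000")

# A $1,500 withdrawal must carry both sender and beneficiary details:
validate_transfer(1500.0, TravelRuleInfo("Alice A", "acct-001", "Bob B", "acct-002"))
```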

At the moment, there is no global consensus that would regulate cryptocurrency transactions worldwide. Exchanges in different countries work according to different legislation. And a clear mechanism for crypto regulation is still to be developed.

At the same time, the FATF asks the exchanges for a report by June 2020 to check on progress.

In October, Altcoin Buzz reported about Binance's steps toward compliance with the FATF regulations.

Go here to see the original:
BitGo Will Comply with FATF - Cryptocurrency Regulation - Altcoin Buzz

Read More..