Why Prudential’s Top Line Is Likely To Shrink Nearly $1 Billion In 2019 – Nasdaq

Prudential Financial (NYSE:PRU) is a global financial company with operations in the United States, Asia, Europe, and Latin America. Trefis details the key components of Prudential Financial's revenues in an interactive dashboard, along with our forecast for 2019-2020. While the company's total revenue has jumped 7% from $58.8 billion in 2016 to almost $63 billion in 2018, we expect it to drop 1% (roughly $900 million) to $62.1 billion in 2019, primarily due to a $1.6-billion reduction in revenues for its U.S. Workspace Solutions division. This decrease would be partially offset by growth in other divisions such as International Insurance ($400 million), U.S. Individual Solutions ($100 million), Corporate & Closed Block ($100 million) and Global Investment Management ($100 million).

Notably, we expect Prudential's International Insurance segment to contribute the biggest chunk of revenues (36% of the total) for 2019, surpassing the U.S. Workspace Solutions division as the company's largest operating segment. You can make changes to our forecast for individual revenue streams in the dashboard to arrive at your own forecast for Prudential's revenues.

What To Expect From Prudential Financial's Revenues?

Details about how trends in Prudential Financial's revenues compare with those of peers MetLife, AIG and Hartford Financial are available in our interactive dashboard.

(A) International Insurance revenues would cross $23.2 billion by 2020

(B) U.S. Workspace Solutions grew 24% over the last 2 years from $18.2 billion in 2016 to $22.5 billion in 2018.

(C) U.S. Individual Solutions revenues would cross $11 billion by 2020, although they would grow at a slower pace.

(D) Although Corporate & Closed Block revenues have decreased 37% over the last 2 years, they are expected to increase 4% in 2019.

(E) Global Investment Management revenues are expected to hover around $3.4 billion by 2020.

Our interactive dashboard for Prudential Financial details what is driving changes in revenues for Prudential Financial's U.S. Individual Solutions, Corporate & Closed Block and Global Investment Management segments.

Trefis estimates Prudential Financial's stock (including cash and valuation analysis) to have a fair value of $108, which is roughly 15% higher than the current market price. (Our price estimate takes into account Prudential Financial's earnings release for the third quarter.)

The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.

Google Photos vs iCloud Photos: Which One to Use on iPhone – Guiding Tech

You must have heard a lot of buzz around Google Photos. And you must be wondering why everyone is talking about it when we already have iCloud Photos on iPhone. Since that's the question that brought you here, you won't be disappointed. You will get to know the similarities and the differences between Google Photos and iCloud Photos.

While Google Photos is a proper app on iPhone, you will find the iCloud functionality inside the Apple Photos app. There is no separate app called iCloud Photos. iCloud is an online storage service offered to Apple users, and iCloud Photos is part of it. Even though Google Photos also saves your images online in the cloud, it doesn't use iCloud; the images are stored in Google's cloud.

Let's start the comparison between Google Photos and iCloud.

Both Google Photos and Apple Photos are gallery apps with added functionalities. You can use both of them to view the photos taken on iPhone or iPad without uploading them to the cloud. The feature to take a backup to the cloud is optional in both.

In Google Photos, you need to activate it from the app settings by enabling Back up & sync. Once enabled, find out who can see your images in Google Photos.

In the Apple Photos app, you need to open the Settings on your phone and go to Photos. Then turn the toggle on for iCloud Photos.

There's another difference between the two. In Google Photos, the existing images in the Google Photos cloud automatically sync to your iPhone even if Back up and sync is disabled. They do not occupy any storage on your phone as they are present in the cloud. On the other hand, you need to enable iCloud Photos to sync your existing photos in iCloud to your iPhone.

iCloud service is limited to Apple devices such as iPhone, Mac, and iPad, where it comes preinstalled. Fortunately, you can access iCloud Photos via the web too. Google Photos is available on iOS and Android. You can even use it on the web.

When it comes to the account, iCloud relies on your Apple ID. Google Photos, as is evident from its name, uses Google accounts only.

Since the main use of both of them is to save your pictures in the cloud, the available storage space plays an important role. On iCloud, you get only 5GB of free storage, and that includes space for all apps such as Photos, Mail, Notes, Contacts, and even Drive. If you want, you can upgrade your storage plan or free up iCloud storage.

On the other hand, Google offers 15GB of free space. Here also, the cloud storage is shared between various Google products such as Photos, Drive, Keep, and more. Read in detail about what counts in Google Drive storage.

However, Google Photos has a major advantage over iCloud as it lets you upload an unlimited number of images. For that, you will have to go with the High-Quality setting instead of Original Quality when asked how you want to upload.

The catch is that your photos are compressed to 16MP and videos to 1080p. For a regular user, that would be sufficient. But if you are a photographer, you might want to go with the Original Quality mode and then later upgrade your storage. Another benefit is that you can change your existing uploaded photos to High Quality too without uploading them again.

At first glance, both apps look similar, with various tabs at the bottom. But when you look closely, there are huge differences. For starters, Google Photos shows the most recent photos at the top, making it easy to access them. On the other hand, the Apple Photos app displays them in chronological order, with the recent ones at the bottom.

Secondly, while the search is present at the top in the case of Google Photos, it appears as a tab in the bottom row in Apple Photos. Further, both let you change the layout of your photos. In Google Photos, it has to be done via settings, while in Apple Photos you can change it via the back button at the top.

One helpful thing about Google Photos is that it makes it quite clear what's backed up to the cloud and what's not. You will see a small cloud icon with a slash over it on the photos that are saved on your phone only and not in the cloud. The images and videos saved in the cloud will not have any icon on them. There's no way to know what's syncing in Apple Photos.

While both let you edit your offline and online photos, Apple Photos provides more editing options such as changing curves and colors.

However, once you have edited your pictures, Apple Photos replaces the existing photo on your phone and in iCloud (if enabled). Your old photo is gone. Fortunately, Google gives you an option to save the edited photo as a copy instead of replacing the original one.

If you dont like the default editing features, you can download third-party editing apps on both Android and iOS.

As an iPhone user, if you like the searching capability in Apple Photos, you will love it in Google Photos as you can search for anything, and it will show you the results without fail.

However, there's a drawback in Google Photos. The search only works for photos that are saved online. So if you are using it as a simple gallery app on your iPhone, the search won't show offline photos. That's not the case with Apple Photos, where search works for photos both on your phone and in iCloud.

While both offer facial recognition, location tagging, and grouping of photos based on themes, Google Photos is slightly ahead as it applies impressive machine learning features. For instance, it will automatically create a video based on important moments from your photos. Similarly, it includes a built-in Google Lens that further offers intelligent features such as copying text, identifying pictures, and more.

When the online capability is enabled in the apps, they continuously upload your new images to their respective cloud services. In both apps, photos are stored both on your phone and in the cloud by default.

If your iPhone runs low on storage, you can free up local storage. Even though both offer the feature, it works differently in each. In Google Photos, when you use the Free up space option, the local photos are deleted completely from your phone. But they stay in the cloud, and in the app you only see a thumbnail. Tapping on the photo will fetch it from the cloud and show it to you. You need a working internet connection to view such photos.

On the other hand, when iCloud is enabled, you get an option in the form of Optimize iPhone Storage to free up local space.

When enabled, your phone keeps the original photos and videos in iCloud and replaces them with compressed versions to save space on the phone. Unlike Google Photos, where the images are deleted entirely from your phone and take no space afterward, with Optimize Storage in iCloud the pictures continue to take a small amount of space because a local copy (even though small) still exists.

Google Photos lets you share your images and videos in multiple ways. You can either share them via the installed apps or post them to social media directly. You can even create a link and share that with others via any platform.

Moreover, you can share photos with Google Photos contacts or create shared albums where others can add photos too. Besides that, you can even share your entire library with someone using the Partner account feature. Know in detail how Google Photos sharing works.

Apple Photos or iCloud offers fewer ways to share. You can either create a shared album, which works only on Apple devices and Windows, or share the pictures via installed apps such as WhatsApp, Facebook, etc. In the latter case, the recipient doesn't need to have an iCloud account as the photo is shared directly.

In a nutshell, you can say Google Photos is a backup and sync service while iCloud Photos only offers sync capability. Meaning, once you delete a photo, it's removed from everywhere. That's not the case with Google Photos, where the photos are backed up to the cloud, and you can free up space on your phone. So if you are looking for a backup service, Google Photos is the right choice.

Further, it's available across all platforms. You can access it on all your devices, unlike iCloud, which is limited in nature. What's the point of saving pictures in the cloud if you can't access them everywhere? Last but not least, you get unlimited storage in Google Photos. What else does one want from a cloud service?

If you feel like disabling iCloud, know what happens when you do it. You can even uninstall Google Photos if it doesn't suit your taste. And if you liked both, you can continue using them together. Each works separately, so it wouldn't be an issue. And if you don't like either, do check out their alternatives on iPhone.

Next up: Added a wrong account in Google Photos? Learn how to remove an account and what happens when you do so.

Last updated on 16 Dec, 2019

A Data Leak Exposed The Personal Information Of Over 3,000 Ring Users – BuzzFeed News

"This gives a potential attacker access to view cameras in somebody's home; that's a real serious potential invasion of privacy right there."

Posted on December 19, 2019, at 10:58 a.m. ET

The log-in credentials for 3,672 Ring camera owners were compromised this week, exposing log-in emails, passwords, time zones, and the names people give to specific Ring cameras, which are often the same as camera locations, such as bedroom or front door.

Using the log-in email and password, an intruder could access a Ring customer's home address, telephone number, and payment information, including the kind of card they have, its last four digits, and its security code. An intruder could also access live camera footage from all active Ring cameras associated with an account, as well as a 30- to 60-day video history, depending on the user's cloud storage plan.

We don't know how this tranche of customer information was leaked. Ring denies any claims that the data was compromised as part of a breach of Ring's systems. A Ring spokesperson declined to tell BuzzFeed News when it became aware of the leak, or whether it affected a third party that Ring uses to provide its services.

"Our security team has investigated these incidents and we have no evidence of an unauthorized intrusion or compromise of Ring's systems or network," the spokesperson said. "It is not uncommon for bad actors to harvest data from other companies' data breaches and create lists like this so that other bad actors can attempt to gain access to other services."

It is not clear which other companies' data breaches the spokesperson was referring to.

The Ring spokesperson added that the company will notify customers who were affected and require them to reset their passwords. An affected customer told BuzzFeed News that they received a notice on December 18.

Security experts told BuzzFeed News that the format of the leaked data, which includes username, password, camera name, and time zone in a standardized layout, suggests it was taken from a company database. They said data obtained via credential stuffing, when previously compromised emails and passwords are used to get access to other accounts, would likely not display Ring-specific data like camera names or time zones.

"One could argue that the person maybe got these through credential stuffing," Cooper Quintin, a security researcher and Senior Staff Technologist at the Electronic Frontier Foundation, told BuzzFeed News. "But if that was the case, why did that person go through and add the information about camera names and time zones?"

Quintin described the leak as stunning.

"This gives a potential attacker access to view cameras in somebody's home; in some of these cases, that's a real serious potential invasion of privacy right there," he said.

Screenshots of the email sent to Ring customers on December 18.

BuzzFeed News was alerted to the leak by a security researcher, who claimed he used a web crawler to search the internet for any data leaks pertaining to Ring accounts. The security researcher found the list of compromised credentials posted anonymously on a text storage site.

The security researcher called Ring's customer support number, according to a call log screenshot shared with BuzzFeed News. He said that a representative told him that they were unable to assist. After he posted about the leak on a cybersecurity-focused subreddit on December 16, a person who claimed to be a member of Ring's security team messaged him. According to screenshots shared with BuzzFeed News, the self-identified member of Ring's security team said that the leak represented compromised data that the company previously did not know about.

The security researcher said he wasn't surprised that Ring's data was exposed, because WiFi-enabled smart home devices are inherently vulnerable to hacks and data leaks.

"It's an open door," the security researcher said, "and they just don't realize it."

BuzzFeed News verified the leak by confirming the exposed information with four individuals whose log-ins were compromised. When contacted, all of these individuals said that Ring did not notify them that their log-ins were exposed. None of them had two-factor authentication enabled on their Ring accounts.

Ring does not alert users of attempted logins from unknown IP addresses, or tell users how many others are logged into an account at one time. Because of this, there is no obvious way to know whether any bad actors have logged into people's compromised Ring accounts without their consent.

"I never thought that this would happen with a security company," one of the affected users told BuzzFeed News. "I'm a little taken back from it."

"If there was a breach, all that information is out there, and you had a list of the cameras and camera names. They need to alert customers, and that information needs to be taken care of," the affected user added.

All of the affected users said that they had changed their passwords, but that they had no plans to uninstall their security cameras or stop using Ring's products and services.

"This illustrates that when you bring an internet-connected camera into your home, you're also potentially bringing anyone on the internet into your home," Quintin said.

Over 700 police departments in the US have signed contracts with Ring. These contracts give police access to the company's Law Enforcement Portal, which allows police to request camera footage from residents without receiving a warrant. In exchange, Ring often gives police free cameras, and it offers police more free cameras if they convince enough people to download its neighborhood watch app, Neighbors.

This data leak is the latest in a string of incidents involving compromised Ring accounts. The home surveillance camera company, which Amazon acquired in 2018, has been targeted by hackers, who used the cameras to harass children and families while documenting their actions on podcast livestreams. In November, cybersecurity company Bitdefender published a white paper describing a now-resolved vulnerability that allowed hackers to physically intercept communications between Ring Video Doorbell Pros and a person's WiFi network.

"There have been a number of pretty stunning breaches with Ring devices in the last few weeks," Quintin said, "and it seems to me like Ring is more interested in making friends with and providing information to police than it is in actually protecting its customers' security."

QLD TAFE to rip out legacy Veritas tech in cloud shift – ARNnet

Queensland TAFE is planning to ditch its legacy Veritas backup and disaster recovery software following its move towards a centralised, cloud-based infrastructure.

The educational provider is seeking a partner to unify its mishmash of data protection solutions into its recently centralised NextDC data centres, as well as Amazon Web Services and Microsoft Azure.

The migration will see the partner replace TAFE's Veritas Backup Exec and NetBackup appliances, which are held within 19 servers spread across 11 sites throughout Queensland.

According to an invitation to offer, the backup software is at different stages of its life cycle, with some of it outside of vendor support and with no cloud storage in place for backups.

"The technology stack has evolved from legacy and without a single consistent model resulting in a diverse set of technologies being deployed," the document read.

"The reduced support of the technology stack has meant that it now has limited capacity, complex structures and large amounts of equipment that has reached end-of-life."

Specific requirements include a centralised management system for data storage, plus cloud capability for either primary or secondary backups.

In doing so, TAFE intends to reduce the number of vendors, products and services involved in the solution.

Despite the migration, some data backup will remain in six regional hub sites outside the data centres for a period of time into the future.

"However, over time most data will reside in either private cloud (central DCs) or public cloud with minimal backup requirements at other sites," TAFE claimed.

Potential suppliers have until January 27 to respond to the tender, with a projected start date of 27 February 2020 and completion by the end of the financial year.


The three Cs of Actifio 10c: Clouds, containers and copy data – Blocks and Files

Actifio has released a major update of its eponymous copy data manager software.

Actifio 10c focuses on three Cs: cloud, containers and copy data. The company emphasises backup as a means of getting data into its orbit and using it for archive, disaster recovery, migration and copy data provisioning within and between public clouds and on-premises data centres.

CEO Ash Ashutosh said in a press briefing in Silicon Valley last week: "Copy data begins with backup and goes all the way to archive." He said cloud backup is traditionally a low-cost "data graveyard" but Actifio provides instant access and re-use.

Warming to his theme, Ashutosh added: "The disaster recovery workload is just metadata," claiming 10c offers single-click DR. In this worldview, virtual machine recovery, migration and disaster recovery are just another form of copy data management.

Peter Levine, general partner at Andreessen Horowitz, the venture capital firm which has invested in Actifio, said in the same briefing that where software eats the world, data eats software. "Actifio is in the exact right place: hybrid cloud," he said. "Hybrid and multi-cloud have to work together to move data seamlessly between all these repositories. It was ahead of its time but the time has now grown into Actifio, [which is] right on the cusp of cracking open a new layer in the software stack."

10c backs up on-premises data to object storage in the cloud and supports seven public clouds: Alibaba, AWS, Azure, GCP, IBM COS, Oracle and VMware.

Actifio's objects can be used to instantly recover virtual machines. However, at this point, only AWS, Azure, GCP and IBM COS are supported for direct-to-cloud backups of on-premises VMware virtual machines.

The 10c product stores database backups in the cloud using their native format and can clone them. Actifio positions this as a facility for in-cloud test and development. Developers and testers will use Jenkins, Git, Maven, Chef or Ansible and request fresh clones through them, via a 10c API. The Actifio software sends database clones from these backups to the testers' containers running in the AWS, Azure, GCP and IBM COS clouds.
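
The article describes this workflow but not the API itself, so the following is only a minimal sketch of the pattern: a CI job (a Jenkins stage, for example) asking the copy data manager for a fresh database clone over REST before running integration tests. The endpoint path, payload fields, environment variable names and token handling are hypothetical placeholders, not Actifio's documented interface.

# Hypothetical sketch only: the endpoint, payload fields, and token are
# placeholders rather than Actifio's actual API. It illustrates the pattern
# described above: a CI tool requesting a fresh database clone, then running
# tests against it.
import os
import requests

BASE_URL = os.environ["COPY_DATA_MANAGER_URL"]      # e.g. https://cdm.example.com
API_TOKEN = os.environ["COPY_DATA_MANAGER_TOKEN"]   # injected by the CI system

def request_clone(app_name: str, target: str) -> str:
    """Ask for a clone of the latest backup of app_name, mounted to a test container."""
    resp = requests.post(
        f"{BASE_URL}/api/clones",  # hypothetical endpoint
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"application": app_name, "source": "latest-backup", "target": target},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["clone_id"]

if __name__ == "__main__":
    clone_id = request_clone("orders-db", "test-runner-42")
    print(f"Provisioned clone {clone_id}; run tests against it, then release it.")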

10c also brings simple wizards for SAP HANA, SAP ASE, Db2, MySQL database backup and recovery and external snapshot support for Pure Storage and IBM Storwize arrays.

Actifio's objects are self-describing, which aids their movement between clouds as they carry their metadata within them. "You can't scale with a separate object metadata database," Ashutosh said. He noted that object storage supplier Cloudian, for example, uses a Cassandra database for metadata.

The 10c speed angle is strengthened by Actifio's ability to create and provision 50TB clones of an Oracle database from a 17TB object, and deliver them to five test developers as virtual object copies in eight minutes. It can deliver five production copies, in block format, in 13 minutes. (An IBM/ESG document describes this test.) Actifio said Oracle RAC's own procedures would take 90 minutes at best, and possibly days, to produce five block-based copies.

An on-premises cache can be used to speed self-service on-premises recoveries and lower cloud egress charges. The device uses SSD storage and can cache reads from and writes to cloud object storage to increase overall IO speed.

Actifio 10c will be generally available in the first quarter of 2020. The enhancements in Actifio 10c will also be available in deployments of Actifio GO, the company's multi-cloud copy data management SaaS offering, as well as Actifio's Sky and CDX products.

Actifio has more than 3,600 enterprise customers in 38 countries. Hitachi and NEC are big resellers in Japan, and Lenovo is also a reseller. IBM resells Actifio's software as its Virtual Data Pipeline, and this competes somewhat with IBM's own Recover software. There is no partnership with HPE or NetApp but, Ashutosh says: "We're friendly with Dell EMC."

Actifio will sell software in the SMB market through resellers.

Endurance International hosted Cloudbazaar 2019 the premier event for cloud, domains and web hosting – ETCIO.com

Cloudbazaar 2019, held at NESCO Grounds in Mumbai on December 6, aimed to highlight new opportunities in the SMB market for web professionals. The 8th edition of the Cloudbazaar conference & tradeshow attracted web designers, web developers and web solution providers. Each year, the conference hosts global tech leaders and decision makers to discuss emerging tech trends in the web ecosystem. As a thought-leadership platform, it helps the web professional community advance, upskill and win, as well as providing an opportunity for attendees to interact with key decision makers and discover new partnerships with peers from the industry.

This year, the conference was themed around the growing internet economy and the imperatives of digitalisation for the SMB market in India. The event saw a whopping 3,000+ pre-event registrations by developers, designers and digital entrepreneurs.

The tone was set by Manish Dalal, SVP & GM APAC, Endurance International Group (EIG), as he underlined web presence and the growth of digital streams in India. He also shared the success story of the web professional community. His keynote address was followed by an expert session by Romain Cholat, VP & GM, Verisign. Cholat highlighted the wave of domain names and mapped the growth of the internet economy.

Touted as India's biggest event for the Internet Infrastructure Industry, the event saw a footfall of over 1,000 attendees. It promoted diversity of audience and speakers from different business channels such as domains, web development, cloud storage, web designing and more. Sidharth Malik, Chief Revenue Officer, Freshworks; Harish Vellat, SMB & Corporate Business Leader, Microsoft; Sanjay Mehta, Joint CEO, Mirum; Pari Natarajan, CEO, Zinnov; Ruslan Synytsky, CEO, Jelastic; Rama Aleti, Co-founder & Design Director, Think Design; and Yvonne Doll, Sr. UX Designer, Jetpack were among the prominent speakers at the conference. The speakers shared their knowledge and expertise on topics ranging from technology, design, digital marketing, business and more.

The panel discussion highlighted several challenges faced by SMEs while adopting digital technologies. The panellists also emphasised the growth of opportunities in the Indian market for web pros and the importance of digital transformation.

The event also saw 20+ partners, including ORG, Verisign, ZOHO, Google Cloud, Jetpack, the .CO registry and others, who showcased their products and solutions at the exhibition centre. Workshops and parallel track sessions on digital marketing, application development and more engaged the delegates with business use cases and industry insights.

The sessions and presentations held at the conference deliberated on the inevitable digital transformation that businesses adopted and invested in during 2019. Most of the discussions highlighted the future impact of digitalisation on the web pro community and the new opportunities emerging in the market.

This year, Cloudbazaar had a few new things. Firstly, it lent its support to the India STEM Foundation, an NGO that works to popularise science, technology, engineering and mathematics through workshops, events, teacher training and more. Students and volunteers from the NGO were present at the event and given an opportunity to interact with the attendees. Secondly, Cloudbazaar hosted speed networking sessions for attendees to network in a formalised manner and meet large numbers of new potential business partners in a very short period of time. Both of these were first-time initiatives and were well received.

AWS Outposts Is GA. Now What? – CRN: Technology news for channel partners and solution providers

AWS Outposts, the public cloud leader's once-unthinkable leap into corporate data centers, is here. Sort of.

The Amazon Web Services-outfitted, on-premises server rack officially went GA during the AWS re:Invent conference a couple of weeks back. Now launch partners are busy training staff, starting conversations with customers, and waiting for AWS to tell them when the offering will actually ship.

The availability and capabilities of Outposts are "probably the No. 1 question coming out of re:Invent from our customers," Jeff Aden, executive vice president of marketing and strategic business development at Seattle-based cloud consultancy 2nd Watch, told CRN. "[AWS CEO] Andy [Jassy] said it's available, and we have lots of clients asking about this."

Despite being a launch partner and one of the standouts in Amazon's channel, 2nd Watch still doesn't know when those engagements can begin in earnest, regardless of the GA status.

Large enterprises, especially those operating manufacturing plants or with remote regional facilities, are eager to hear how Outposts can fill the gap in their IT environments for workflows with proximity or latency requirements, Aden said.

2nd Watch technicians went through training before the annual AWS conference in early December, and now the consultancy's sales agents are going through the enablement process as they prepare to move early sales discussions to the purchasing phase.

But selling Outposts will not be as simple as an online transaction, at least not initially.

"It's going to be an interesting product," Aden said. "Amazon is going to ship, deliver, rack, stack, set up."

Outposts will first offer AWS EC2 instances and EBS block storage, as well as the ECS and EKS hosted container services. S3 storage will be added in the first half of 2020, Jassy said at re:Invent.

The "AWS native" variant for customers that want to use Amazon APIs and control plane will ship first. For customers that want to use a VMware control plane, possibly in conjunction with VMware Cloud on AWS, that option will be available come early 2020, Jassy said.
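
Because the "AWS native" variant exposes the same EC2 APIs and control plane as the public regions, placing a workload on an Outpost is, in principle, the familiar subnet-plus-instance workflow, just anchored to the rack's Outpost ARN. A rough boto3 sketch follows; the VPC ID, Outpost ARN, AMI and instance type are illustrative placeholders, not values from the article.

# Rough sketch using boto3: create a subnet tied to the Outpost, then launch
# an EC2 instance into it. All identifiers below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

# A subnet created with the Outpost's ARN, in the Availability Zone the rack
# is anchored to, is what ties instances to the on-premises hardware.
subnet = ec2.create_subnet(
    VpcId="vpc-0123456789abcdef0",
    CidrBlock="10.0.8.0/24",
    AvailabilityZone="us-west-2a",
    OutpostArn="arn:aws:outposts:us-west-2:111122223333:outpost/op-0abcdef1234567890",
)["Subnet"]

# From here it is the standard EC2 launch call; the subnet placement is what
# lands the instance on the Outpost rather than in the parent region.
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="m5.xlarge",
    MinCount=1,
    MaxCount=1,
    SubnetId=subnet["SubnetId"],
)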

Kelly Hartman, global head of AWS Partner Network, told CRN during re:Invent that AWS is finalizing the details of when the product will ship and how ongoing infrastructure management duties will be divided with consulting partners.

There's no special partner accreditation for Outposts, Hartman said, as AWS designed the system to be operationally identical to its traditional cloud offerings.

"There isn't anything different," she said. "We extend all the services."

For that reason, Hartman told CRN she sees the biggest play for partners, at least in the early days of Outposts, as migrations.

"Our starting point is our migration partners, so I would tell any partner who's interested to start there," Hartman said. "Look into the migration competency and match those capabilities, then surround that with whatever the use case is, whether it's data sovereignty or whatever [customers] need the on-prem solution for."

That's especially true in geographies with strict data requirements; partners serving those regions should prioritize looking into Outposts, she added.

As to lingering questions in the AWS channel about how ongoing management will be split between partners and AWS, "the management piece will probably evolve over time," Hartman said.

Technology partners will also soon learn how they can begin testing their software on Outposts hardware to deliver "a whole suite of applications customers are used to running" on the on-premises infrastructure, she added.

Onica, an exclusive AWS partner recently acquired by Rackspace, is currently putting its people through Amazon's enablement program as an Outposts launch partner. As staff attend training sessions, executives wait to hear more about the go-to-market timeline, Onica CTO Tolga Tarhan told CRN.

"It's not something we've seen deployed in the wild yet," Tarhan said, adding, "this is typically how things launch at AWS."

Amazon has suggested partners may one day have the option of installing the Outposts rack, but at first the channel's role will be migrating data and workloads and providing managed services around them, he said.

Part of the appeal of Outposts is it decouples the two main challenges of cloud adoption faced by many enterprises: transforming their operating models and outsourcing their infrastructure.

Outposts allows IT teams to start working with a DevOps, software-defined model of building and managing infrastructure, essentially operating as they would in the cloud. Outsourcing can come later, possibly when motivated by financial exigencies, like a data center refresh, Tarhan said.

"That's the value [proposition] for a partner," he told CRN. "We can help customers do that transformative piece, and we are sure the cloud workloads will follow. Once people get comfortable with how cloud works, they are going to the public cloud when the rest of their data center ages out."

Whether Outposts make sense for customers because of compliance and security constraints, financial models, remote connectivity challenges, or just to host workloads not ready to move off-site, Tarhan said the common theme is "you can have the programming model of cloud without having the outsourcing of cloud. And I think that will be attractive to people."

Having recently merged its business with one of the world's largest private cloud providers, Onica even hopes to deploy some Outposts racks in Rackspace data centers. Doing so will allow Rackspace to offer customers a unique cloud solution: an AWS environment that's privately hosted.

One day, "customers could choose to order Outposts and gear it up in a Rackspace data center," Tarhan said.

Experts Share Their 2020 Backup and Disaster Recovery Predictions – Solutions Review

For our BUDR Insight Jam, we asked experts in the field from around the world to share their 2020 backup and disaster recovery predictions. Originally, we planned on posting all of their comments and predictions on Twitter under the hashtag #BUDRInsightJam. However, we faced two unique challenges:

Admittedly, these challenges are the best ones to have. However, they made posting the full predictions almost impossible for our Twitter-based event. Because of this, we posted shortened versions of the 2020 backup and disaster recovery predictions on Twitter during the Jam, and we'll post the predictions in full here. Thus, you get the best of both worlds for our first-ever BUDR Insight Jam.

Without further ado, we present our experts' thoughts on backup and disaster recovery in 2020, in no particular order. While you're reading, feel free to check out our new 2020 Backup and Disaster Recovery Buyer's Guide.

Data for next-generation businesses will never touch a traditional data center

Data centers are consolidating at a startling rate. While they likely won't disappear completely and can help meet specific use cases, the overwhelming majority of new businesses are not leasing or building traditional data centers. New companies such as Slack, Zoom, ServiceNow, Box, and Twilio are 100 percent cloud-based. And meanwhile, existing companies are migrating to the cloud. The business model for new companies' infrastructure is changing, and companies are realizing they do not need to be in the data center business anymore to successfully run their business. Traditional data centers as we know them are becoming obsolete.

Improved data hygiene will become a standard insurance requirement

Ransomware and cyber-attacks will expand in 2020 because they work. Targeted organizational attacks will, unfortunately, continue to increase across industries and governments, and hackers are relentless in searching for any exposure in an ecosystem, whether it is gullible users, unpatched systems, or misconfigured cloud accounts. Cyber-criminals know that you will pay to save your data, your job, and your reputation. They also know many companies now have cyber insurance to help pay the bill.

As a result of the ongoing attacks, cyber insurance companies will increasingly demand proof of protection and recovery plans. They cannot stay in business and continue to pay out claims to companies that aren't taking certain steps to protect themselves. To get insured, organizations will need to document a holistic plan to deal with cyber threats, both preventing and recovering from attacks. Providers will also become more prescriptive about categorizing claims. Today many ransomware claims are submitted under errors and omissions (E&O), which is a very general category protecting companies and their workers against claims of inadequacy. Cyber insurers will start rolling out specific claim guidelines that account for lost or compromised data rather than filing everything under the broad E&O category.

SaaS backup will be a prevalent method for recovering from cyber-attacks since it stores data in a separately managed cloud account. The air gap from the data center to the cloud protects the data from ransomware, and separation of control secures the data from deletion by malicious internal actors. Additionally, SaaS offers faster recovery in the event of an attack with the ability to restore data anywhere including running permanently in the cloud, while ensuring higher reliability for recovery through automated testing and machine learning to confirm data fidelity.

Cloud providers will pay big for analytics companies

Cloud providers are rich with data, but without proper analytics features, this data is underutilized. The market is teeming with analytics startups, but the successful ones have niche, vertical-specific applications that leverage their expertise. I would expect companies will look to either partner with or more likely acquire these smaller companies to add analytics capabilities to their own applications. The number of M&A deals with smaller analytics companies will skyrocket in 2020.

2020 will be the year of the mainframe (model): Next year will mark the death of developing applications for dedicated infrastructure. Developers will not build new applications to run on virtual machines with flash storage. Instead, they will create containerized and serverless applications that start on-demand, load data from object storage into persistent memory, execute, and then release all the resources. If that makes cloud sound like mainframe, it should. The reborn centralized model eliminates the inefficiency of legacy server and storage systems.

The hardware trends leave traditional flash storage arrays without a role. Persistent memory will become mainstream and deliver application I/O performance. Meanwhile, object storage prices will continue to plummet, so object stores can efficiently hold the application dataset. Traditional storage, optimized neither for performance nor capacity, will have no place in well-designed applications.

The legacy dedicated hardware model is dead; it just doesn't know it yet. The mainframe (as cloud) rises again!

In 2020 we will see improvement in many security areas. More specifically, organizations will start to pay more attention to the weakest link in data protection: the user. They will incorporate new technologies and integrated approaches to defend that first border between the company and the broader internet. Education will become more widespread, looking to reduce the amount of phishing, spear phishing, and other socially targeted attacks.

Increasing government oversight and regulation of digital technology, combined with public demands for greater transparency on how personal data is handled, will bring previously obscure data management and storage practices into the national political conversation. Pressure to find a workable solution from consumers and businesses, which are now struggling to deal with balkanized state and country data privacy standards, will lead to a national data privacy law similar to Europe's GDPR that resolves the current patchwork of laws and provides stability for business planning. Questions about where specific data is stored, how it is managed, and who has access will become top US business concerns rather than just administrative IT issues. Any new policy will require fundamental changes to business processes, staff education, and technology. It will prompt companies to look for better ways to get insight into their own complicated data infrastructure and secure granular control over their data.

Application vendors will architect HA and DR into their core solutions.

Application vendors will endeavor to deliver greater value and higher reliability by integrating core high availability (HA) and disaster recovery (DR) features into their solutions. Most applications today require the customer to provide these protections separately, and most organizations do this for all their applications with a general-purpose HA/DR solution. With HA and/or DR built into an application as a standard feature, customers will be able to simply deploy it on any platform in a private, purely public or hybrid cloud environment. This will be especially beneficial for smaller organizations that normally lack the expertise or resources needed to implement and operate configurations capable of eliminating all single points of failure. For cloud-native implementations, the application vendor will want to take full advantage of the resiliency afforded by the cloud's multiple availability zones and regions.

Resellers and system integrators will play an increasingly vital role as critical applications move to the cloud.

As the migration of enterprise applications to the cloud accelerates and matures, the need to ensure mission-critical high availability (HA) will create opportunities for resellers and system integrators. This window of opportunity is forming as enterprises seek more robust HA solutions that have yet to be fully integrated into the application and system software. Some system integrators may have the expertise and resources needed to leverage open-source software in their Linux offerings. But an increasing percentage will choose to integrate solutions purpose-built to provide HA and disaster recovery protections, as these have proven to be more dependable for the customer, while also being just as (if not more) profitable for the integrator.

Purpose-built high availability and disaster recovery solutions will benefit from machine learning and artificial intelligence.

Increased focus on the mission criticality of enterprise applications currently or expected to be migrated to the cloud will motivate vendors of purpose-built high availability and/or disaster recovery solutions to enhance their offerings. Competition in growing markets always fosters innovation, and advances in related technologies hold real potential for improving HA/DR protections. Expect machine learning and artificial intelligence to be used to monitor how applications run 24/7/365, and then automatically and dynamically adjust resource allocations, scaling both up and down as needed in both active and standby instances. These enhancements will make HA/DR protections more affordable for more applications, further accelerating migration to the cloud.

In 2020 the world of backup and DR will evolve to better support containerized environments. More and more businesses are using containers in production; IDC has said that 76 percent of enterprises are making broad use of containers for mission-critical apps. This creates challenges for traditional backup solutions, which weren't designed for the fast-moving, dynamic world of containers. In 2020, backup architectures will evolve to better accommodate these environments. Among other things, this means systems will back up data in a more granular fashion at the container level, rather than just at the VM level, and backups will need to capture an application's configuration details as well as its data.

Hadoop storage (HDFS) is dead. Hadoop compute (Spark) lives strong.

There is a lot of talk about Hadoop being dead, but the Hadoop ecosystem also had many rising stars. These were the compute frameworks like Spark that extracted more value from data. Others like Presto have also been adopted into the broader compute ecosystem. So today's Hadoop has been broken up. Hadoop storage (HDFS) is dead because of its complexity and cost, and because compute fundamentally cannot scale elastically if it stays tied to HDFS. To glean immediate, real-time insights, users need immediate and elastic compute capacity that's readily available in the cloud. Data in HDFS will move to the most optimal and cost-efficient system, be it cloud storage or on-prem object storage. HDFS will die, but Hadoop compute will live on and live strong.

We will see non-relational databases really take off within enterprises in 2020 as developers reject the one-size-fits-all approach of SQL and incorporate more purpose-built databases to handle specific needs and use cases. The fastest user growth is occurring among non-relational databases, and there are now database options that clearly do best with certain categories of data, such as object, key-value, document, graph, and time-series data.

As IT complexity increases and security concerns mount for businesses of all sizes, backup and DR tasks are evolving beyond their base functions and becoming increasingly integrated with other IT management tools. The increased use of automation and AI capabilities is also helping to make BDR easier and more effective. An end-to-end management platform that integrates remote management, monitoring, security detection and response, and backup and disaster recovery can improve the efficiency of an IT team, freeing them to work on other projects.

Things are going to get worse. We're going to see more ransomware attacks targeting backups in 2020. Remember: backups aren't immune to ransomware. With the right strategies in place, though, you can protect backups from ransomware. To prevent this malicious software from infecting your backup storage mediums, you must disconnect them from one another (air gap) and ensure versioning is enabled. This is what will keep your backups safe and prevent ransomware from infecting them.
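
The prediction names separation and versioning as the key controls. As one concrete illustration, assuming the backups land in an Amazon S3 bucket that production credentials cannot write to directly, bucket versioning plus a default Object Lock retention keeps backup objects from being silently overwritten or deleted. The bucket name, region and retention window below are placeholders, and this is a sketch of the idea rather than any vendor's recommended procedure.

# Illustrative sketch: harden an S3 bucket used as a backup target so that
# ransomware on source systems cannot overwrite or delete existing backups.
# Bucket name, region, and retention window are placeholders.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")
bucket = "example-backup-vault"

# Object Lock can only be used on a bucket created with it enabled; this also
# turns on versioning automatically.
s3.create_bucket(Bucket=bucket, ObjectLockEnabledForBucket=True)

# With versioning, an encrypted "overwrite" from ransomware becomes just
# another object version instead of a loss of the good copy.
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)

# A default retention rule in COMPLIANCE mode makes each backup version
# immutable for the retention window, even for the account's administrators.
s3.put_object_lock_configuration(
    Bucket=bucket,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}},
    },
)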

As severe weather events become more common, businesses need to adjust their disaster recovery plans to better anticipate storms that could halt their operations and IT services. Disaster recovery and business continuity specialists should anticipate more states implementing disaster prevention tactics like California's planned power shut-offs, and begin documenting procedures to prevent extended downtime, data loss, or financial damages. This will become the new normal for many businesses operating in areas where severe weather is more likely, so business leaders will start to prepare for these scenarios after seeing what has happened in California.

One or more major airports and/or seaports will be inoperable for a minimum of 24 hours due to ransomware attacks. A statistical, security and/or analyst firm will start publishing figures on the number of people who died or were injured due to critical healthcare IT systems crippled by ransomware. Ransomware attacks will increasingly be attributed as the main cause of serious public incidents, including potentially affecting major elections in at least one country.

Over the past few years, a lot of customers started to migrate their DR into the cloud leveraging VMware Cloud on AWS. Today, VMware has announced major partnerships with Azure and Google Cloud Platform. People can now select any of the public clouds for their DR workflows. We think, because of the adoption we've seen of VMware Cloud on AWS, that the single largest trend for DR in 2020 is going to be that many more people start adopting different public clouds for their disaster recovery, leveraging VMware technologies.

RANSOMWARE MEETS ITS NEMESIS, AND IT'S CALLED BLOCKCHAIN.

While it may take a while for blockchain to be adopted in the financial markets or in other consumer applications, its real-world use case as a mechanism to prevent ransomware attacks will gain swift adoption. StorageCraft is way ahead of the rest of the industry with an already-implemented blockchain file system. I fully expect a catch-up scramble amongst the data management vendors. We're the only ones to provide an immutable file system where data cannot be overwritten or deleted by ransomware. Our fully auditable, immutable and unchangeable view of the history of data at rest means organizations, even with distributed environments, know if, when and where a ransomware infection occurred. Our ability to also provide continuous, immutable snapshots of data means we can return data to its pre-ransomware state.

CONVERGED INFRASTRUCTURE SPENDING PAYS OFF, PUTS LEGACY VENDORS ON NOTICE

Early adopters of converged data infrastructures will see their investments pay off in terms of economics, scale and agility. This will accelerate the maturing of the converged data infrastructure market and put a nail in the coffin for the big legacy vendors who have built their business models on having a different product line for everything. Buyers will no longer tolerate paying for solutions from vendors that simply bifurcate their product lines over and over again. We will see the beginning of the end of siloed infrastructure vendors.

THE CHANNEL GETS WARY OF ACCIDENTAL VENDOR LOCK-IN

The M&A cycle will continue in the MSP space with MSPs acquiring for scale and vertical expansion. However, as MSPs merge and consolidate, they should pay particular attention to consolidation on the vendor side. As MSPs focus on their OML (operational maturity level), standardization for them only makes sense if it aligns with their business plan. Standardization must be in line with the vision of how they wanted to build their business. In contrast, MSPs may be in for a rude awakening if they find themselves being consolidated into a technology platform not out of choice but because of vendor consolidation. MSPs will need to be wary of this accidental vendor lock-in and make sure they can opt for best of breed where it makes business sense. We expect to win big here because, unlike our competitors, we offer a single technology stack for MSPs looking to standardize on their business continuity portfolio. In addition, StorageCraft offers a fit-in, stand-out approach where our solutions also present themselves as best-of-breed options for data management, protection, and recovery.

Many organizations are pursuing a cloud-based disaster recovery (DR) strategy to achieve two business objectives: 1. getting replicas off-site and 2. eliminating the cost and complexity of building and maintaining a DR site. But these DR strategies typically depend on a VPN to connect the on-premises source to the cloud-based target. That's a problem because traditional VPN software solutions are obsolete for the new IT reality of hybrid and multi-cloud. They weren't designed for them. They're complex to configure, and they expose slices of the network, creating a lateral network attack surface. In 2020, a new class of DR software with integrated SDP security will emerge to eliminate these issues and disrupt the cloud DR market. This new SDP-enhanced DR software will enable organizations to build smart endpoint DR environments that can seamlessly span on-premises and the cloud without the added costs and complexities of a VPN, and with virtually no attack surface.

In the enterprise space, on-premises storage will see a comeback

Businesses have been moving to the cloud for primary and archive/DR storage for a long time. In 2020, on-premises storage, whether for active or standby use, will see a comeback, especially as customers are hit with cloud-use bills that are dramatically higher than originally anticipated.

In 2020, to be successful and provide value to the business, enterprise IT will need to be able to straddle the worlds of cloud and on-premises storage. Software solutions that enable swift mobility between these two domains will become increasingly critical. Only via this bi-modal model will IT be able to achieve the highest performance, scalability and capabilities, as well as the safest retention, at the most cost-effective price.

In the Prosumer and SMB space, multi-layered data management and protection will become priority #1

In the Professional Consumer (i.e., Prosumer) and SMB space, data storage and protection have always been a priority, but cost has been a roadblock for those seeking to employ comprehensive end-to-end data management and protection solutions.

In 2020, Prosumers and SMBs will demand solutions that enable them to seamlessly and affordably layer features and functionality onto their on-site storage, such as integration with off-site cloud and SaaS backup and recovery solutions, with flexible cross-platform support for all major platforms including Windows, Mac, Linux, VMware and Hyper-V.

Disaster Recovery Will Be the Ultimate Brand Management Tool

2020 holds a lot of promise for storage professionals, but it also holds an entirely new threat landscape. Ransomware, data breaches, malware, and outages are just as common as they have ever been. A breach or cyberattack has the capability of destroying businesses and company reputations, which is why storage providers have made disaster recovery a major focus in recent years. At the end of the day, a company's reputation will be determined by how it either prevented the attack and/or data loss, or how it responded to the incident. The key ingredients of successful disaster recovery plans will include a focus on vulnerability detection, DR team development, and effective communication. There will also be an increased focus on multi-cloud strategies as a way to ensure both on-prem and offsite protection.

Ransomware will innovate faster than mechanisms to prevent it.

Ransomware is plaguing the enterprise and it's getting worse, fast. Due to its insane profitability and the proliferation of non-state and state actors, cybercrime (including ransomware attacks) will cost the world $6 trillion annually by 2021. According to the State of Enterprise Data Resiliency and Disaster Recovery 2019 report, nearly 90% of companies consider ransomware a critical threat to enterprise business. Ransomware will be a massive threat to all organizations for the foreseeable future because it is very challenging to detect or prevent, exacerbated by the furious pace of innovation. Prevention would be the ideal course of action; however, organizations must prepare for when defenses fail, since they will fail. While the current recommendation from experts is to just pay the ransom, there is an alternative approach: every business should investigate deploying a quick data recovery infrastructure that can help instantly roll back the IT environment to its pre-ransomware state and recover from an attack unharmed. Ransomware recovery will become a budget line item for the majority of CIOs in 2020.

Mainstream enterprises will finally embrace DR to the cloud.

Businesses are clamoring for better disaster recovery solutions in the face of escalating threats from natural and human-generated disasters. Using the cloud for DR has been theoretically interesting but physically impractical due to the huge expense of storing large amounts of data in the cloud and the costs and slowness of moving it across the wire in either direction. In 2020, mainstream businesses will become open to leveraging the cloud as a DR site and will start shutting down their physical DR sites because new cloud DR technologies will make it possible to leverage on-demand cloud resources during a disaster while keeping cloud costs low during the state of normal business operations. While there will be many options for customers to choose from in 2020, they must take caution and make sure to verify claims surrounding recovery point objective (RPO) and recovery time objective (RTO). 2020 will be the Wild West of cloud DR performance claims.

Business continuity (BC) and disaster recovery (DR) strategies will be put to the test.

Business continuity will become even more critical as businesses respond to the always-on requirements of the on-demand economy. Today, IT practitioners still have to manually coordinate a mixed bag of data storage products and applications to prepare for a disaster event; BC and DR have largely been bespoke processes, making them very complex. At the same time, threats continue to grow increasingly advanced, pervasive and unpredictable.

With cybercrime such as ransomware, recovery of a whole data center depends on backups that are typically months old. DR orchestration software generally doesn't have ways to access these when time matters. Simplicity and integration (using snaps with VM-centric catalogs, converging primary and backup storage) trump per-subsystem optimization that can drift out of compliance easily. In 2020, IT teams must take advantage of new BC and DR innovations or else they will fail to compete in an increasingly treacherous and competitive business climate.

Thanks to all of our experts for their predictions and participation in the BUDR Insight Jam! Check out our Backup and Disaster Recovery Buyer's Guide for 2020! It includes full profiles of the top 28 providers, as well as questions to ask yourself before purchasing.

Editor at Solutions Review

Tess Hanna is an editor and writer at Solutions Review covering Backup and Disaster Recovery. She has a degree in English and Textual Studies from Syracuse University. You can contact her at thanna@solutionsreview.com

Read the rest here:
Experts Share Their 2020 Backup and Disaster Recovery Predictions - Solutions Review

Read More..

Amazon, AWS and antitrust: How tough could US lawmakers be on the tech titan? – ComputerWeekly.com

The US House Judiciary antitrust subcommittee sent request-for-information letters to four of the biggest technology companies in the world in September 2019, out of concern that a handful of companies hold a huge share of the digital market, perhaps more than is deemed healthy.

The CEOs of Amazon, Apple, Alphabet and Facebook were the recipients of these letters, which contained a number of detailed questions about how they do business. The deadline for the companies to submit their responses has now passed, and the committee is in the process of reviewing their submissions.

For Amazon in particular, this investigation could have a huge impact on its business. The committee questioned Amazon not only about its e-commerce presence, but also about its crown jewel, Amazon Web Services (AWS), which accounts for 52% of its entire operating income.

There were numerous questions around AWS, but there were several which stood out in particular, with the committee requesting all communications to or from relevant executives relating to:

Jonathan Osborne, an attorney from Globalaw's Florida firm Gunster, says the information being sought suggests the US Congress and regulators are trying to understand two key things about how Amazon is run.

The first is whether Amazon is using AWS and its relationships with its business customers to create an unfair advantage, both in the web services space and in the other businesses that Amazon has, he says.

The second question is whether the current antitrust laws in the US adequately provide oversight and regulation for companies like Amazon in the web space.

Stacy Mitchell, co-director at the Institute for Local Self-Reliance, a not-for-profit that challenges concentrated economic and political power, says there are two main concerns around Amazon's access to competitors' data stored on AWS.

The first is whether AWS uses the information it holds on companies such as Netflix, which rely on AWS to host their data and manage their activity in the cloud, to benefit Amazon Prime, or uses information from its retail platform for its own e-commerce operation.

Incidentally, the company faced similar scrutiny on these matters in the UK earlier this year, when its UK director of public policy, Lesley Smith, fielded questions from the House of Lords Communications Committee pertaining to its hold on the cloud and e-commerce markets.

On this front, Marks & Spencer is known to have moved from AWS to its own proprietary infrastructure because it feared Amazon would use its information to secure a competitive advantage of some kind.

As far as I know, there's been no indication that [Amazon has] done anything with regards to those kinds of customers, but where there is more evidence and concern is around third-party applications which are resting on AWS infrastructure, says Mitchell.

The question is, is Amazon watching those companies, seeing which applications are popular, and then making knock-off versions itself? And because Amazon's version is fully integrated with AWS, it gains a degree of preferential placement and is therefore more likely to be chosen by customers.

For many of AWSs rivals, there is no question that this is what it is doing.

How do you suppose Amazon decides which cloud products to develop next? I fully expect that they look at what other vendors are selling on their platform and say Gee, that one is selling really well. We should build our own version and cut those guys out. We'll make more money, says David Friend, CEO and co-founder of cloud storage provider Wasabi Technologies.

Friend says that if the firm is using this kind of information to gauge how well AWS-hosted applications are selling and to inform its product development, that information should be made accessible to all to create a level playing field.

Why shouldn't I be able to see what's hot on AWS as a way to guide my company's product development when Amazon is doing exactly that? Amazon will never agree to publicly disclose this data, though, [because] who would want their applications hosted somewhere that discloses such proprietary information? he says.

The issue for these third-party companies, many of which are startups, is that AWS has such a big hold on the infrastructure-as-a-service (IaaS) market, that if they do not put their apps on AWS, they may not be able to reach their customers.

When Amazon owns the rails that you need to get your product to market, and they also compete directly with you on those same rails, that is a core competition problem, says Mitchell.

"If Amazon was a small company and there were lots of cloud providers, then the playing field would be more balanced as third parties could tell AWS that if theyre not treated well theyre not going to offer their products on Amazons platform.

In other words, the balance of power is very much in Amazons hands, forcing companies to operate on AWS.

This means that Amazon can do whatever it wants without repercussion and therein lies the problem of competition, says Mitchell.

But this isnt all the committee is asking about, as Osborne alludes to.

To sign up for an AWS account, a business or individual would have to provide Amazon with identifying information, whether that's their email account, phone number or where they live. The question would be whether Amazon is making that same data, which is a pretty simple database of information, available to its other businesses, he says.

This ranges from groceries, to in-home services like Alexa, to the way people are reading books, listening to podcasts or even securing their homes.

While the first question for the committee would be around whether Amazon is freely sharing this data among its businesses, the second would be whether Amazons access to and use of this data has an exclusionary effect in the market.

In other words, is it enhancing Amazons power in other business realms, in a way that other competitors are not able to keep up with, because Amazon is getting the data from its AWS offerings? Osborne asks.

According to Mitchell, the US has not had a congressional investigation into monopoly power like this in decades. So not only is the investigation into these technology companies from an antitrust perspective new in itself, but the process has not been used for so long that this is new territory for everyone involved.

But the in-depth questions the committee has pulled together for each of the four tech companies suggest this is not a case of an investigation being undertaken for the sake of an investigation.

This is not a fire drill, they're very serious and the list of information they've requested speaks to that. The specific things they're asking illustrate a depth of knowledge about these corporations and their business models and where the anti-competitive issues exist, says Mitchell.

So what next? Well, although the technology companies have sent over their evidence, there is uncertainty from AWS on when this evidence will be made public, and how.

Much to Mitchell's surprise, Amazon has shared details of the evidence and responses it submitted to the committee's questions, something she had previously thought the firm would be unwilling to do because of the potential impact such disclosures could have on its cloud and retail businesses.

Osborne says that after reviewing the data, the committee is likely to set hearings in which they would take comments from Amazon's representatives as well as other experts in the fields of antitrust and potentially e-commerce and cloud computing.

After these hearings, the committee, which is made up of members of Congress, can propose legislation to change the landscape for antitrust regulations in the online market going forward in the US.

That would take proposing a bill, passing a bill and the bill ultimately being signed by the president, says Osborne.

This could take a significant amount of time, and as it involves the Federal Trade Commission (FTC) and US Attorneys offices in different parts of the country, as well as state regulators, a lot of complexity is involved.

This could lead to legislation which would break up the companies including Amazon.com from AWS, and Instagram from Facebook.

It appears from the congressional paperwork and the nature of the investigation that what they want to determine is whether the current antitrust laws adequately protect consumers, and, if not, what changes, if any, could be made, says Osborne.

I think the third thing Amazon is going to have to decide is whether it wants to break off from AWS to keep control over the way it operates and not wait for the government to step in and tell Amazon what it has to do, he adds.

Another course of action could be litigation.

Antitrust lawsuits could be filed by either of the two enforcement agencies, the FTC or the Department of Justice: if the agencies look at the findings in the report and see behaviours which violate antitrust laws, they can file a suit to address those, says Mitchell.

The key is whether the antitrust laws that the US currently has are strong enough or even applicable to these new companies. Considering the technicalities involved in cloud computing, this is unlikely to be the case, and a new set of laws would make sense.

It will be intriguing to see whether Amazon will wait for this to be completed or decide of its own accord to split the e-commerce and cloud computing businesses, although AWS CEO Andy Jassy recently told US news site CNBC that there is no incentive to spin off the latter at this time.

Either way, Osborne says the outcome of this inquiry could be a defining one in the history of online marketplaces and how they are operated.

And while this is only US law, the changes it could potentially bring in with regard to how these companies operate could have a huge impact all over the world.

Data sovereignty issues are only going to increase and this antitrust probe is just another example of growing concerns about the power and accountability of big tech, and its ability to transcend individual government regulatory oversight, says Memset chief operating officer Chris Burden.

Monopolies damage choice, close down new ideas and slow down the pace in which innovation comes to market, he adds. Anything that can be done to control this monopoly before it is too late should be welcomed.

Read this article:
Amazon, AWS and antitrust: How tough could US lawmakers be on the tech titan? - ComputerWeekly.com

Read More..

Bitcoin Suddenly Dives Below $7,000 As Crypto Markets Lose …

Bitcoin and cryptocurrency markets have been suddenly sold off, with the bitcoin price losing around $200 per bitcoin in minutes and dipping under the psychological $7,000 mark once again, continuing a period of relative volatility for digital tokens.

Bitcoin-rivals ethereum, Ripple's XRP, bitcoin cash, litecoin, EOS and binance coin were also heavily sold off, wiping billions of dollars from the combined cryptocurrency market capitalization.

[Update: 9:10am EST 12/17/2019] Bitcoin, which yesterday dropped by 3.5%, has moved lower again, dropping over 5% since this time yesterday and trading as low as $6,708 per bitcoin on the Luxembourg-based Bitstamp exchange after briefly recovering ground overnight. Altcoins ethereum, Ripple's XRP, bitcoin cash, litecoin, EOS and binance coin were last seen down between 7% and 11% and look to be heading lower.

The bitcoin price has been swinging wildly over recent months as traders and investors try to guess how regulators will treat the bitcoin and crypto industry in 2020.

The cause of the sudden sell-off was not immediately clear, however, analysts have noted a drop in crypto market trading volume recently.

"All is quiet on the crypto front. Perhaps, a little too quiet," Mati Greenspan, the founder of bitcoin and crypto research outlet Quantum Economics wrote in a note ahead of the bitcoin sell-off today, adding the dominance of the world's biggest stablecoin, tether, "seems to be waning."

Bitcoin was earlier trading at $6,880, down by 3.5% over the last 24-hour trading period, according to prices from U.S.-based crypto exchange Coinbase, with ethereum, Ripple's XRP, litecoin, and bitcoin cash all off by between 5% and 8%.

EOS, a decentralized app token similar to ethereum, led the bitcoin and crypto market lower.

Earlier this year, bitcoin and cryptocurrency price watchers warned that "dismal" bitcoin volumes could mean the market was headed for a perfect storm.

In periods of low trading volume, crypto prices are more vulnerable to so-called whales moving the market by placing massive buy or sell orders at a little above or below current market rates. These can trigger trading algorithms that then send prices sharply higher or lower and can be a sign of market manipulation.
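A toy example helps show why thin order books magnify the impact of a single large order. The hypothetical Python sketch below walks a 30 BTC market sell through a sparse list of resting bids and reports how far the price falls before the order is filled; the prices and quantities are invented purely for illustration and do not reflect any real exchange data.

```python
# Toy order book: resting bids as (price in USD, quantity in BTC), best bid first.
THIN_BIDS = [(7000, 5), (6950, 4), (6900, 6), (6800, 10), (6700, 20)]

def fill_market_sell(bids, sell_qty):
    """Walk a market sell order down the bid book; return the last price hit and any unfilled size."""
    remaining = sell_qty
    last_price = bids[0][0]
    for price, qty in bids:
        if remaining <= 0:
            break
        filled = min(qty, remaining)
        remaining -= filled
        last_price = price
    return last_price, remaining

if __name__ == "__main__":
    start = THIN_BIDS[0][0]
    end, unfilled = fill_market_sell(THIN_BIDS, sell_qty=30)
    drop = (start - end) / start
    print(f"A 30 BTC market sell moves the last traded price from ${start} to ${end} "
          f"(a {drop:.1%} drop), with {unfilled} BTC left unfilled.")
```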

Meanwhile, research out earlier today suggested the bitcoin price might struggle over the short term due to the $2 billion PlusToken scandal, one of the biggest ever cryptocurrency scams.

The bitcoin price hasn't fallen below $7,000 since the end of November, and its sudden fall knocked the wider crypto market, with major tokens ethereum, litecoin, bitcoin cash and Ripple's XRP falling sharply.

"Thats certainly something to consider when you are thinking about where the price is going, at least in the short term, Kim Grauer, senior economist at blockchain analysis company Chainalysis told financial newswire Bloomberg. "It could be, according to our research, continued downward pressure."

PlusToken scammers are thought to have sold some 25,000 bitcoin, according to Chainalysis data, with a further 20,000 still to be dumped back onto the market.

Read the rest here:
Bitcoin Suddenly Dives Below $7,000 As Crypto Markets Lose ...

Read More..