
Expert Predicts Bitcoin Will be Worth up to $20,000 in the Next Three Years – Futurism

In Brief: Ronnie Moas, founder of Standpoint Research, spoke with CoinDesk about his predictions for bitcoin and the entire crypto market. He foresees a major boost as opposed to a bubble burst.

Bitcoin has been on an ever-upward trend lately. Today, the world's first and most popular cryptocurrency is flirting with $4,500 in valuation. Bitcoin is now worth more than three times as much as gold, and, according to one expert, Ronnie Moas of Standpoint Research, we are still only at the tip of the iceberg.

Moas spoke with CoinDesk about his forecast for the crypto market. He predicts that all cryptocurrencies combined will be worth $2 trillion within the next 10 years, a significant jump from their current standing at $150 billion. Moas sees a direct parallel between the crypto market and the dot-com boom of the 1990s.

"I am not any more concerned with bitcoin being at a record high than Amazon or Google investors were concerned when those share prices jumped hundreds of percent and hit $100 and $200 many years ago. Today, both of those stocks are above $900. The question is not where we are at; it is where we are going. I do not think we are in a bubble."

As for bitcoin specifically, Moas predicts prices will continue to soar, reaching $15,000 to $20,000 within the next three years.
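
For context, the implied growth rates behind those forecasts are easy to check. Here is a back-of-the-envelope sketch using only the figures quoted in the article ($150 billion to $2 trillion in 10 years, and roughly $4,500 to $20,000 in three):

```python
# Implied compound annual growth rate (CAGR) behind the forecasts above.
# CAGR = (end / start) ** (1 / years) - 1

def cagr(start: float, end: float, years: float) -> float:
    return (end / start) ** (1 / years) - 1

# Total crypto market: $150 billion today -> $2 trillion in 10 years
print(f"Crypto market: {cagr(150e9, 2e12, 10):.1%} per year")  # ~29.6%

# Bitcoin: ~$4,500 today -> $20,000 in 3 years
print(f"Bitcoin: {cagr(4_500, 20_000, 3):.1%} per year")       # ~64.4%
```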

Others, like Peter Schiff, an investor who predicted the 2008 mortgage crisis, see bitcoin and other cryptocurrencies as a bubble, a Ponzi scheme built on plain greed. Still, there is no way to know which way this market will go, as no one has yet invented a way to see into the future.

Disclosure: Several members of the Futurism team, including the editors of this piece, are personal investors in a number of cryptocurrency markets. Their personal investment perspectives have no impact on editorial content.

Read more:
Expert Predicts Bitcoin Will be Worth up to $20,000 in the Next Three Years - Futurism

Read More..

Does Bitcoin Have a Mining Monopoly Problem? – Fortune

During bitcoin's early days, anyone could "mine" it using their home computer. But as the price of the digital currency climbed towards $100 in 2013 (it's now over $4,000), professional mining groups with specialized computer chips emerged. Today, these groups, or pools, nearly all based in China, have become concentrated and now dominate the production of new bitcoins.

This phenomenon is not new, but an article in Quartz this week shows how pervasive it is. The article looks at a company called Bitmain, which became a powerhouse by developing ASIC chips used just for bitcoin mining:

Bitmain may now be the most influential company in the bitcoin economy by virtue of the sheer amount of processing power, or hash rate, that it controls. Its mining pools, Antpool and BTC.com, account for 28.9% of all the processing power on the global bitcoin network.
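To put the 28.9% figure in perspective: a pool's expected share of newly mined blocks tracks its share of network hash rate, so its expected daily block count follows directly. A simplified sketch that ignores variance, orphaned blocks, and difficulty adjustments:

```python
# Expected blocks per day for a mining pool, assuming block discovery
# is proportional to the pool's share of total network hash rate.

BLOCKS_PER_DAY = 24 * 60 / 10  # Bitcoin targets one block every ~10 minutes

def expected_blocks_per_day(hash_rate_share: float) -> float:
    return BLOCKS_PER_DAY * hash_rate_share

# Antpool and BTC.com combined, at 28.9% of the network (per the article):
print(expected_blocks_per_day(0.289))  # ~41.6 of ~144 daily blocks
```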

The piece, which describes Bitmain's plans to move into artificial intelligence, profiles the company's co-founder Jihan Wu, a controversial figure in the bitcoin world, in part over allegations that he manipulates the cryptocurrency for his own ends. This includes the recent schism that saw bitcoin's blockchain (the record of all transactions) split in two, creating a new currency called "Bitcoin Cash."

Critics of Bitmain suspect that Wu was behind the recent, somewhat related split of bitcoin called the bitcoin-cash hard fork. That split was supported by a miner in Shenzhen named ViaBTC, which happened to be a company that Bitmain has invested in.

If the allegations are true (for the record, Wu denies them), it suggests bitcoin is vulnerable to market manipulation not just by traders who hold large stores of bitcoin, but also by miners like Bitmain.

One of those who holds this view is Samson Mow, the CSO of the cryptocurrency consulting firm Blockstream, who recently wrote an editorial for Fortune questioning the viability of Bitcoin Cash. He believes Wu is engaging in shenanigans to secretly undermine the integrity of bitcoin.

"Jihan does have a lot of control for now, and much of that is simply due to mining centralization. As Bitmain is so vertically integrated, from selling ASICs, to operating mining farms, to running mining pools, he can prevent network upgrade and attempt to hijack the Bitcoin brand with things like [Bitcoin cash]," Mow said by email.

Such concerns over mining monopolies, and their ability to promote "forks" in the core bitcoin software, are typically regarded as philosophical feuds within the bitcoin community. But the real-world market implications may also give pause to ordinary bitcoin buyers, many of whom are likely unaware of the emergence of mining cabals that are able to sway the future of bitcoin.

Mow, though, believes that whatever influence Jihan and other large miners may exert is only short-term, and that the decision by bitcoin users to implement projects like SegWit (a plan to improve the efficiency of bitcoin's blockchain) shows bitcoin remains fundamentally democratic.

An earlier version of this article incorrectly described Mow as the CTO of Blockstream. He is the CSO.

Follow this link:
Does Bitcoin Have a Mining Monopoly Problem? - Fortune

Read More..

Marketo decides to go all-in on cloud computing, and picks Google as its home – GeekWire

Diane Greene, senior vice president for Google Cloud, speaks at Google Cloud Next this morning. (Google Photo)

One of the bigger marketing software companies, Marketo, has decided it's ready to ditch its servers and move into the cloud, and Google is getting the business.

The two companies announced a multiyear collaboration Thursday that will see Marketo move its business onto Google Cloud Platform over the next couple of years, while Google will do some work to integrate Marketo's products into G Suite. Forbes noted that Google provided migration incentives to sweeten the deal, which will further the notion that a lot of Google's major customer wins have come at the cost of steep discounts for its services.

Still, the multiyear agreement provides Google with another long-term customer that could help it woo others, especially other marketing companies. Marketo told Forbes that one of the main reasons it chose Google was the company's in-house marketing savvy as one of the biggest advertising brokers in the world, and that might be an interesting niche for Google to pursue as other software-as-a-service marketing companies plot out cloud strategies.

Marketo's software is used by a lot of companies to manage their marketing operations, from lead generation to campaign measurement. It might have decided it needed some IT assistance earlier this year, when it somehow forgot to renew its domain name registration and went down for several hours until it could fix the problem.

Google has been making slow but steady progress in its cloud efforts as it tries to shed a reputation for lacking the enterprise sales touch that Amazon Web Services and Microsoft enjoy. It has stepped up its support of hybrid cloud strategies through deals with Nutanix, and just this week it lowered prices on networking costs for customers that don't require all the performance Google's fiber network provides.

The rest is here:
Marketo decides to go all-in on cloud computing, and picks Google as its home - GeekWire

Read More..

Top 2 aspects of cloud computing you need to consi – Accountingweb.com (blog)

If you are planning to invest in a cloud computing environment, you are not alone. A majority of business owners now prefer to invest in cloud and data centre services in order to provide their customers with improved services. Although large enterprises can invest in personalized data centres, that is hardly possible for SMEs. Yet in order to cater to their target audience, SMEs still need to upgrade the quality of their services and products. This is where a cloud solution is often the most reliable option.

With the emergence of big data, every company needs a proper data storage facility in order to continue its business initiatives properly. The cloud is a cost-effective solution for storing all business-critical data securely. But is it completely safe to store your mission-critical data in the cloud? That is where you need to do thorough research on your own business requirements and identify what kind of cloud solution can meet your needs. At the same time, you should also be aware of the probable risk factors of investing in a cloud solution, as this will help you deal with such situations tactfully in the future. According to industry experts, there are some aspects you need to consider thoroughly before determining whether or not to opt for a cloud-based solution. Here are two of the most relevant aspects of storing data in the cloud that you should think through well in advance.

If you are planning to invest in a cloud environment, make sure you have a clear understanding of all its core features. This will help you make better use of the service you invest in.

See the original post here:
Top 2 aspects of cloud computing you need to consi - Accountingweb.com (blog)

Read More..

Biz Cloud Computing – Four States Homepage

JOPLIN, Mo. - "It's just like I'm there at the office," says Wendy Brunner-Lewis.

She says it's hard to imagine not being able to tap into the Cloud. "How many times have you woken up and your kids are sick and you think, 'Oh gosh, all my stuff's at the office.' You know it's nice that you don't have to try to remember everything you need the night before and bring home," Brunner-Lewis says.

A study by IDG Enterprise says almost seven out of ten offices are doing at least part of their work remotely, or on the Cloud, and it predicts that share will reach 100% within three years.

John Motazedi with Joplin IT company SNC Squared says that's likely due to advantages like reduced maintenance. "Most of those things are already done by the vendor. So you don't spend time backing it up, you don't spend time patching it, doing updates," he says.

He also points to the flexibility; you aren't limited by the size of your hardware on site, kind of like electricity. "Most people don't have a generator in their house. They use electricity whenever they need it, they have wires that come to their house," Motazedi says.

He adds security is a high priority, so new users should check out the Cloud service before signing up. "There is a difference in data centers and how secure access to those data centers are," Motazedi says.

SNC Squared is holding a seminar on Cloud computing next month.

See the article here:
Biz Cloud Computing - Four States Homepage

Read More..

Cloud Computing Confirmed for Travers | TDN | Thoroughbred Daily … – Thoroughbred Daily News

Cloud Computing at Saratoga | Sarah K. Andrew

After some deliberation by trainer Chad Brown, Klaravich Stables and William Lawrence's GI Preakness S. winner Cloud Computing (Maclean's Music) will compete in Saturday's GI Travers S. at Saratoga, the Eclipse Award-winning conditioner confirmed Monday morning. The latest addition to the expected full field augments an already competitive race that is also expected to draw GI Kentucky Derby winner Always Dreaming (Bodemeister) and GI Belmont S. hero Tapwrit (Tapit) from the Todd Pletcher barn.

Second in the Mar. 4 GIII Gotham S. and third in the Apr. 8 GII Wood Memorial S. prior to his win in the Preakness May 20, Cloud Computing returned from a two-month layoff with a disappointing last-of-five finish as the 6-5 second choice in the July 29 GII Jim Dandy S., the traditional prep for the Travers. He was asked to stay closer to a decidedly moderate pace that day and came up empty in the stretch, despite being beaten just 4 3/4 lengths by Good Samaritan (Harlan's Holiday). The colt has worked twice since then, most recently posting a five-furlong move in 1:01.65 Saturday.

"He couldn't have worked any better," said Brown. "I was very happy with the work and Javier was pleased, and he came out of his work well."

Brown said recently inducted Hall of Fame jockey Javier Castellano will ride Cloud Computing in the Travers, a race he has won a record five times.

Here is the original post:
Cloud Computing Confirmed for Travers | TDN | Thoroughbred Daily ... - Thoroughbred Daily News

Read More..

Digital Deluge on the Cloud – Valley News

Seattle - More than 2 billion people log into Facebook every month. Every day, the social-media crowd uploads billions of photos, calls up hundreds of millions of hours of video, and fires off a prodigious flurry of likes and comments. Somebody has to store that slice of humanity's digital record.

Much of that task falls to Surendra Verma, a Seattle engineer who for more than 20 years has been building software that files away and retrieves large volumes of data.

Verma leads the storage team at Facebook, the group charged with making sure that the social network can accommodate its daily deluge without losing someones wedding photos.

Most of that unit is based in Seattle, part of a workforce that today numbers 1,600 people, up from just 400 three years ago. That makes Facebook one of the fastest-growing technology companies (outside of Amazon, anyway) in the city.

While Facebook employees work on a wide range of products in Seattle, the office has developed a specialty in the geeky realm of systems software.

About a quarter of the Facebook engineers in Seattle work on the companys infrastructure projects, the tools to transmit, store and analyze the growing heap of data people feed into the social network.

That's a common trade in the region, where Amazon Web Services, Microsoft and Google are all building their own clouds: giant, globe-straddling networks of data centers and the software that manages them.

Facebook could have built its products on computing power rented from those cloud giants, but it decided to build its own tools, from custom hardware designs all the way to mobile applications. Supporting Facebook's network are nine massive data centers; a 10th, in Ohio, was announced earlier this month.

Facebook's cloud is different from the others in that it's designed to support just one customer: Facebook's own apps.

They happen to be some of the most widely used pieces of software in the world, and their use keeps expanding.

Verma is an Indian-born engineer who got his start at IBM before moving to Microsoft, where he worked most recently on Windows file systems. He joined Facebook in 2015.

By then, the Seattle office had come to pilot Facebook's storage-software efforts. "We could find some very good engineers here," he said. "And so a bunch of these projects started, and just got momentum from there."

His team's job, he said, is to be invisible to Facebook's product groups, letting the company's other engineers build whatever they can think up on top of a reliable foundation.

But a wave of new services, and the rapid growth in the number and quality of videos and photos that people share, is putting a huge burden on the network's infrastructure.

Facebook in January 2016 started a monthslong rollout of live video streaming, a milestone in the social network's effort to compete with popular streaming services.

Then Instagram Stories, aimed at competing with the ephemeral photo-sharing application Snapchat, launched last August, and quickly made its way to 250 million monthly users. (Facebook bought Instagram in 2012.)

Up next: live video broadcasts on Instagram, called Live Stories, a feature the product group hoped to launch before Thanksgiving.

Verma's team wasn't sure its systems could meet that deadline and negotiated a few days of delay on the planned start date. After scrambling, the team scraped together the internal storage space needed to accommodate the new feature, which went live Nov. 21.

"We looked very hard at our capacity, everywhere," Verma said. "We scrounged everything we could, looked at nooks and crannies."

One of the people doing the looking was J.R. Tipton, a manager on Verma's team.

Tipton left the Chicago suburbs and came to Seattle in 2001 for the same reason as thousands of others in the 1990s and 2000s: a job offer from Microsoft.

Fifteen years later, he became part of another phenomenon reshaping the region, opting to leave maturing Microsoft for a job with a younger tech giant in the area.

"I wanted an adventure," Tipton said of his move to Facebook last year.

Tipton and Verma are among the 640 people at Facebook in Seattle who, on LinkedIn, list past experience at Microsoft.

Seattle, home to Boeing's legions of engineers long before Microsoft came along, has never really been a one-company town. But the range of options for technologists has ballooned in the last decade, with Amazon uncorking another Microsoft-like growth spurt and Silicon Valley giants seeding local outposts to scoop up some of the talented software developers here.

When Facebook set up shop in Seattle in 2010, it was the company's first U.S. engineering office outside its Menlo Park, Calif., headquarters. Last year, Facebook Seattle, which then numbered about 1,000 people, moved into nine floors of custom-designed office space. That was followed in short order by two more leases that would give the company enough space for about 5,000 employees in Seattle.

Today, Tipton works on the Facebook system that is the last line of defense keeping people's photos and videos from getting wiped out by a software error or power outage at a data center.

That would be cold storage, the backup system storing redundant copies of all of the photos, videos and other data Facebook stores.

The engineers who designed the system in 2013 wanted to build something more efficient than the typical rack of servers, which suck up energy both to keep their hard drives powered on and to run the fans and air-circulation systems that keep the stack of electronics from overheating.

The company landed on a design that leaves most hard drives in Facebook's rows of servers powered down until needed, hence the "cold" in cold storage. The software built to run that system was designed to handle plenty of growth in the amount of data people were throwing at Facebook.
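
The article doesn't describe Facebook's storage software itself, but the basic trade it names (power savings in exchange for access latency) can be illustrated with a toy model. Everything below, including the class name and the spin-up delay, is a hypothetical sketch, not Facebook's actual system:

```python
import time

class ColdStorageRack:
    """Toy model of a cold-storage tier: drives stay powered down
    until a read arrives, trading access latency for power savings."""

    SPIN_UP_SECONDS = 15.0  # hypothetical time to power drives back on

    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}
        self._powered_on = False

    def write(self, blob_id: str, blob: bytes) -> None:
        # Backup copies are written out in batches; modeled here
        # as a simple dict insert.
        self._blobs[blob_id] = blob

    def read(self, blob_id: str) -> bytes:
        # Rare restores pay a one-time spin-up penalty, which is acceptable
        # for a tier that is written constantly but read almost never.
        if not self._powered_on:
            time.sleep(self.SPIN_UP_SECONDS)
            self._powered_on = True
        return self._blobs[blob_id]
```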

Each cold-storage room in Facebook's data centers was built to house up to one exabyte of data. That's 1,000 petabytes, or 1 billion gigabytes, or the combined hard drive space of 500,000 top-of-the-line MacBook Pro laptops. "At the time, it seemed laughably huge," Tipton said of the software layer built to manage all of that.
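
Those unit conversions check out, and they imply the article is assuming about 2 TB of storage per laptop. In decimal SI units:

```python
# Verifying the cold-storage arithmetic (decimal SI units).
EXABYTE  = 10**18  # bytes
PETABYTE = 10**15
TERABYTE = 10**12
GIGABYTE = 10**9

print(EXABYTE // PETABYTE)           # 1,000 petabytes
print(EXABYTE // GIGABYTE)           # 1,000,000,000 (1 billion) gigabytes
print(EXABYTE / 500_000 / TERABYTE)  # 2.0 TB per laptop, for 500,000 laptops
```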

Last year, it became clear it wouldn't be enough, the result of growth in both the number of Facebook users and the launch of data-intensive features like Instagram's Live Stories.

Tipton and his Seattle-based group spent the first half of 2016 taking a long look at the cold-storage software, which was starting to show its age in a world of 360-degree video. They could invest more time and energy in keeping the old architecture working, or rebuild.

They opted to rebuild.

"It has to be much, much bigger," he said.

Tipton has metaphors at the ready to describe that continuing challenge, the fruit of years of explaining his day job to nontechies.

"You stand on top of a hill and see the buffalo herd coming," he said.

And if that doesn't sound harrowing enough, he tries again: "We're basically laying the railroad track down as the train is coming."

Here is the original post:
Digital Deluge on the Cloud - Valley News

Read More..

How Can You Improve Document Management By Integrating Cloud-Based File Sharing And What You Need To Know … – Business 2 Community

Over the past few decades, the cloud has gained immense popularity, courtesy of the increasing demand for data storage and the rapid advancement of technology. Thanks to the latest developments, the cloud is no longer a service that only large and medium enterprises can opt for. Instead, it has become accessible to businesses of all sizes. Nowadays, SMBs often rely on cloud-based storage to keep their business-critical data safe and secure. It not only helps them cut the cost of renting or buying physical storage but also enables them to secure all their data in the best possible way.

Since file sharing, too, is an important task that almost every business has to perform, business owners prefer doing it via the cloud. Whether it's sharing internal files or sending documents to a client or a third party, doing it through the cloud is considered easy and fast. In fact, cloud-based file sharing is considered an effective alternative to backup solutions, especially for organizations that want to ensure flexible file access for all their employees. If you are planning to invest in cloud-based file sharing and improve the document management system within your organization, there are certain things you must know.

If you are planning to shift to a cloud-based document management solution but are not sure whether it would be the right fit for your business, consider these factors thoroughly and make the right decision for your business.

Read the original here:
How Can You Improve Document Management By Integrating Cloud-Based File Sharing And What You Need To Know ... - Business 2 Community

Read More..

What You NEED To Look For In A Cloud Hosting SLA – TG Daily (blog)

In the modern world of business IT, the cloud is king; that's just a fact. According to a recent survey, 95% of all businesses use either public or private cloud hosting services, and the vast majority contract with at least six different cloud computing providers.

This makes sense, of course. Cloud computing is inexpensive, reliable, and available even to SMEs (small-to-midsized enterprises), which often cannot afford expensive on-site IT infrastructure.

However, not all cloud hosting companies are the same. As the cloud becomes more and more important to critical business operations, a robust Service Level Agreement (SLA) is essential for any business with a cloud hosting partner.

Essentially, an SLA is a legally binding document that defines performance standards, uptime, and customer support standards between a cloud provider and a business.

In this document, things such as expected network uptime, Mean Time Between Failure (MTBF), data throughput, and server/software performance are defined in plain language.
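
Those metrics translate into numbers a customer can verify. For instance, an uptime percentage implies a yearly downtime budget, and steady-state availability can be estimated from MTBF and mean time to repair (MTTR). The sample figures below are illustrative assumptions, not terms from any particular SLA:

```python
# Turning common SLA metrics into concrete numbers.

HOURS_PER_YEAR = 24 * 365

def downtime_hours_per_year(uptime_pct: float) -> float:
    """Yearly downtime budget implied by an uptime guarantee."""
    return HOURS_PER_YEAR * (1 - uptime_pct / 100)

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability: fraction of time the system is up."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

print(downtime_hours_per_year(99.9))    # ~8.8 hours of downtime per year
print(downtime_hours_per_year(99.99))   # ~0.9 hours (~53 minutes) per year
print(availability(1000, 1))            # ~0.999, i.e. "three nines"
```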

The requirements for both the hosting provider and the customer are also defined, as are the next steps that can be taken if either party fails to uphold their end of the contract.

An SLA is the single most important document you'll sign when choosing a new cloud hosting partner. So here's what you should look for before signing a new cloud hosting SLA.

Cloud hosting SLAs are complicated documents, but there are some simple things you can look for to ensure you're signing an SLA from a reputable company.

System Uptime: This is the single most important guarantee you can get in your SLA. Any reputable cloud hosting company should offer system uptime of 99.9% or higher, and have clear guarantees for compensation in case it fails to uphold the system uptime standards outlined in the SLA.

Clear Support And Response Time Guidelines: Your SLA should include guarantees both for the level of customer support and for response times from support staff. Try to choose a cloud hosting provider that offers 24/7 customer support and has a clear policy for fast, reliable response times.

Detailed Definitions For Data Ownership And Management: Any SLA you sign should include details about data ownership. It must be clear that your company still owns any data hosted by a third party.

Your SLA should include language that makes your data ownership clear, as well as detailed next steps for retrieval of proprietary data in case you must break the service contract.

Clearly Defined System Security: An SLA should always include a set of security standards that are clearly defined and testable by you or a third party.

Your SLA should also allow you to take appropriate security precautions if desired, such as using a third-party auditing service to ensure your data is properly protected.

Steps That Can Be Taken In Case Of Non-Compliance Or Disputes: If your cloud hosting provider fails to uphold its SLA, there must be proper legal steps your company can take to exit the contract or obtain compensation from the company.

A clear strategy for resolving conflicts should be defined, as should a clear exit strategy that can be implemented in case the terms of the contract are breached.

Any reputable cloud hosting company in Canada should be willing to create an SLA with these terms, and if you find that your potential partner is unwilling to create a comprehensive SLA for any reason, walk away. You should never enter a contract with a cloud hosting provider without an SLA; the risks are simply too great.

An SLA is a multifunctional legal document. It protects both you and your cloud hosting partner, and ensures that your business relationship is mutually beneficial.

For this reason, you should only do business with reputable companies that offer robust SLAs. And if you follow these tips and understand the basics behind SLAs, you're sure to find success when searching for a cloud hosting partner in Canada!

Read more from the original source:
What You NEED To Look For In A Cloud Hosting SLA - TG Daily (blog)

Read More..

President Trump Could Cost US Cloud Computing Providers More … – The Data Center Journal

The U.S. cloud computing industry stands to lose more than $10 billion by 2020 as a result of President Trump's increasingly shaky reputation on data privacy, according to the latest research from secure data center experts Artmotion.

Growth for U.S. cloud computing providers is already thought to be slowing. Although IDC's latest Worldwide Public Cloud Services Spending Guide suggests that the U.S. will generate more than 60% of total worldwide cloud revenues through 2020, the country is expected to experience the slowest growth rate of the eight regions in the analysis.

However, this forecast slowdown does not factor in the effect that President Trump's controversial record on data privacy has had on business confidence in the U.S. as a data hosting location. This coincides with a rapid increase in people expressing unfavorable opinions about the U.S. more generally. In fact, the latest study from the Pew Research Center highlights that just 22% of people have confidence in President Trump to do the right thing when it comes to international affairs.

As a result of this growing uncertainty, Artmotion's new analysis suggests that U.S. cloud providers will experience a further slowing of growth over the next three years, creating estimated losses of $10.1 billion for the industry between 2017 and 2020.

Mateo Meier, CEO of Artmotion, commented: "In a market that is still expected to grow significantly in the next few years, it is vital that U.S. service providers continue to attract new customers in order to retain market share. Despite the U.S.'s current dominance of the global cloud computing market, there is no certainty that the status quo will be maintained. Perhaps the key reason for U.S. cloud providers to be fearful is that this isn't the first time we've been here.

"Edward Snowden's revelations about PRISM and the NSA's mass surveillance techniques were hugely damaging to U.S. cloud companies. They also encouraged many businesses to completely rethink their data strategies, rather than continuing to trust that U.S. cloud providers would guarantee the levels of data security and privacy they need. The impact that President Trump could have needs to be understood in that context."

Artmotion's full analysis is available as a free download here.

View original post here:
President Trump Could Cost US Cloud Computing Providers More ... - The Data Center Journal

Read More..