
Op-ed: Portugal, Just Like Any Other Country, Needs Bitcoin – CryptoCoinsNews

It's Easter and some relatives of yours want to come visit. They live in a remote area of the country and aren't exactly wealthy. On a Friday afternoon they ask you to wire them some money to help with travel expenses and, as expected, if you go through traditional financial institutions they will only receive the money after the weekend.

The above scenario happened to me, and if only more businesses accepted bitcoin in Portugal, the problem would have been solved in minutes. Bitcoin's decentralized, peer-to-peer nature empowers people, gives them control over their own money, and satisfies the fundamental properties of money better than gold or fiat.

Yet a prominent Portuguese newspaper, Jornal de Negócios, recently published a piece dubbed "Portugal is no country for bitcoin," in which the writer explains that electricity costs aren't as cheap as they are in some other countries and that, as such, there aren't any major bitcoin miners in the country.

Over here, one kWh costs roughly $0.1105, according to the article, which forces local miners either to make a substantial investment or to lose money mining, since turning a profit is hard without the best hardware out there, just as it is in any other country in the world by now.

Although I do agree that making a profit mining bitcoin might be a challenge, it is wrong to claim Portugal is no country for bitcoin, at least without restricting that claim to mining operations, as the article neglects a great deal of information. Otherwise, those learning about bitcoin may get the wrong picture.

The article's claim that bitcoin's place isn't in Portugal because it isn't easy to mine profitably would leave our fiat currency, the euro, in a much darker place, as no citizen can legally print euros at all.

Yet we transact daily in euros, a less secure, centralized currency over whose value the people have absolutely no control. Moreover, there are dozens, if not hundreds, of ways to acquire bitcoin without mining a single satoshi: exchanges and r/Jobs4Bitcoins are great examples.

Even Microsoft co-founder Bill Gates himself has said bitcoin is better than fiat currency.

As previously reported by CCN, the Bank of Portugal issued a statement back in 2013 in which it notes that bitcoin "has no central authority to control it" and that it is not legal tender in the country.

Now more than ever, bitcoin is needed in Portugal, as our financial system is apparently letting us down. Back in 2015, over 500 people lost their life savings because they were led to invest in dubious financial products they weren't even told were risky in the first place. Their representative once told the media (translated statement):

"These are people that lost everything because of the banking system, and the lack of [proper] surveillance."

Last month, our media reported that another financial institution, Montepio, could be in trouble, as its major shareholder reportedly has a €107 million hole in its balance sheet. The reports forced consumer protection organization DECO to warn against certain financial products, as these aren't properly supervised.


View original post here:
Op-ed: Portugal, Just Like Any Other Country, Needs Bitcoin - CryptoCoinsNews

Read More..

Cloud Computing might skip Kentucky Derby – Daily Racing Form


Photo: Michael Amoruso. Cloud Computing is ranked 20th in qualifying points for the Kentucky Derby.

Trainer Chad Brown said Monday that he is leaning toward passing on the Kentucky Derby with Cloud Computing, a decision with implications for the potential field for the May 6 Derby at Churchill Downs.

Cloud Computing currently sits 20th on the Derby points list, pending a decision by the connections of Conquest Mo Money as to whether to supplement for $200,000 and run.

Cloud Computing most recently finished third in the Wood Memorial, only the third start of a career that began on Feb. 11. Brown said Cloud Computing would have his first work since that race this weekend.

"The horse looks great, but he is lightly raced," Brown said. "We haven't made a final decision yet, but we're leaning towards passing and pointing to the Preakness or even a summertime campaign. We'll see what he's ready for right now."

Brown and the owners of Cloud Computing, Seth Klarman and William Lawrence, have one certain starter in the Derby in Practical Joke, who was second in the Blue Grass in his final prep. Brown said Practical Joke would work this weekend at Keeneland, then move over to Churchill Downs and have his final work there. Joel Rosario has the mount.

Assuming Cloud Computing skips the Derby, that would provide an opportunity for Untrapped, currently 21st on the points list, to get in. He finished sixth in the Arkansas Derby on Saturday.

Currently 22nd on the points list is Lookin At Lee, third in the Arkansas Derby. Steve Asmussen trains both Untrapped and Lookin At Lee and on Monday said both remain under consideration for the Derby.

Several riding assignments are pending for the Derby, most notably runners trained by Asmussen and Todd Pletcher, both of whom potentially will juggle multiple entries. Pletcher's lone confirmed assignment is John Velazquez on Always Dreaming.

One assignment finalized Monday was with Irap, who won the Blue Grass with Julien Leparoux. In light of Classic Empire winning the Arkansas Derby and moving on to the Kentucky Derby, Leparoux will remain with Classic Empire. Irap will be ridden by Mario Gutierrez, reuniting him with trainer Doug O'Neill and owner Paul Reddam, the team that won the Derby with I'll Have Another in 2012 and Nyquist last year.

The winning Beyer Speed Figure of the Sunland Derby, won by Hence, has been adjusted to 97 from its original 93, according to Andrew Beyer. Irap and Conquest Mo Money, second in the Arkansas Derby, both exited the Sunland Derby.

See the article here:
Cloud Computing might skip Kentucky Derby - Daily Racing Form

Read More..

S Korea’s Naver jumps into cloud computing market – The Borneo Post

SEOUL: Naver Corp., the operator of South Korea's dominant Internet portal, launched a new cloud-computing platform on Monday to join other IT firms in the fast-growing market, South Korea's Yonhap news agency reported.

Naver Business Platform (NBP), a subsidiary firm under Naver providing IT infrastructure and solutions, released the platform, Naver Cloud Platform, to offer 30 basic services such as data computing and security management.

The firm said it plans to add four or five new services every month to allow its clients to use Naver's services, including web search and voice recognition.

The company said it aims to compete against other global powerhouses and become one of the world's top five cloud service providers in the next two years.

Naver has been seeking to attract customers for its own cloud computing services that have become core infrastructure for Internet of Things technology, big data and other new Internet technologies.

Major players, including Amazon and IBM, have become front-runners in the South Korean market for cloud computing services.

According to a recent study by the National IT Industry Promotion Agency (NIPA), the South Korean market for cloud computing jumped 46.3 per cent year on year to 766.4 billion won (US$657.1 million) last year. – Bernama


Originally posted here:
S Korea's Naver jumps into cloud computing market - The Borneo Post

Read More..

Amazon Web Services Head Jassy Reaps $35.4 Million for 2016 – MSPmentor

Leading the fastest-growing and most profitable division of Amazon.com Inc. is paying off for Andrew Jassy.

The head of Amazon Web Services, which includes the company's cloud business, received $35.4 million in stock and about $179,000 in salary and a 401(k) match, according to a regulatory filing from the Seattle-based company Wednesday. The shares rose in value to about $54 million as of Tuesday's close.

Jassy, promoted to a new CEO role for the division a year ago, was the company's top-paid employee among the six executives whose compensation has to be publicly disclosed, including Chief Executive Officer Jeff Bezos.

Jassy is leading a push into artificial intelligence to boost Amazon's cloud computing, which commands about 45 percent of the market for infrastructure as a service, where companies buy basic computing and storage power from the cloud. The unit, which Jassy has run since its inception 11 years ago, brought in a record $12.2 billion in revenue last year as the company introduced an image-recognition program, a text-to-speech service dubbed Polly and tools for building conversational apps. AWS has data centers around the world that provide computing power for many large companies, such as Netflix Inc. and Capital One Corp.

"Inside AWS, we're excited to lower the costs and barriers to machine learning and AI so organizations of all sizes can take advantage of these advanced techniques," Bezos wrote in his annual letter to shareholders, which was also released Wednesday.

Biennial Grants

Like fellow tech giant Alphabet Inc., Amazon mainly pays top employees with biennial grants of restricted shares that vest over several years independently of company performance. That sets them apart from large companies in other industries, which tend to link payouts to specific goals such as revenue or stock return. Emphasizing certain criteria could cause employees to focus solely on short-term returns at the expense of long-term growth and innovation, Amazon's board said in the filing.

Amazon last year also promoted Jeffrey Wilke to CEO of the worldwide consumer business, awarding him a $33 million compensation package, the bulk coming from restricted shares vesting over several years.

Senior Vice Presidents Jeffrey Blackburn and Diego Piacentini got $22.2 million and $23.7 million, respectively, mostly coming from biennial stock grants.

CEO Bezos, who's the world's second-richest person with a net worth of $77.7 billion, got his usual $81,840 annual salary and $1.6 million in security services last year. The billionaire, whose wealth comes from his ownership stake in the company, has never received equity compensation from Amazon.

Visit link:
Amazon Web Services Head Jassy Reaps $35.4 Million for 2016 - MSPmentor

Read More..

Oracle Targets Developers with Wercker Acquisition – Talkin' Cloud

Oracle announced on Monday that it has signed a definitive agreement to acquire Wercker, a continuous integration and delivery platform. The terms of the deal have not been disclosed.

In a statement on Monday, Oracle said that Wercker empowers organizations and their development teams to achieve continuous integration and continuous delivery goals with micro-services and Docker.

Oracle Cloud senior vice president Amit Zavery said earlier this year that its cloud provides developers with the "full infrastructure for doing orchestration, devops for it as well as we provide you the capability to scale it out and have a highly available system as well and deploy it in multiple data centers globally. The whole runtime, the devops and all that stuff for microservices is all provided in our platform today." It seems this latest acquisition will continue to build on this pitch to developers.

"Oracle is building a leading IaaS and PaaS platform as the foundation for a new generation of cloud computing," Oracle said in a statement. "A leading cloud needs great tooling, and adding Wercker's container lifecycle management to Oracle's Cloud provides engineering teams with the developer experience they deserve to build, launch and scale their applications. Together, Oracle and Wercker will democratize developer tooling for the modern cloud."

In a FAQ for customers and partners, Oracle said that it plans to continue investing in Wercker once the transaction closes.

"We expect this will include more functionality and capabilities at a quicker pace. In addition, Wercker customers will benefit from better integration and alignment with Oracle's other product offerings," Oracle said.

According to a blog post by Micha Hernandez van Leuffen, CEO and founder of Wercker, Wercker's Docker-based platform "has a strong, rapidly growing user base as companies, large and small, transition to container-based workloads. Developers will now have access to a strong Docker-based portfolio as part of Oracle PaaS and IaaS. We are excited to join Oracle and bring even more value to our customers as part of Oracle's cloud computing platform."

"Joining forces with Oracle means we have aligned with a hyperscale cloud provider that will enable us to bring our vision of Docker-based developer lifecycle management to a broader range of customers and applications, while increasing the pace of innovation for our existing customers. In the near term, our team, free community edition, and relentless focus on the developer experience remain unchanged," van Leuffen continued.

Go here to see the original:
Oracle Targets Developers with Wercker Acquisition - Talkin' Cloud - Talkin' Cloud

Read More..

Cloud Computing and Service Level Agreements (SLAs) – Datamation

A service level agreement (SLA) is a technical services performance contract. SLAs can be internal, between an in-house IT team and end users, or external, between IT and service providers such as cloud computing vendors. Formal and detailed SLAs are particularly important with cloud computing providers, since these infrastructures operate at large scale and can seriously impact customer businesses should something go awry. In the case of cloud computing, SLAs differ depending on a specific provider's set of services and the customer's business needs. However, all SLAs should at a minimum cover cloud performance speed and responsiveness, application availability and uptime, data durability, support agreements, and exits.

Customers will provide their key performance indicators (KPIs), and customer and provider will negotiate related service level objectives (SLOs). Automated policies enforce processes to meet the SLOs and issue alerts and reports when an agreed-upon action fails. Cloud computing providers will usually have standard SLAs; IT should review them along with legal counsel. If the SLAs are acceptable as is, sign and you're done. However, many companies will want to negotiate specific requirements into their SLAs, since the vendor's standard SLA will favor the provider. Be especially careful about general statements in the standard SLA, such as stating the cloud's maximum amount of customer computing resources without mentioning how many resources are already allocated. Not every cloud computing provider will agree to your requirements, but most customers can reach good-faith negotiated agreements with providers. Quality of service depends on knowing what you need and how the provider will deliver it.
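To make those mechanics concrete, here is a minimal Python sketch of an automated SLO check, assuming a simple percentage-based uptime objective; the SLO structure, the threshold and the print-based alert are illustrative stand-ins, not any provider's actual tooling.

```python
from dataclasses import dataclass

@dataclass
class SLO:
    name: str      # e.g. "monthly uptime"
    target: float  # negotiated objective, as a percentage

def check_slo(slo: SLO, measured: float) -> bool:
    """Compare a measured KPI value against its negotiated objective."""
    met = measured >= slo.target
    if not met:
        # A real policy engine would page staff and file a report with
        # the provider; printing stands in for that here.
        print(f"ALERT: {slo.name} at {measured}% is below the {slo.target}% target")
    return met

# Example: a month with about 43 minutes of downtime (~99.9% uptime)
check_slo(SLO("monthly uptime", 99.95), 99.90)
```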

(Figure: an example cloud computing SLA, detailing the technical terms of the agreement.)

A service level agreement is no place for general statements. Assign specific and measurable metrics in each area, which allows both you and the provider to benchmark quality of service. SLAs should also include remediation for failed agreements, not only from the cloud provider but from the customer as well if they fail to keep up their end of the bargain. Cloud computing users should specifically review the following areas in a cloud computing SLA.

Service credits are the most common way for a cloud computing provider to reimburse a customer when the provider fails an agreement. The reason for the failure matters, since the provider will rarely issue credits if the failure was out of its control; terrorist acts and natural disasters are common exclusions. Of course, the more data centers a service provider has, and the more redundant your data is, the less likely it is that a tornado will affect your data.
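As a worked example of how a service-credit schedule might translate measured uptime into reimbursement, here is a short Python sketch; the tier boundaries and credit percentages are invented for illustration and would come from the actual SLA.

```python
# Invented credit schedule: floors are listed from highest to lowest.
CREDIT_TIERS = [
    (99.95, 0),   # objective met: no credit
    (99.0, 10),   # below 99.95% but at least 99.0%: 10% credit
    (95.0, 25),   # below 99.0% but at least 95.0%: 25% credit
]

def service_credit_percent(monthly_uptime: float) -> int:
    """Return the percentage of the monthly bill credited back."""
    for floor, credit in CREDIT_TIERS:
        if monthly_uptime >= floor:
            return credit
    return 100  # catastrophic month: full credit

print(service_credit_percent(99.2))  # -> 10
```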

Your computing needs are not static, and your SLA shouldn't be either. Your needs and the provider's capabilities will change over time. Your provider will periodically revisit its standard and custom SLAs in light of new procedures and technologies; you should do the same. Periodically review your SLAs, especially when there are changes to your business needs, technology, workloads, and measurements. Also review your SLAs when your cloud computing provider announces new services. You won't take advantage of every offer that comes down the pike, but if a new service will improve your customer experience, adopt it and modify the service level agreements to reflect the new product.

SLAs are a critical part of any service offering to an internal or external client, and are particularly important between a business and its cloud computing provider. Don't let the cloud SLA be a battleground of assumptions and mistaken expectations. Negotiate and clarify agreements with your provider. Be reasonable without being blindly trusting, and the SLA will protect both of your businesses, as it is meant to do.

More here:
Cloud Computing and Service Level Agreements (SLAs) - Datamation

Read More..

How Fog Computing Will Shape The Future Of IoT Applications And Cybersecurity – ISBuzz News

Fog computing may be the next big thing for the Internet of things. The fog computing market, valued at $22.3 million in 2017, will expand at an explosive rate and grow to $203.5 million over the next five years, according to projections by Markets and Markets. IoT interconnectivity, machine-to-machine communication, demand for real-time computing and demand for connected devices are driving the fog market's growth.

Businesses impacted by these trends are turning to fog computing for greater efficiency, faster decision-making and lower operating costs. Here's a closer look at what fog computing is, why it will play a key role in the future of IoT technology and how it will help with cybersecurity.

Fog Computing vs. Traditional Cloud Computing

Fog computing is an extension of cloud computing that adjusts to the emerging Internet of things. The IoT connects a vast array of devices, including mobile phones, wearables, smart TVs, smart homes, smart cars and even smart cities. The number of devices collecting data and the volume of data being processed are growing exponentially.

Public cloud computing provides the computing space to process this volume of data through remote-located servers. But uploading this amount of data to remote servers for analysis and delivering the results back to the original location takes time, which can slow down processes that demand rapid responses in real time. Additionally, when Internet connectivity is unreliable, relying on remote servers becomes problematic.

Fog computing is a solution to these issues, explains Cisco, a pioneering member of the OpenFog Consortium. Rather than relying primarily on remote servers at a central location, fog computing uses distributed computer resources located closer to local devices to handle processes that demand rapid processing, with other, less time-sensitive processes delegated to more remote cloud servers.

This can be visualized as pushing the border of the cloud closer to the edge of the local devices connected to the Internet of things; for this reason, fog computing is also sometimes called edge computing. Thus, fog computing is not really opposed to cloud computing; rather, it can be viewed as a variety of hybrid cloud computing in which some processes are handled by private fog networks closer to network devices and some are handled by the public cloud.
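To illustrate that hybrid split, here is a minimal Python sketch of a dispatcher that keeps latency-sensitive work on a nearby fog node and sends everything else to the public cloud; the node addresses are hypothetical, and the sub-second cutoff echoes Cisco's guidance cited later in this article.

```python
FOG_NODE = "fog-gateway.local"    # hypothetical nearby, on-premises resource
CLOUD_NODE = "cloud.example.com"  # hypothetical remote public-cloud region

def dispatch(task_name: str, max_latency_ms: int) -> str:
    """Route time-critical tasks to the fog, everything else to the cloud."""
    target = FOG_NODE if max_latency_ms < 1000 else CLOUD_NODE
    return f"{task_name} -> {target}"

print(dispatch("patient-vitals-alarm", max_latency_ms=100))     # fog
print(dispatch("nightly-trend-analysis", max_latency_ms=60000)) # cloud
```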

Why Companies are Turning to Fog Computing

There are a few major reasons why companies are turning to fog computing, explains TechTalks software engineer Ben Dickson. One is the emergence of IoT applications where real-time response can be a matter of life or death. A key example is the healthcare industry: medical wearables are increasingly being used by healthcare providers to monitor patient conditions, provide remote telemedicine and even guide on-site staff and robots in procedures as delicate as surgery. Reliable real-time data processing is crucial for these types of applications.

Another IoT application where rapid response is crucial is vehicle communications. Many cars use online information to guide navigational decisions. In the near future, driverless cars will rely entirely on automated input to perform navigation. Thus, a slow response when vehicles are moving at 60 mph can be dangerous or even fatal, so real-time processing speed is required. Fog computing networks are especially suitable for applications that require a response time of less than a second, according to Cisco.

How Fog Computing Helps Cyber Security

Security is another big reason companies are turning to fog computing. Data for applications such as healthcare and point-of-sales transactions is very sensitive and a primary target for cyber criminals and identity thieves. However, fog computing provides a way to keep this type of data under tight guard.

Fog systems are designed from the ground up to protect the security of information exchange between IoT devices and the cloud, providing security suitable for real-time applications, according to the OpenFog Consortium. Fog systems can also be used to keep device data securely in-house and away from vulnerable public networks. Data backups can then be safely stored by deploying reliable backup services, like those provided by Mozy, allowing companies to schedule automated backups protected by military-grade encryption.

Go here to read the rest:
How Fog Computing Will Shape The Future Of IoT Applications And Cybersecurity - ISBuzz News

Read More..

New Azure migration tools analyze which apps should move to the cloud – TechTarget

A new service from Microsoft can help IT shops interested in a move to Microsoft Azure better estimate workload performance.

Potential customers have access to new Azure migration tools, such as the free Cloud Migration Assessment, which analyzes on-premises workloads to determine how applications will perform and the cost to run them on Azure. The move gives Azure parity with Amazon Web Services (AWS), which added similar capabilities last year.

The new feature was rolled out along with two other attempts to ease the transition of Windows Servers to Azure: licensing discounts and improved capabilities in Azure Site Recovery.

The Cloud Migration Assessment works across a company's IT environment to evaluate hardware configurations. Microsoft then provides a free report that estimates the cost benefit to house those workloads on Azure, as well as suggestions to appropriately size environments in the cloud. It also informs users on which VM types to select.

"This was an area Microsoft didn't have and really needed," said Angelina Troy, an analyst at Gartner.

Other updates rolled out this week provide access to the Azure Hybrid Use Benefit in the Azure Management Portal. Customers can save up to 40% on Windows Server licenses that include Software Assurance, according to Microsoft.

In the coming weeks, Azure Site Recovery -- Microsoft's tool for migrating Hyper-V, VMware and physical servers -- will add new tools to tag VMs directly within the Azure portal, rather than using PowerShell.

Cloud migration is a more prominent issue as customers shift from born-in-the-cloud startups to enterprises that want to shift existing VMs to the public cloud. They often have a hard time predicting how workloads will perform in these environments; a cottage industry of third-party vendors has sprung up to help migrate and manage workloads.

Cloud providers have also extended their capabilities as they seek to eliminate hindrances to adoption and use. They offer a variety of tools for real-time replication or transfer of configuration-dependent images. AWS and Azure now have similar options in terms of ways to migrate a VM into their respective compute services, though Azure may actually have a few more replication services and tools than AWS, Troy said.

The assessment capability isn't necessarily superior to what other third-party companies provide, but the main benefit is that it's free, Troy said. This tool can now be combined with other Azure migration tools, such as Azure Migration Accelerator and Azure Site Recovery, to coordinate and move workloads to Microsoft's public cloud.

Third parties don't always have that same depth of knowledge of cloud platform updates, but they can provide insights across providers to help users find the best fit, especially if they're vendor-agnostic.

The actual migration can often be the simplest part of the move to the cloud, said Timothy Campbell, product manager at Datapipe, a managed service provider based in Jersey City, N.J., that partners with AWS and Azure. Still, navigating Azure's large product and feature set can be daunting, so these features address an important piece of the puzzle, he added.

These updates "will likely accelerate adoption by providing a native tool that can help align workloads correctly and create efficiencies that are specific to the platform," he said.

Trevor Jones is a news writer with SearchCloudComputing and SearchAWS. Contact him at tjones@techtarget.com.


See the original post:
New Azure migration tools analyze which apps should move to the cloud - TechTarget

Read More..

Your Amazon Echo Recordings Can be Listened To and Deleted, Like This – 1redDrop

Is your Amazon Echo always listening to you and recording all your conversations? In short, yes and no. The Echo's AI system is always listening for the wake word, "Alexa" or "Computer" or whatever you've set it to, but it is not always recording. Recording starts when the wake word is spoken, and the recording is then sent to Amazon's cloud servers for processing. Those recordings are all stored there until you delete them.
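A toy Python sketch of that "always listening, not always recording" gate may make the distinction clearer; this is emphatically not Amazon's code, just an illustration of the control flow, with words standing in for audio frames.

```python
WAKE_WORDS = {"alexa", "computer"}

def wake_word_detected(frame: str) -> bool:
    # Real devices run a lightweight acoustic matcher on-device; a frame
    # here is just a lowercase word so the flow is easy to follow.
    return frame in WAKE_WORDS

def run_echo_loop(frames):
    uploaded = []
    recording = False
    for frame in frames:                 # the device hears every frame locally
        if not recording:
            recording = wake_word_detected(frame)  # nothing uploaded yet
        elif frame == "<silence>":
            recording = False            # command finished
        else:
            uploaded.append(frame)       # only command audio goes to the cloud
    return uploaded

print(run_echo_loop(["chatter", "alexa", "play", "music", "<silence>", "gossip"]))
# -> ['play', 'music']
```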

Here's how to listen to everything that your Amazon Echo has recorded, and then delete those recordings.

But before you go on an Echo-recordings-deleting spree, you need to understand that past commands help Amazon Alexa understand your needs in a more personalized way. Deleting all your past recordings will hamper that ability.

To listen to your recordings, open the Amazon Alexa app on a smartphone or tablet and go to Settings > History. There, you'll be able to see the tens, hundreds or thousands of entries stored on Amazon's cloud servers, depending on how busy you've been keeping Alexa on your Amazon Echo device.

You can listen to any of those recordings, which will be served from the cloud. To delete just a few recordings, select the ones you want to delete and then hit Delete.

What if you want to delete the whole bunch of them? That could take hours or days one by one. To delete your entire recording history, you'll need to open your browser and go to www.amazon.com/mycd, where you'll be asked to sign in with the same ID you used in the Alexa app.

Once logged in, you'll be able to view the audio files, listen to them and delete everything that was ever recorded since you bought your Amazon Echo.

But again, be warned that if you delete everything, it'll be as if Alexa has to start learning again from scratch, which you might not want. Alternatively, you can delete only the oldest recordings, the ones made when you first bought the device and asked test questions before you got the hang of it.

Remember, Amazon Echo only records voice commands heard after the wake word is spoken, so you don't have to worry about Alexa spying on you, as many people believe. She's always listening, it's true, but she doesn't send any data to Amazon until an authentic voice command is issued.

That's how it works. If it weren't, your Amazon Echo would be the size of a large bedroom, or even as large as your house, because that's how much computing power artificial intelligence needs in order to be intelligent. That's the reason nearly everything is processed in the cloud; it can't be otherwise, at least not until processing power evolves to a much, much greater level than today's.


Read more from the original source:
Your Amazon Echo Recordings Can be Listened To and Deleted, Like This - 1redDrop

Read More..

Salesforce eyes regional growth after Australian Amazon Web Services cloud deal – The Australian Financial Review

Salesforce CFO Mark Hawkins says Australia is the starting point for a concerted push across Asia.

The chief financial officer of cloud computing giant Salesforce has said a local presence for hosting data will help charge the company's growth as its focus rests on Asia.

Salesforce CFO Mark Hawkins, a top lieutenant of exuberant chief executive Marc Benioff, was talking to The Australian Financial Review about the longer term repercussions of a long-awaited announcement last month that its Australian customers' data would no longer be hosted offshore.

The company signed a deal with Amazon Web Services to host its Intelligent Customer Success Platform in AWS' Sydney data centres, ending years of speculation about when the company would lay down some local roots.

The company, valued at $US58.5 billion ($77 billion) and the highest-profile and fastest-growing pioneer of the cloud-hosted software-as-a-service revolution, is looking to the Asian region to be a key driver of its plans to push revenue well past the $US10 billion mark, having closed last year at $US8.4 billion.

Last year Salesforce's vice-chairman, president and chief operating officer Keith Block told The Financial Review it was planning to make its Australian operations the hub of its regional plans, and Mr Hawkins said the recent AWS deal was an example of barriers to further growth coming down.

"I think it's just one more example of us investing in Australia, which is really the cornerstone of our strategy in Asia," Mr Hawkins said.

"We're hiring people, we are growing our capabilities ... Australia is like the launch pad for all of Asia for us. We see it very, very strategically."

Mr Hawkins said that during the last completed quarter Salesforce's Asia Pacific operations had grown at a rate of 29 per cent, generating $US749 million in revenue, but that the region represented the biggest growth trajectory for the company worldwide.

His comments echo those recently made by Google's regional boss Karim Temsamani, who said the tech giant had recognised Asia Pacific was becoming the centre of the world in terms of being a leading indicator of consumer behaviour, and provided a far more lucrative growth profile than the US or Europe.

"I do think Asia-Pacific is the land of opportunity. I've spent a lot of time in Asia throughout my career and it's a huge opportunity and that's why we've been investing," Mr Hawkins said.

"Our headcount has grown 27 per cent on a compound annual growth rate the last three years across Asia Pacific, and while the company as a whole will be growing in the 20 per cent range for the foreseeable future in my opinion Asia-Pac will be leading the pack."

Go here to see the original:
Salesforce eyes regional growth after Australian Amazon Web Services cloud deal - The Australian Financial Review

Read More..