
Bitcoin Extends Weekly Decline Past $750 as US Dollar Rebounds After Fed Hike – TheStreet.com

Bitcoin (BTC) prices have fallen more than 20% this week, shaving more than $750 from their all-time high, as investors retreated from the cryptocurrency into the U.S. dollar following Wednesday's rate hike from the Federal Reserve.

Bitcoin was priced at $2,203 on the Bitstamp exchange in London at 14:30 local time, down $750 from the record high of $2,954.22 the exchange recorded on Sunday. The moves follow a 0.5% gain for the dollar index, which measures the greenback's strength against a basket of six global currencies, since the Fed lifted its key rate for the second time this year to a range of 1% to 1.25%.

Bitcoin prices have surged more than 190% so far this year, according to Bitstamp prices, amid a series of developments in the cryptocurrency space that have convinced investors it could soon become a more formalised medium of international commerce.

Morgan Stanley, however, doesn't believe that cryptocurrencies such as Bitcoin will be viable currencies in the future, seeing them more as investment vehicles than anything else. The firm argues that Bitcoin is a "more inconvenient way to pay" for goods and services than using a debit or credit card.

"Most regulators and investors view cryptocurrencies more as assets than actual currencies. Their values are too volatile and too hard to actually use for payment for most to consider them currencies," the bank said in a note that was posted on the Business Insider website. "Our conversations with some merchants indicate that, while cryptocurrencies might actually be attractive for them to operate their businesses, they find that the cryptocurrencies are far too volatile to be used."

Bitcoin fell 2.2% on Thursday in the early afternoon.


Read the original here:
Bitcoin Extends Weekly Decline Past $750 as US Dollar Rebounds After Fed Hike - TheStreet.com

Read More..

Why Bitcoin Can’t Serve As A Currency – Seeking Alpha

This will be rather a short and obvious article. What prompted me to write it was Bitcoin Investment Trust (OTCQX:GBTC) trading at a large premium to the Bitcoin market quote, as I explained in my article titled "Bitcoin And AMD: Déjà Vu All Over Again". Since then, Bitcoin lost 23.7%, and GBTC lost 42.4%.

Now, with Bitcoin having previously rallied so much, it's likely that many "investors" in this "digital currency" were convinced that not only did it serve its purpose as a currency, but that it was actually a pretty good one. However, to be a currency/money, a candidate usually has to fulfill three functions: to be a medium of exchange, a unit of account, and a store of value.

Bitcoin, unfortunately, cannot fulfill that last function (store of value). To be a store of value, a currency needs to keep its value reasonably stable over time. And, unfortunately, if Bitcoin can't keep its value over time, it also can't work as a medium of exchange - but more on that later.

First, let me focus on keeping value over time. This isn't about Bitcoin going up in value as much as it is about the volatility it suffers. For instance, Bitcoin was at ~$3,000 three days ago, and as I write this, it sits at $2,170, a ~28% drop from the top.

This kind of volatility has extreme consequences for anyone doing commerce (selling goods and services) and accepting Bitcoin as payment (as a currency). Many businesses trade on tight margins: most will have gross margins over 28%, but on average they struggle to reach 10% in net margins (after not just direct product/service costs, but all the overhead of running a business).

What this means is that most businesses cannot sell a product/service at 100, and then see the proceeds from the sale turn into 72 in a matter of days. This would risk turning profitable sales into unprofitable ones. Of course, those businesses could accept Bitcoin and then trade out of the accepted Bitcoin immediately (for USD or another currency).
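To make the arithmetic concrete, here is a minimal Python sketch of the margin argument; the sale price, net margin and drop are the illustrative figures used above, not data from any exchange.

```python
# A minimal sketch of the margin argument above, using the article's
# illustrative numbers: a sale priced at 100, a ~10% net margin, and the
# ~28% drop in Bitcoin's price between the sale and conversion to fiat.

sale_price = 100.0                         # proceeds at the moment of sale
net_margin = 0.10                          # typical net margin per the article
costs = sale_price * (1 - net_margin)      # 90.0 of costs behind the sale

btc_drop = 0.28                            # Bitcoin falls ~28% before conversion
proceeds = sale_price * (1 - btc_drop)     # the 100 becomes 72

profit = proceeds - costs
print(f"proceeds after drop: {proceeds:.0f}")   # 72
print(f"profit: {profit:.0f}")                  # -18: the sale is now a loss
```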

That seems like a solution, doesn't it? It so happens that this solution - which would be perfectly necessary - also represents the failure of Bitcoin as a currency. And funnily enough, this can be illustrated with another instance of a currency failing: Weimar Germany, known for its hyperinflation.

During the worst of the hyperinflationary days, workers were paid twice a day. Their spouses would go to their jobs, collect their wages and immediately spend them - since raging inflation meant as time went by, the same money would be worth less and less.

What does that mean? It means that selling goods or services in Bitcoin and then having to get rid of the Bitcoin as fast as possible is a direct parallel to what happens during a currency failure. It also means that if enough commerce took place in Bitcoin, it would probably put systemic downward pressure on its price, since each sale of a good or service priced in Bitcoin would represent an immediate outflow from Bitcoin (with no compensating inflow, since there would be no need to buy Bitcoin to buy the good or service in the first place).

Counter-Arguments

Obviously, one could come up with a host of reasons why this present difficulty won't last forever. For instance:

Conclusion

In short, the solution that makes it possible to accept Bitcoin as a means of payment (trading out of it immediately) is the very solution which represents the failure of Bitcoin as a currency. And now, if Bitcoin can't be a currency, what is it? In my view, it's two things:

As a side note: As of right now, with GBTC at $311 and BTC at $2,170, the GBTC premium has fallen to 54.3% from the 103% it traded at when I last wrote on it. It's still significantly more expensive to hold GBTC than to hold the underlying Bitcoins.
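For readers who want to check the premium figure, here is a back-of-the-envelope Python sketch; the BTC-per-share holding is an assumed figure chosen for illustration, not a number from the article.

```python
# A back-of-the-envelope premium calculation. GBTC and BTC prices are the
# ones quoted above; the BTC held per GBTC share is an assumption (~0.093
# at the time) used purely to illustrate the math.

gbtc_price = 311.00
btc_price = 2170.00
btc_per_share = 0.0928                      # assumed holding, not from the article

nav_per_share = btc_per_share * btc_price   # ~201.4 of underlying value per share
premium = gbtc_price / nav_per_share - 1    # ~0.54, i.e. roughly the 54% quoted
print(f"NAV per share: ${nav_per_share:.2f}")
print(f"premium over NAV: {premium:.1%}")
```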

Disclosure: I/we have no positions in any stocks mentioned, and no plans to initiate any positions within the next 72 hours.

I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.

Continue reading here:
Why Bitcoin Can't Serve As A Currency - Seeking Alpha

Read More..

Terror Thursday: It’s a Cryptocurrency Bloodbath – CryptoCoinsNews

The market correction that a number of analysts have predicted has hit, with leading cryptocurrencies posting double-digit losses in the last 24 hours. Market leaders bitcoin and Ethereum were not among the biggest losers, dropping 12.81% and 16.04% respectively in the last 24 hours, but their market cap losses were in the billions, falling to $37.4 billion and $28.9 billion, respectively.

Ripple, a distant number three in market capitalization at just under $10 billion, lost over 12%. NEM, number four, lost over 17%, while Ethereum Classic, number five, lost 13.77%. Litecoin, number six, suffered the least among the billion-dollar players. Eighth-placed IOTA was the biggest loser among the cryptocurrencies with more than $1 billion in market capitalization, falling 36.5% as its price dropped to $0.38.

All top 100 cryptocurrencies tumbled in the last 24 hours, according to coinmarketcap.com, except for four: Quantum Resistant Ledger, the number 41 cryptocurrency with an $81.4 million market capitalization, jumped 19.43%; LBRY Credits, number 57, posted an 18.24% gain; Xarum, number 62, gained 10.4%; and ZCoin, number 69, gained 9.58%.

The correction that began Monday continued after a breather yesterday, as bitcoin failed to launch a new rally towards all-time highs and rolled over after the bounce. Correlations are high once again, as is usual for a correction, and it's likely that bitcoin and Ethereum will dictate the trend of the coming days, with small-cap coins following the majors lower.

Bitcoin continues to trade near its lows from Monday, and it will likely head for a test of the $2,375 level as it clears its overbought momentum readings. The rising long-term trendline is found near $2,200, providing further strong support. The long-term picture remains bullish, but there is room for further correction after the strong rally since the end of March.

A 30%-50% correction, which has been the norm for bitcoin in the past, is a huge psychological burden that makes panic selling likely, usually just before the bottom. Because of this, buyers are advised to wait for the correction to run its course and for oversold readings to appear, even if that means buying at a higher price later on.

Analyst Nicola Duke of Forex Analytix predicted hefty price corrections for both bitcoin and Ethereum in late May. Duke said bitcoin could experience a 46.5% price correction from $2,800 after witnessing a record $2,791.70 high in late May. After reaching $2,800, Duke predicted it would fall as low as $1,470, marking a 46.5% drop from the late May price.

Duke expects the correction to be temporary, with the price recovering and continuing its upward movement through 2018. Her analysis uses Fibonacci retracement, which examines the peaks and troughs of successive up and down moves to project future price levels.
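As a rough illustration of how retracement levels are computed, here is a minimal Python sketch using the standard Fibonacci ratios; the $2,791.70 high is the one cited above, while the swing low passed in is a hypothetical placeholder, not a figure from Duke's analysis.

```python
# A minimal Fibonacci retracement sketch using the standard ratios.
# The swing high is the $2,791.70 record cited above; the swing low is a
# hypothetical placeholder, not a value taken from Duke's analysis.

FIB_RATIOS = (0.236, 0.382, 0.500, 0.618, 0.786)

def retracement_levels(swing_high: float, swing_low: float) -> dict:
    """Project support levels between a swing low and a swing high."""
    span = swing_high - swing_low
    return {f"{r:.1%}": swing_high - r * span for r in FIB_RATIOS}

for label, level in retracement_levels(2791.70, 900.00).items():
    print(f"{label} retracement: ${level:,.2f}")
```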

Wave two began in the fall of 2013; bitcoin bottomed out in January 2015 before rebounding for several months and then declining again. Duke said bitcoin is now in a third wave.

Duke expects the fourth wave to last 61.8% as long as the second wave did. This means the rally following the correction would begin in January.

Short-term traders are advised to wait until the correction runs its course and the short-term trend turns higher again before buying, while long-term investors who plan to hold the coins should prepare to add to their core holdings on the dips as buying opportunities emerge heading towards the targets of the move.

Featured image from Shutterstock.

See the rest here:
Terror Thursday: It's a Cryptocurrency Bloodbath - CryptoCoinsNews

Read More..

Phoenix Rise of the Bolivian Cryptocurrency Market – Coin Idol (press release)

Jun 15, 2017 at 13:47 // News

The cryptocurrency market in Bolivia is experiencing hard times, though the crisis is probably not going to last forever and sooner or later the market in the country will rise like a phoenix.

The government is now trying not only to ban the use of cryptocurrency but to persecute any promotion of it by any means. It has even arrested some people who were talking about bitcoin and other cryptocurrencies on social networks.

Fortunately, Bolivia is not the first country to be hostile to Bitcoin and blockchain. There are many other examples where governments tried to prevent people from using cryptocurrencies, but their efforts appeared to be useless. All that is needed is a strong community with advanced awareness of the technology.

Víctor Manuel Clavijo Jan, Spanish Ambassador of CureCoin, stated to Coinidol:

Russia and China tried to ban it in 2014 and in less than 4 years they had to reconsider it. In Russia, they are now planning to use it to combat money laundering.

The Bolivian cryptocurrency community is small and undereducated. Besides, according to Fernando Ontiveros, former systems analyst at Banco de Crédito BCP, a Peruvian bank in Bolivia, and currently director of engineering and a blockchain and DLT researcher at Mojix, an IoT software solutions platform, there are currently many pyramidal businesses that negatively affect how people see cryptocurrency.

Another expert, Jorge Kuljis, is an investor and board advisor at Minka, a technology company and bank-as-a-service hub for Latin America that provides a platform as a service for start-ups and fintechs using blockchain technology, with operations in Bolivia, Colombia and Peru connecting the main financial BPO companies and financial networks. He is also the founder of Sintesis, a financial business process outsourcing company in Bolivia, and an influential leader in the Bolivian blockchain business ecosystem. He added to Coinidol:

This year (2017), the bitcoin concept was used by a company to get money from small investors, offering a very high and risky interest rate in a pyramidal model frequently used by massive scams that are unsustainable. The problem was the scam, not the bitcoin, but unfortunately the problem was mistakenly associated with the name of the most known cryptocurrency.

According to Andre Torres, a Bitcoin expert in Latin America, former national director at the Blockchain Education Network in Brazil and currently community manager and benchmarker at GUNBOT, an automated trading bot for POLONIEX, BITTREX and KRAKEN, it is impossible to ban Bitcoin, no matter how hard the government tries. He added:

The internet today is mobile. We just have to look at African countries and their vast experience with money remittance using simple cellphones and SMS. And Bolivia has a differential: Chinese products flourish throughout the country, so access to online wallets and bitcoin-related services remains totally accessible. For the most savvy, altcoins like Monero provide good alternatives even for merchants. How could a political power effectively prevent access to telecom services? Impractical. And this is just one perspective; others exist. Current telecom infrastructure in Bolivia is the biggest bottleneck for socioeconomic development around the country. The blockchain, if allied to ICTs that are relatively simple to implement - like mesh networks and other forms of wireless access - forms a potent combination to generate new alternatives for the Bolivian economy. The government only perceives Bitcoin 1.0 applications, but the new possibilities brought by blockchain-based platforms like Ethereum aren't even being considered, thus opening opportunities for both national and international online enterprises.

The economic crisis that Bolivia is facing existed long before the current government. With a devalued currency and precarious conditions in many parts of the country, not excluding the most developed provinces, it's natural that information and communication technologies become the escape route from a potential economic collapse. Also, the corporations established in the country need newer, faster and more cost-effective solutions. Blockchain is here to solve issues like that. This way, even if cryptocurrencies are not massively adopted by the regular population, enterprises will be making use of them in different modes. Bolivia is neither an isolated country nor does it possess a Great Firewall like China's to prevent cryptocurrency adoption by companies or by the people.

Jorge Kuljis seems to agree with this opinion. He adds that regulators just need time to get acquainted with these new trends:

Fintech, blockchain and cryptocurrencies are new trends for regulators. They must learn about them as soon as possible, and global and regional institutions must work on lobbying issues with them. In my opinion, in the short term cryptocurrencies will not be used as digital money; they will be used as digital assets for cross-border transactions. The power of fintech and blockchain in global and regional financial services will force regulators to update their rules and laws. Blockchain will be used in the short term in digital identity, clearance and settlement, advanced distributed cryptography and compliance with bank APIs. Then, with this new ecosystem, the use of cryptocurrencies will be a "must". The emergent Bolivian fintech ecosystem must work with regulators to solve the big issue. Financial inclusion is a fundamental objective for the Bolivian government, and it can be heavily leveraged with fintech technologies.

View post:
Phoenix Rise of the Bolivian Cryptocurrency Market - Coin Idol (press release)

Read More..

A cryptocurrency for weed is crashing a day after it sponsored Dennis Rodman’s trip to North Korea – Yahoo Finance

A digital currency for the legal marijuana industry is plummeting a day after it sponsored Dennis Rodman's trip to North Korea.

PotCoin is a digital cryptocurrency, much like bitcoin, that was specifically developed to remove the need for cash transactions between marijuana consumers and dispensaries.

PotCoin plummeted 23% on Wednesday, just a day after it soared 97% following the publicity it received from sponsoring Rodman's trip.

The currency is valued at just over 13 US cents.

Marijuana is illegal under federal law, so the majority of banks won't take cash or open lines of credit for marijuana businesses. PotCoin was developed to facilitate transactions between marijuana consumers and businesses, removing the need for cash.

(PotCoin down. Source: Coin Market Exchange)


Read the original here:
A cryptocurrency for weed is crashing a day after it sponsored Dennis Rodman's trip to North Korea - Yahoo Finance

Read More..

Why automation driven by cloud technologies is becoming more critical for organisations – Cloud Tech

More than half of respondents in a survey carried out by managed cloud provider 2nd Watch say at least half of their deployment pipelines are automated, with 63% saying they can deploy new applications in less than six weeks.

The study, which garnered responses from more than 1,000 participants from US companies with at least 1,000 employees, found that companies embracing cloud automation were able to deploy new applications and workloads faster and more frequently.

Alongside the almost two thirds who said deploying new applications took less than six weeks, 44% said deploying new code to production took a day or less, while 54% said they deploy new code changes at least once a week. A similar number (55%) said they measure application quality by testing everything, while two thirds said at least half of all their quality assessments, such as lint and unit tests, are also automated.

"The survey results reiterate what we're hearing from clients and prospects: automation, driven by cloud technologies, is critical to the rapid delivery of new workloads and applications," said Jeff Aden, 2nd Watch co-founder. "Companies are automating everything from artifact creation to deployment pipelines and process, which includes metrics, documentation and data.

"The result is faster time to market for new applications, and less application downtime."

Earlier this month, a report from Puppet found particular discrepancies between higher and lower performing organisations when it came to automation. Top performing firms automated 72% of all configuration management processes on average, while lower ranked companies spent almost half (46%) of their time on manual configuration.

Read more:
Why automation driven by cloud technologies is becoming more critical for organisations - Cloud Tech

Read More..

Student records unintentionally made public on OU mail servers – Norman Transcript

NORMAN - Private and sensitive information about past and present University of Oklahoma students was available to anyone with a campus-issued email account.

Student records with details as sensitive as financial aid information, Social Security numbers and eligibility status could be accessed through a document-sharing system linked to campus emails over the course of a month. It was first discovered by OU's student-run newspaper, The Oklahoma Daily, which notified university administration and ran a story Wednesday about the type of records available.

Upon learning of the security issue (much of the data available was protected by the Family Educational Rights and Privacy Act, or FERPA), OU shut down the Microsoft file-sharing program Delve, which was available to students through the campus Microsoft Office 365 software. According to The Daily's report, users were able to search for documents and records with information about other students.

"At no point was the security of OU IT systems breached," said Matt Hamilton, registrar and vice president for enrollment and student financial services. "Rather, some sensitive files were inadvertently made accessible to OU account holders due to a misunderstanding of privacy settings."

In his statement, Hamilton contends no unauthorized person other than the author of the report accessed any of the files mentioned in The Daily's story.

Microsoft Delve works with another program, SharePoint, to allow users to share and access documents. Users place documents in SharePoint; Delve enables them to search for those documents.

"Any SharePoint site with the open privacy setting was searchable to any user within the OU system," Hamilton said. This is how The Daily was able to access the sensitive data in question.

In its story, The Daily notes that any data gathered for the purposes of the story was deleted once the story was published.

It also states no stories will be written based on any records found.

The records available, according to The Daily, ranged from scholarship money students received to Social Security numbers, academic performance and eligibility of student athletes based on drug test results, academic performance and recruiting violations.

The records were made available when the university moved SharePoint to cloud servers May 14, OU spokesperson Rowdy Gilbert said.

Hamilton said some OU departments used the program to share files with each other, which is legal under FERPA.

"However, in some cases, the privacy setting options of these sites were misinterpreted, inadvertently allowing access to any OU account holder," Hamilton said.

Delve remains shut down for all OU users. The SharePoint sites mentioned in The Daily's story have now had access restricted to authorized staff users only, Hamilton said.

"While there was no outside breach of our files, we understand and acknowledge concerns about the vulnerability of sensitive data," he said. "We rectified the situation immediately and can assure students that their FERPA-protected files are secure. Moving forward, we will continue to evaluate our privacy measures to ensure absolute protection of personal data."

Gilbert said since OU faculty and staff handle sensitive information daily, there are strict guidelines and expectations they are required to uphold.

"We have no evidence that this expectation has been violated," Gilbert said.

Students reacted to the report with concern. While there is no sense the information was made available on purpose, there is a worry the records were so widely available at all.

"I don't think the university was using the files and information for anything negative, but it's an issue that anyone, not just school employees, could look at or use that information," said Dan Williams, a junior studying political science. "I think taking it down is a great response, but I think they need to be constantly monitoring data inside of OU.

"We switch platforms all the time, and any time you make these changes, you have to make sure the data is safe. It's possible that private information is out in the public and we don't know about it, and that is very concerning."

According to The Daily's findings, the records that were available include:

29,000-plus cases of protected data disclosed

18,668 financial aid records of freshman classes from 2012-2016

4,585 Pell Grant recipients

626 semester GPAs for student athletes and managers

539 visa types and statuses for international students

133 semester GPAs for students on President Leadership Council

30 Social Security numbers

Read more:
Student records unintentionally made public on OU mail servers - Norman Transcript

Read More..

IoT apps and event-driven computing reshape cloud services – TechTarget

Tools are always shaped by their uses. When cloud computing first came on the scene, it was a form of hosted virtualization, and its goal was to look like a bare-metal server.

Infrastructure as a service (IaaS) shaped the earliest cloud services, and it still dominates public cloud as well as the private cloud software market. Even so, that doesn't mean it's going to be the source of future cloud opportunity.

Cloud providers are always strategizing for the future, and their plans reveal an important -- and already underway -- shift. Every major public cloud provider has added services to process events. In particular, providers are adding features to help developers build applications for the internet of things (IoT). Could these be the basis for the most transformational set of applications to come along since the internet itself?

Legacy applications follow a pattern that's decades old: Work comes to the applications that support it. In traditional cloud computing, users pay for the processing resources they use. The terms differ, but it's essentially a lease of virtual infrastructure. This is a direct mirror of what happens in a data center -- a server farm is loaded with applications and transactions are routed to the correct server in the pool. This approach is great where work is persistent, as in the case of a retail banking application that runs continuously.

Event-driven and IoT apps change this critical notion of persistence. An event can pop up anywhere, at any time. It would be wasteful, perhaps prohibitively wasteful, to dedicate an IaaS instance to wait around for an event. Or the instance might reside within a data center halfway around the world from where the event occurs. If all possible event sources were matched with traditional cloud hosting points, most would sit idle much of the time, doing little but running up costs.

The reason why there's a specific right or wrong place to process events is simple: delay. Most events have a specific response-time expectation. Imagine a machine that triggers spray paint when an item passes a sensor. Picture a self-driving vehicle that approaches a changing traffic light.

The information flow between an event and the receipt of the appropriate response is called a control loop. Most events require a short control loop, which means their processing needs to be close to the point of the event. That requirement forces event-handling processes to disperse out toward the cloud edge - and to multiply in number.

It's easy to see how the scarcity of events at a given point creates a problem of cloud efficiency and pricing for traditional cloud computing. It's also possible to have too many events. The cloud can allow for cloud bursting, or scaling capacity by spinning up multiple copies of an application component on demand, but it's not that easy.

Few applications written to run on a bare-metal server can seamlessly scale or replace failed instances. Those cloud capabilities aren't common in data centers, where legacy applications run. Moving the applications to the cloud doesn't add the features necessary to scale applications, either.

Multiple copies of an application component require load balancing, and many applications were not designed to allow any copy of a component to handle any event or request. Applications that work by assuming a string of requests in context can't work if half the string goes to one copy of the application and the other half to another. How do we make IoT apps scalable and resilient? They have to be rewritten.

Developers are doing just that, and big cloud providers are responding. In particular, they all see the same IoT-and-event future for the cloud. They have been steadily enhancing their cloud offerings to prepare for that future. Not only do the cloud giants offer special web services to manage IoT devices and connections, they now provide tools to support the kind of programming that IoT apps will demand.

The functional or lambda style of programming doesn't allow an application or component to store data between uses. As a result, all instances of the component can process an event. Cloud providers now offer functional or microservice support instead of simply providing infrastructure, platform or software as a service, because a function cloud is very different.
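A minimal sketch of what such a function can look like, written in the shape of AWS Lambda's Python handler convention (handler(event, context)); the event fields are hypothetical and only illustrate that the function carries no state between invocations.

```python
# A lambda-style, stateless event handler sketch. The handler(event, context)
# shape follows AWS Lambda's Python convention; the event fields below are
# hypothetical. Because nothing is stored between calls, any copy of this
# function, anywhere, can serve any event.

def handler(event, context):
    """Compute a response purely from the incoming event."""
    reading = float(event["sensor_reading"])   # hypothetical field
    threshold = float(event["threshold"])      # hypothetical field
    # All inputs arrive with the event; no instance-local state is read
    # or written, so instances are interchangeable.
    return {"trigger": reading >= threshold}
```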

Where is your function hosted in a function cloud? Everywhere. Nowhere. Functions are activated anywhere they're needed -- when they're needed -- and you pay when you use one. Function clouds for IoT, or any kind of event-driven computing, represent the ultimate flexibility and agility. They also demand that users take care to establish policies on just how much function hosting they are willing to pay for, a decision they'll have to make based on the combination of cost and those pesky control-loop lengths.

Amazon has even allowed for the possibility that IoT will demand cloud applications that migrate outside the cloud. Their Amazon Web Services (AWS) Greengrass platform is a software and middleware framework that lets users execute AWS-compatible functions on their own hardware. This capability will let IoT users do some local processing of events to keep those control loops short, but still host deeper, less time-critical functions in the AWS cloud.

The old cloud model made you pay for hosting instances. In the function cloud, you don't host instances in the usual way. You have extemporaneous execution of functions, as needed. This is what gives rise to the pay-as-you-go or serverless description of the function cloud, but that's short of the mark. You could price any cloud computing service, running any application, on a usage basis, but that doesn't make those cloud services scalable or easily optimized. Without these features, serverless is just a pricing strategy.

Developers will have to make changes in applications to accommodate IoT and function clouds. Almost every new program or service stores information, and this makes it difficult to scale. The rule of functional programming is statelessness, meaning that the output you get from a process is based only on what you provide as input. There are even programming languages designed to enforce stateless behavior on developers; it's not second nature.
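The contrast is easy to show in a short, hedged Python sketch: the stateful version breaks as soon as a load balancer spreads requests across copies, while the stateless version behaves identically on every copy.

```python
# Why instance-local state defeats scaling: two load-balanced copies of the
# stateful version would each keep their own counter and both emit 1, 2, 3...
# The stateless version depends only on its input, so any copy gives the
# same answer.

_count = 0                         # lives in one instance's memory only

def stateful_next_id() -> int:
    global _count
    _count += 1
    return _count                  # diverges across copies

def stateless_next_id(last_id: int) -> int:
    return last_id + 1             # output is a pure function of the input
```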

The notion of the function cloud is likely to accelerate a trend that's already started in response to the use of mobile devices and implementation of BYOD policies. Companies have found that they are creating application components designed to format information for mobile devices, interface with apps written for a variety of mobile platforms and provide consistent support from back-end applications often running in data centers.

These forces combine to create a two-tier model of an application. The device-handling front tier lives in the cloud and takes advantage of the cloud's ability to distribute applications globally. The cloud part then creates traditional transactions for the core business applications, wherever they are.

IoT is even more distributed than mobile workers, and some IoT events need short control loops. As a result, cloud hosting of the front-end part of applications could see explosive growth. That puts pressures on the off-ramp of this kind of two-tier application structure because many events might generate many transactions. These transactions can overwhelm core business applications. Cloud providers are working on this, too. Microsoft, for example, has a cloud-distributed version of the service bus typically used to feed business applications with work.
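The buffering idea can be sketched generically (this is not Microsoft's service bus API, just a stand-in using Python's standard library): a bounded queue absorbs bursts of event-generated transactions so the core application drains them at its own pace.

```python
# A generic buffering sketch, not any vendor's service bus API: a bounded
# queue sits between the event-driven front tier and the core application.

import queue
import threading
import time

work = queue.Queue(maxsize=1000)        # buffer between front tier and core

def front_tier_event(txn: dict) -> None:
    work.put(txn)                       # bursts pile up here, not on the core

def core_consumer() -> None:
    while True:
        txn = work.get()
        time.sleep(0.01)                # stand-in for the core's fixed pace
        work.task_done()

threading.Thread(target=core_consumer, daemon=True).start()
for i in range(100):                    # a burst of 100 events arrives at once
    front_tier_event({"id": i})
work.join()                             # the core drains the burst gradually
```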


Given that IoT is in its infancy -- and cloud IoT is even younger -- it's easy to wonder why cloud providers are already offering IoT features. There are three reasons. First, IoT could radically increase IT spending, and cloud providers want to grab some of that as potential new revenue. Second, IoT isn't the only thing that generates events. A lot of mobile worker interaction, for example, looks like event processing. Finally, functional programming techniques are being promoted for every kind of processing. IoT apps demand them. Developer tools and conferences are already describing how functional programming techniques can make programs better and more maintainable.

If you're writing functions for any reason, isn't using a function cloud inevitable?

That's the big question that every cloud provider and cloud user needs to think about. Fully scalable applications -- ones that can increase or decrease capacity under load and repair themselves by simply loading another copy -- are very useful to businesses. The functional programming techniques developed for IoT apps, and the function clouds to support those techniques, will remake programs.

Tools are defined by their uses, remember? Well, users are already seeing the cloud of the future in event processing, and IoT will accelerate that trend. IoT's potential to generate events over a wide area, in large numbers, while demanding short control loops will revolutionize cloud use.


Read the original post:
IoT apps and event-driven computing reshape cloud services - TechTarget

Read More..

Softcat among those trumpeting G-Cloud success – ComputerWeekly.com

Softcat has strengthened its position as a public sector player on the latest G-Cloud 9 framework, moving into a position where it can offer a significant number of cloud services.


The reseller can now provide 226 services through the framework to a range of public sector customers, including government, emergency services, health and education.

The G-Cloud services are split into lots that cover cloud hosting, software and support. Softcat operates across those areas and has tendered through the evolution of the G-Cloud process.

"Winning a place on this latest framework agreement reinforces our public sector offering and allows us to provide a huge range of services to our customers via a trustworthy procurement route," said Andy Bruen, public sector frameworks manager at Softcat.

"The G-Cloud framework is one of the more straightforward frameworks for our customers to procure under and offers a vast array of different cloud-based IT solutions," he added.

The latest iteration of G-Cloud came into force on 22 May and other public sector channel specialists have also been trumpeting their success in getting listed as approved suppliers.

Jason Clark, CEO and president at Proact, which can offer 33 services, said that it had been on both version 7 and 8 of G-Cloud.

"With a specialist team covering all aspects of the sector across the whole of mainland UK, Proact's services on G-Cloud 9 can support UK public sector organisations in their digital transformation journeys with flexible, affordable and on-demand services and solutions that can help provide public services in a more effective, more efficient and lower cost way to the taxpayer," he said.

Plenty of G-Cloud business should be going through the channel, with 50% of the top 50 suppliers being SMEs. So far £1.6 billion has been spent through the government's digital marketplace, with 56% of total sales and 64% of volume awarded to SMEs.

See the article here:
Softcat among those trumpeting G-Cloud success - ComputerWeekly.com

Read More..

Quantum Computing Technologies markets will reach $10.7 billion by 2024 – PR Newswire (press release)

Eighteen of the world's biggest corporations and dozens of government agencies are working on quantum computing or partnering with startups like D-Wave.

Near-term expectations for quantum computing range from solving optimization problems, quantum-encrypted communications, artificial intelligence, smart manufacturing & logistics and smart retail, through quantum computing in the cloud and molecular structure research.

Smaller quantum computers will make other contributions to industry (energy, logistics etc.), defense and national security intelligence, as well as other markets spanning from drug design to finance.

Even simple quantum computers can tackle classes of problems that choke conventional machines, such as optimizing trading strategies or pulling promising drug candidates from scientific literature.

The fierce competition at the national industrial and academic level is leading to a race for quantum supremacy.

The competitors are all worthy of respect, especially because they are striving for supremacy not just over each other, but over a problem so big and so complex, that anybody's success is everybody's success.

2024 market*: $10.7 billion. Two-volume report.

We are in the midst of a "Quantum Computing Supremacy Race", one that will result in groundbreaking computing power, enabling disruptive new quantum computing technologies that have the potential to change long-held dynamics in commerce, intelligence, military affairs and the strategic balance of power. If you have been paying attention to the news on quantum computing and the evolution of industrial and national efforts towards realizing a scalable, fault-tolerant quantum computer that can tackle problems unmanageable for current supercomputing capabilities, then you know that something big is stirring throughout the quantum world.

In a way that was unheard of five years ago, quantum physicists are now partnering with corporate tech giants to develop quantum computing capabilities and technologies as the foundation of a second information age.


According to the report, "Quantum Computing Technologies & Global Market 2017-2024", the global quantum computing market* will reach $10.7 billion by 2024, with $8.45 billion stemming from product sales and services and $2.25 billion from government RDT&E programs and funding.

The two-volume, 520-page landmark report is the only comprehensive review of the global quantum computing market available today. It is a valuable resource for executives with interests in the market, and it has been explicitly customized for the ICT industry, investors and government decision-makers to enable them to identify business opportunities, emerging applications, market trends and risks, as well as to benchmark business plans.

The report provides updated, extensive data on the leading 52 quantum computing vendors: 1Qbit; Agilent Technologies; Aifotec AG; Airbus Group; Alcatel-Lucent; Alibaba Group Holding Limited; Anyon Systems, Inc.; Artiste-qb.net; Avago Technologies; Booz Allen Hamilton; British Telecommunications (BT); Cambridge Quantum Computing; Ciena Corporation; Cyoptics; D-Wave Systems Inc.; Eagle Power Technologies, Inc.; Nano-Meta Technologies; Emcore Corporation; Enablence Technologies; Fathom Computing; Finisar Corporation; Fujitsu Limited; Google Quantum AI Lab; H-Bar Quantum Consultants; Hewlett Packard Enterprise Company; IBM; ID Quantique; Infinera Corporation; Intel Corporation; IonQ; JDS Uniphase Corporation; Kaiam Corporation; Lockheed Martin Corp.; MagiQ Technologies, Inc.; Microsoft Quantum Architectures and Computation Group (QuArC); Mitsubishi Electric Corp.; NEC; Nokia Bell Labs; NTT Basic Research Laboratories and NTT Secure Platform Laboratories; Optalysys Ltd.; Post-Quantum; QbitLogic; QC Ware Corp.; Quantum Hardware Inc.; Qubitekk; QxBranch; Quintessence Labs; Raytheon BBN; Rigetti Computing; SK Telecom; Sparrow Quantum; Toshiba.

Read the full report: http://www.reportlinker.com/p04838492/Quantum-Computing-Technologies-Global-Market-.html

About Reportlinker
ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.

http://www.reportlinker.com

Contact Clare: clare@reportlinker.com | US: (339) 368-6001 | Intl: +1 339-368-6001

To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/quantum-computing-technologies-markets-will-reach-107-billion-by-2024-300474271.html

SOURCE Reportlinker

http://www.reportlinker.com

See the original post:
Quantum Computing Technologies markets will reach $10.7 billion by 2024 - PR Newswire (press release)

Read More..