
Localethereum Becomes Localcryptos and Adds BTC Trading – Bitcoin News

P2P trading site Localethereum has rebranded to Localcryptos, allowing users to buy and sell both ETH and BTC without KYC. The platform has been providing peer-to-peer trading services for ethereum since 2017, but in a surprise announcement this week, the website revealed its new home, localcryptos.com, and rolled out its P2P trading service and fiat onramp to bitcoin users. Visitors to Localethereum will now find themselves redirected to the new multi-currency site, in a move which will give rival P2P exchange Localbitcoins (LBC) pause for thought.

Also read: Traders Bemoan New Localbitcoins Identity Requirements

Localcryptos is painting a target on the back of Localbitcoins, placing itself in direct competition not only by offering BTC trades, but by making it possible for users to import their reputation score from one site to the other. The platform announced: "When you join LocalCryptos, you can bring your profile reputation from LocalBitcoins. That way, you can keep your hard-earned reputation; you don't need to start over."

By making it as simple as possible for LBC users to carry over their positive reputation, Localcryptos has fired the first shots in a new battlefront for the P2P exchange marketplace. Localcryptos also elaborated on what it considers to be a number of issues with the long-running BTC onramp, with security at the top of the list. Localcryptos did not hold back in its criticism of the longstanding exchange, writing: "LocalBitcoins looks the same today as it did in 2012, and there's plenty wrong with it. Platforms like LocalBitcoins are centralized, custodial, and a far cry from private."

By comparison, Localcryptos is a non-custodial service where users retain their private keys, an advantage which will be welcomed by traders who are well aware of the fairly regular hacks and security breaches that centralized services fall foul of. The system in place on Localcryptos is similar to P2P bitcoin cash marketplace local.Bitcoin.com, which launched earlier this year.

The escrow feature on Localcryptos is non-custodial, using ethereum smart contracts for ETH trades and P2WSH for bitcoin, so whatever crypto you're trading, the funds are never held on a centralized server at any point. For anyone interested in exactly how the decentralized bitcoin escrow service is achieved, Localcryptos goes into the nuts and bolts here.
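To make the mechanism concrete, here is a minimal sketch of how a P2WSH escrow can work on the bitcoin side: coins are locked to the SHA256 hash of a witness script, for example a 2-of-3 multisig among buyer, seller and arbitrator, so no third party ever holds the funds. The script layout and placeholder keys below are illustrative assumptions, not Localcryptos' actual construction.

```python
import hashlib

# Standard Bitcoin script opcodes for a 2-of-3 CHECKMULTISIG witness script.
OP_2, OP_3, OP_CHECKMULTISIG = 0x52, 0x53, 0xAE

def push_key(seed: bytes) -> bytes:
    """Return a 33-byte placeholder 'compressed pubkey' preceded by its push opcode."""
    return bytes([0x21, 0x02]) + hashlib.sha256(seed).digest()

# Hypothetical witness script: any 2 of buyer, seller and arbitrator can spend.
witness_script = (
    bytes([OP_2])
    + push_key(b"buyer") + push_key(b"seller") + push_key(b"arbitrator")
    + bytes([OP_3, OP_CHECKMULTISIG])
)

def p2wsh_script_pubkey(script: bytes) -> bytes:
    """P2WSH locks coins to OP_0 <sha256(script)>: spending requires revealing the script plus valid signatures."""
    return bytes([0x00, 0x20]) + hashlib.sha256(script).digest()

print(p2wsh_script_pubkey(witness_script).hex())
```

In a scheme like this, a normal trade completes with buyer and seller signatures alone; the platform can at most co-sign in a dispute, which is what keeps the escrow non-custodial.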

As well as rolling out its service to bitcoiners, Localcryptos intends to extend its P2P trading service to other ERC20 tokens over time, and eventually plans to tackle non-fungible ERC721 tokens as well. P2P exchanges have often labored with low trade volumes and liquidity, an issue which news.Bitcoin.com recently covered in a report on decentralized exchange Bisq. If Localcryptos spreads itself too thin across too many tokens, there's the potential for liquidity issues to surface.

At present, however, with 100,000 users worldwide and 1,000 active users on any given day completing 200 trades, Localcryptos looks set to create another valuable decentralized gateway for BTC with minimal or no KYC. Coupled with the BCH marketplace available at local.Bitcoin.com, bitcoiners bemoaning the loss of LBC have new platforms to turn to at last.

What are your thoughts on Localcryptos.com? Let us know in the comments section below.

Disclaimer: This article is for informational purposes only. It is not an offer, solicitation or a recommendation, endorsement, or sponsorship of any products, websites, software, services, or companies mentioned. Neither the company nor the author is responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods or services mentioned in this article.

Images courtesy of Shutterstock.


Kai's been manipulating words for a living since 2009 and bought his first bitcoin at $12. It's long gone. He's previously written whitepapers for blockchain startups and is especially interested in P2P exchanges and DNMs.


Stock to Flow Analyst Sees Bitcoin Trading at $10,000 Before End of Year – CCN.com

In the last few days, bitcoin has been the subject of bearish calls. Many are calling for a drop to between $6,000 and $5,500.

Peter Brandt chimed in on the topic. The most-followed trader on Twitter said that the head-and-shoulders top for bitcoin doesn't look valid. But the analyst also charted a descending triangle breakdown and indicated a target of $3,905.

While bitcoin appears to be on a technical downtrend, PlanB believes that it is possible for the top cryptocurrency to regain its bullish composure. The analyst's stock-to-flow model suggests that the recent selloff is a deviation.

The stock-to-flow (SF) model is a method used to predict the value of a commodity like gold or silver by determining the asset's scarcity. The formula for scarcity, or hardness, is simple: take the amount of the commodity in inventory (the stock) and divide it by the amount mined or produced every year (the flow).
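As a quick sanity check, the arithmetic fits in a few lines of Python. The supply figures below are round, commonly cited approximations, not authoritative data.

```python
def stock_to_flow(stock: float, annual_flow: float) -> float:
    """Scarcity measured as existing inventory divided by annual new production."""
    return stock / annual_flow

# Gold: roughly 190,000 tonnes above ground, roughly 3,300 tonnes mined per year.
print(round(stock_to_flow(190_000, 3_300)))           # ~58, matching the BayernLB figure

# Bitcoin after the May 2020 halving: the block subsidy drops from 12.5 to 6.25 BTC.
blocks_per_year = 6 * 24 * 365                        # ~52,560 blocks at one per ~10 minutes
new_btc_per_year = 6.25 * blocks_per_year             # ~328,500 BTC of annual flow
supply_at_halving = 18_375_000                        # approximate circulating supply
print(round(stock_to_flow(supply_at_halving, new_btc_per_year)))  # ~56, near the cited 53
```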

Among precious metals, gold has the highest stock-to-flow ratio. According to BayernLB, gold has an SF ratio of 58, which is significantly higher than the SF ratios of silver, palladium and platinum. Based on this model, gold's market cap should be close to $10 trillion. Currently, the precious metal is worth $8 trillion.

While gold may be ahead of the curve, BayernLB predicts bitcoin's SF ratio will make tremendous gains in six months. The model indicates that the top cryptocurrency will have an SF ratio of 53 after the May 2020 halving. That would put bitcoin at a $1 trillion market cap.

Using the stock-to-flow model, PlanB can forecast the price of bitcoin without using technical analysis. So far, the analyst's model has been right on the money.

Bitcoin's recent selloff has put the price (red dot) below the SF value (grey line). According to PlanB, bitcoin trading below the SF value a few months before the halving is unusual. Thus, it won't come as a shock if bitcoin bounces hard and trades above $10,000 before the end of the year.

Even after a brutal dump over the last few weeks, bitcoin is still up by over 94% year-to-date. That's why Simon Peters, an analyst at eToro, also believes that the cryptocurrency will recover to $10,000. The trader told CCN,

In terms of a price target for the end of the year, I feel we'll get back to a $10,000 level and probably stay around, or just above, that level by the end of December.

The analyst added,

There are a few reasons for this. Firstly, given the fall in price we saw last week, various technical indicators are pointing to an oversold bitcoin market, and thus the opportunity to acquire more of it is arguably more attractive.

Peters also said,

With global currencies potentially losing further purchasing power going forward, over the long term this could be great news for bitcoin. Investors could look to crypto to protect their wealth, which in turn, increases demand. This coupled with a decreasing supply could cause prices to push higher.

It appears that PlanB is not alone in seeing bitcoin trading above $10,000. It's possible that the move down to $6,500 is nothing but a deviation.

Disclaimer: The above should not be considered trading advice from CCN. The writer owns bitcoin and other cryptocurrencies. He holds investment positions in the coins but does not engage in short-term or day-trading.


Bitcoin Price Recovers Above $7,000 But Markets Shiver in Extreme Fear – CCN.com

The Crypto Fear and Greed Index (CFGI) recently plummeted to its lowest value in months, signaling extreme levels of fear in the industry. The move comes after bitcoin dropped below $7,000, which brought out some of the most prominent crypto critics to spread fear, uncertainty, and doubt (FUD) around the market.

Peter Schiff, the CEO and chief global strategist of Euro Pacific Capital, entertained the idea that a head-and-shoulders pattern was developing on bitcoin's chart, which could trigger a sell-off to $2,000. Meanwhile, Mark Dow, a former US Treasury and International Monetary Fund economist, affirmed that bitcoin was dying. And Dr. Nouriel Roubini, a professor at New York University's Stern School of Business, called it "the most useless and over-hyped technology ever."

Regardless of the pessimism, historical data shows that the current levels of fear could be a sign that the market will recover.

Over the last week, more than $56 billion was wiped off the cryptocurrency market cap. The massive downturn took many investors by surprise, since the run-up to the bitcoin halving usually has a positive impact on the industry as a whole. But this time around, BTC dropped nearly 23 percent to hit a low of $6,600 on Nov. 25.

The sudden nosedive turned the market sentiment into extreme fear, according to the Crypto Fear and Greed Index (CFGI).

This technical index evaluates the emotions and attitudes that investors have towards bitcoin. It takes into consideration several factors, including volatility, volume, social media, surveys, and market dominance. Then, the CFGI turns these viewpoints into a straightforward number. A value of 0 means extreme fear, while a value of 100 represents extreme greed.
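To illustrate how a composite like this can be assembled, the sketch below scores each factor on the same 0-100 scale and takes a weighted average. The factor list comes from the article; the scores and weights are invented for illustration and are not the index's actual methodology.

```python
# Each factor: (score from 0 = extreme fear to 100 = extreme greed, weight).
factors = {
    "volatility":       (15, 0.25),
    "volume":           (25, 0.25),
    "social_media":     (30, 0.20),
    "surveys":          (40, 0.15),
    "market_dominance": (20, 0.15),
}

assert abs(sum(w for _, w in factors.values()) - 1.0) < 1e-9  # weights must sum to 1

index = sum(score * weight for score, weight in factors.values())
print(round(index))  # -> 25, deep in the "fear" end of the 0-100 scale
```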

At the moment, the CFGI is at 21, which implies extreme levels of fear in the crypto market. This is the lowest value that the indicator has reached since Sept. 26, when it fell to 12.

Despite the current market sentiment, Warren Buffett, one of the top investors of all time, once said:

Be fearful when others are greedy, and greedy when others are fearful.

Based on historical data, this could be the case. Every time the CFGI goes into extreme fear, bitcoin tends to surge immediately after. On Oct. 24, for instance, the index was at 20, indicating extreme fear, but the next day BTC skyrocketed over 40 percent to hit a high of $10,500.

According to the former senior market analyst at eToro, Mati Greenspan, the recent sell-off that the market went through triggered a massive amount of buy orders on Bitfinex, a cryptocurrency exchange headquartered in Hong Kong. He maintains that bitcoin buy positions hit their highest level since February.

It's very clear what the market thinks of this latest selloff!

Bitcoin buy positions (blue line) are at their highest level since February, while bitcoin sell positions (orange) are near their lowest level ever.

Data from @bitfinex displayed by @tradingview. pic.twitter.com/yaT0wQpGgs

Mati Greenspan (@MatiGreenspan) November 26, 2019

Along the same lines, the CEO of Three Arrows Capital, Su Zhu, believes that $6,775 was a key price point for bitcoin.

If this was bottom (6680~ish on normal venues, 6775 on bfx), then it has some similarities to the low 3k capitulation in that the low was fully absorbed by buy walls rather than culminating in a flash wick (as what happened in 2015)

Liquidity is a lot more efficient nowadays https://t.co/zSdiZU6mwe

Su Zhu (@zhusu) November 26, 2019

The economist Alex Kruger argued that the Crypto Fear and Greed Index is relatively accurate at pinpointing bottoms. Still, it does not take into consideration multiple factors that are crucial to determining the direction of the market. Therefore, time will tell whether the current state of extreme fear could indeed signal an upswing.


AI, Cloud Computing and IoT: How digitalisation is driving dramatic IT changes in healthcare – Techerati



The IT architecture at the heart of today's healthcare innovations is shifting to the edge. Schneider Electric's Jamie Bourassa explores

Today, IT within the healthcare industry is undergoing profound changes. This has been driven, in part, by the development of advanced new treatments, including robotics and analytical imaging, and by robust data networks, which enable the lessons learned by pioneering medical practitioners to be distributed to peers around the world more rapidly than ever before.

For healthcare providers, ensuring a quality environment of patient care is paramount. New technologies, from digital imaging to security-enhancing baby finders to always-on wearable technology, are helping to reduce errors, improve care, and decrease costs simultaneously.

In its 2019 Global Health Care Outlook report, Deloitte states that "there is an exponential increase in the pace and scale with which digital healthcare innovations are emerging. Digital technologies are supporting health systems' efforts to transition to new models of patient-centered care and helping them develop smart health approaches to increase access and affordability, improve quality, and lower costs."

The disruptive digital technologies described by Deloitte include blockchain, Robotic Process Automation (RPA), cloud computing, Artificial Intelligence (AI) and Internet of Things (IoT) applications, all of which are helping medical professionals diagnose and treat patients with increasing speed, quality and accuracy. Underneath them, however, the role of the IT system, and in some cases the data centre, has become increasingly important.

With the emergence of these disruptive technologies, the adoption of a new hybrid IT system, combining the ubiquitous nature of the cloud with local computing resources closest to where healthcare professionals both create data and need access to it, is changing the business models of many health organisations and helping to solve many of their associated problems.

Consider, if you will, the impact of robotics on new medical treatments, specifically those used in modern surgical procedures. It is now possible for the most advanced medical instruments to carry out the most complex surgeries, under the control of a doctor who is positioned in a room separate from the patient undergoing treatment.

In this case, the surgical team will be surrounded by monitors and control systems, which they use to direct robotic equipment through the procedure. Miniature cameras operating both close to, and often inside, the patient relay images that are analysed by the on-site IT equipment, allowing surgeons to make life-changing decisions with greater accuracy and less risk of error than ever before.

Although not a subject yet synonymous with the healthcare industry, such IT equipment is an archetypal example of an edge computing deployment.

Latency, for example, is a vital factor; the images have to be relayed and analysed in real time for the treatment to be effective. Using a remote server somewhere in the cloud is a poor option for such applications, and so the necessary IT equipment must be positioned close to the point of use.
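A back-of-envelope budget makes the point concrete: at 60 frames per second there are only about 17 ms between video frames, so a wide-area round trip eats the entire budget. The round-trip and inference times below are illustrative assumptions, not measurements from any real deployment.

```python
frame_budget_ms = 1000 / 60      # ~16.7 ms between frames of a 60 fps surgical video feed

# Assumed network round trip plus per-frame analysis time for each placement.
paths_ms = {
    "on-site edge rack": 1 + 5,     # ~1 ms LAN round trip + ~5 ms local inference
    "regional cloud":    40 + 5,    # ~40 ms WAN round trip + the same inference time
}

for name, total in paths_ms.items():
    verdict = "within" if total <= frame_budget_ms else "misses"
    print(f"{name}: {total} ms per frame -> {verdict} the {frame_budget_ms:.1f} ms budget")
```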

Typically, racks of IT equipment, or micro data centres, will be found in networking closets near the control room in which the surgeon operates, as will Uninterruptible Power Supplies (UPSs), which are used to protect the equipment itself from power surges or outages. In this case, resiliency and availability are absolutely essential to the healthcare organisation.

Another consideration might be Artificial Intelligence (AI) or Machine Learning (ML) functionalities. For example, as imaging information essential to the surgical operation is analysed locally, key data points are shared via the cloud with other medical systems around the world.

These data pools will analyse and document outcomes of similar surgeries that have taken place, forming a feedback loop that uses a database of crowd-sourced information. In this way, the lessons learned by surgeons pioneering advanced medical techniques can be disseminated around the world rapidly, increasing awareness of new and potential treatments that benefit humankind in a manner that was previously unimaginable.

Finally, data security is another key aspect for the healthcare IT professional. The IDC Data Centre Operational Survey, published in January 2019, found that 45 percent of respondents cited data security and compliance as the number one issue they are tackling. In the same survey, IDC found that 17 percent of respondents had also experienced a physical security breach in the past year. Furthermore, IDC's March 2018 Enterprise Data Centre Edge Survey highlighted cyber and physical security as top concerns for businesses exploring edge computing deployments.

Since the technologies that run healthcare are becoming more and more an extension of IT, these assets require a higher degree of both physical and cyber security. How does one, for example, ensure that patient information, and indeed data centres and server rooms, remain safe and secure at all times? There are of course the software and cyber security services that an IT manager must employ, but the biggest risk to patient information remains human error: an unauthorised person gaining physical access to the IT system.

Physical security therefore becomes paramount to the healthcare organisation, and whether through ID cards, environmental monitoring or secure access to IT racks and network closets, the protection of patients' data remains crucial.

It's clear that as healthcare evolves, so will the technology architectures that support it. However, in order to enable these technologies, healthcare facilities need a rock-solid physical infrastructure to support their assets.

Edge computing is the IT architecture required to deliver these new and digitised medical services, ensuring that patient care and treatment remain the primary concerns for medical professionals and not the resiliency of their IT systems.

So why should IT managers within the healthcare industry take note of the edge? The answer is simple. This computing architecture is directly supporting tomorrow's advancements in medical surgery, patient treatment and recovery.

In the future, as medical services continue to become more digitised, the requirements that accompany them, including continuous power, avoidance of internet congestion, low latency, IT and data security, and rapid access to patient information, can only be met by localised edge computing systems.


The Impact of Cloud Computing on the Insurance Industry – Doxee

Cloud computing is having a huge impact on the insurance industry, with benefits for internal processes, new customer acquisition, and policyholder loyalty. Indeed, it has had a huge impact on all sectors: on business processes, production dynamics, and relationships with customers, users, employees, and suppliers.

We'll start by consulting industry data to get a better understanding of this impact.

According to a 2018 Gartner study, cloud users will double by 2021, while the market around this technology will grow from $153 billion in 2017 to an estimated $302 billion by 2021. According to another Gartner report, by 2022 about 90% of organizations will use cloud services. Moreover, skills in cloud computing are the most requested by companies in any sector, according to research from LinkedIn.

For more information on cloud computing, the different types of systems, how to best exploit the potential of this sector, and the most important future trends, read this article.

In this post, we'll be focusing on a specific sector: insurance. We will focus on three aspects that together cover an insurance company's full range of action: improving internal processes, winning new customers, and retaining existing ones.

But first, let's start with a quick look at how digital transformation has changed the sector.

Digital transformation has profoundly revolutionized the world of insurance.

As with other sectors, digital has greatly accelerated processes and helped simplify the dialogue between companies and customers. It has also resulted in a market that is now crowded with new players, making the battle for loyalty more difficult than ever before.

One way to summarize these changes is to consider that the user (and therefore the insured), once a figure at the tail end of business processes, is now at the center of the business. This emphasizes the fundamental importance of personalization and the need for greater investment in Customer Service and Customer Communication departments.

To go into more detail, Bain & Company and Google have identified seven key technologies that are driving digital transformation in the sector. These are technologies that have already radically transformed the world of Insurance and that will have an even greater impact in the future:

Given this brief but fundamental list, one thing is clear: these drivers of innovation all rely on cloud computing, with its increased computing power, simultaneous reduction in costs, greater flexibility and scalability, and enormous possibilities from an omnichannel perspective.

Next, we'll identify three main themes around the most important advantages of cloud computing in the insurance industry ecosystem.

As anticipated, the first advantages of cloud computing systems are seen inside the organization, chief among them the financial ones.

Companies that use cloud systems greatly reduce the cost of purchasing hardware and software, thanks to on-demand and pay-per-use models. They no longer have to buy local servers and data centers, which require specialized personnel to manage and maintain, take up physical space, and consume electricity 24 hours a day, 7 days a week.

And, since most services are provided on demand, you can access abundant computing resources quickly, easily, and with the flexibility your business needs, all without an expensive hardware or software investment.

All of this helps optimize performance and internal processes, not least because hosting platforms, software, and databases remotely frees up memory and computing power on individual machines within the organization.

Optimization and efficiency also apply to the production of documents, such as policies, forms, and contracts of various kinds. This constitutes a very expensive (and complex) element of insurance company processes. Today, such processes can be managed in a totally digital and cloud-native way.

This is exactly what Doxee is doing in collaboration with RGI, a leading European insurance company, through the PASS_Insurance 4.0 platform (to learn more, follow this link), which supports dialogue with customers (more on this point below).

Cloud computing allows you to collect and analyze large amounts of data (Big Data) and, above all, to select the most important, functional and meaningful data (in other words, Smart Data and Deep Data).

It is a matter of going in search of the digital traces that we all leave, daily, on the web.

It's important to underline that, in Italy, millennials are the largest web audience, at around 13 million people. They prefer the web (76% of them are permanently connected) to research and choose the products that fit their needs, and they are also more likely to give away their data in exchange for newsletters, downloads, requests for quotes, and other useful information or services.

Once these traces have been collected (in an omnichannel way, of course), it becomes easier to identify the real audience and to segment it further, so that it can be approached with targeted actions that are as tailored and personalized as possible.

This data-driven approach is by far the most effective way to secure new customers.

According to an analysis by Bain & Company, winning a new customer costs six to seven times more than retaining an existing one through a satisfactory Customer Experience.

From this, we can understand why all companies are investing resources and attention on the Customer Service department. Here, the objectives are to increase engagement first and foremost, to retain customers (customer loyalty), and to initiate effective up-selling or cross-selling actions.

Now, what is the most effective way to improve the dialogue with customers, to make it effective and increase the sense of trust?

The answer is a strategy that is not new: it's about addressing each customer, each person, in a different way, depending on their unique characteristics, behaviors, and needs. This personalization is at the center of today's strategies and future trends.

By exploiting the power and opportunities of cloud computing, companies can get to know their customers in greater depth. This is the frontier of personalization, which goes beyond segmentation to arrive at a truly one-to-one dialogue between company and customer, in an interactive and omnichannel mode.

Think of how fundamental this is in the insurance sector, which is characterized by a large number of potentially very slippery touchpoints (think of the collection and payment phases, the claims process, and dealing with unsatisfied customers).

That's why more and more insurance companies are relying on organizations that specialize in both cloud and personalization services. Doxee has been involved in this sector for years, and collaborates regularly with leading players such as Axa, Credem, and PosteVita.


It's in the Cloud, So It's Secure . . . Maybe! – Security Boulevard

Since the introduction of cloud computing, more and more companies have been flocking to it, because it has proven to be cost-effective and inherently more secure than on-premises data centers. However, no one has ever claimed that making the switch magically happens by pressing a button. Cloud computing needs to be properly managed and configured, and the processes and policies that protect the data and applications residing in the cloud need to be developed and continuously monitored to stay within best practices.

Cloud providers offer a higher level of infrastructure security, whether it's protecting data through physical security (armed guards behind fenced-in facilities, with dual or triple authentication to get in to see the actual servers) or making sure the underlying server software is patched. But they cannot and do not ensure that a client's apps and processes are secure, just as on your own on-premises data center. That leaves you exposed to workload and application-layer attacks.

With the switch to cloud computing, IT professionals need to develop a new skill set, especially in the security area. The influx of valuable data into single locations makes cloud providers a prime target for malicious activity. Case in point: Capital One.

Capital One accidentally left its data exposed by poorly configuring its infrastructure and the security surrounding it. When a company is breached in the cloud, this is usually the most common reason. Basically, it comes down to the fact that humans make mistakes.

With more and more businesses moving to the cloud, companies may find it beneficial to hire a cloud security consultant to analyze their setup, something that might have helped Capital One avoid this public relations nightmare. Security consultants will examine how an enterprise processes and stores data and then craft a custom governance protocol for comprehensive protection. Professional security assessments are instrumental in helping ensure cloud-service providers meet your compliance needs and responsibly protect your valuable data.

Regardless of the infrastructure, tools or processes in place, you must also continuously monitor for dubious activity. The threat landscape is constantly evolving, and your security posture must evolve as well. Unfortunately, this is on top of everything else that needs to be done to make sure you are meeting your company's needs. That is why many firms are finding it necessary to hire cloud security professionals or firms that specialize in this area, especially in regulated industries where 24/7 monitoring is necessary. Because anyone in the industry knows that most events happen on Friday afternoon at 5:30.

Securing Shared Infrastructure Whitepaper

Cloud computing is different from traditional on-premises IT. The cloud is a shared infrastructure, and when using shared infrastructures, organizations do not control much of the technology that underlies the cloud services they engage, especially networking. Shared infrastructures have their own security considerations that should be assessed before embracing the cloud.




*** This is a Security Bloggers Network syndicated blog from CCSI authored by CCSI Team. Read the original post at: https://www.ccsinet.com/blog/cloud-secure-maybe/


Study shows continued cloud maturation in Nordics with manufacturing a standout – Cloud Tech

A new report from Nordic IT services provider Tieto has found the region's cloud landscape has matured significantly since 2015, from both a strategic and an operational perspective, with Sweden and Finland fighting for supremacy.

The study, the latest Cloud Maturity Index, based on responses from almost 300 decision-makers across the public and private sectors in the Nordics, classed almost one in five (18%) organisations as mature, while more than a quarter (27%) were seen as proficient, 42% at a basic level, and 13% immature.

In other words, it's a broad church, with just a slight emphasis on the have-nots rather than the haves. Those described as mature use cloud services to a larger extent, with virtually everything (97%) being cloud-based, and are much likelier to exploit the technology's advantages than their immature cousins. Being classified as a mature cloud business means approximately 20% lower IT operating costs and, on average, 15% more efficiency in increasing business competitiveness.

When it came to specific industries, finance came out on top for Nordic organisations, maintaining its lead previously forged in the 2015 and 2017 surveys. The public sector continues to report the lowest strategic and operational maturity. Yet the gap is closing when it comes to traditionally slower verticals, with manufacturing proving particularly effective. Whereas finance scored 6.0 in 2015 and 6.3 this time around, the manufacturing industry has leapt to 6.0 from 4.4.

The report also noted the importance of environmental factors in organisations' initiatives. This is not entirely surprising, given that the cool climate has enabled many data centre providers to set up shop in the Nordics. Approximately half of the companies polled said they were already considering issues such as energy consumption or CO2 emissions as part of their cloud strategy. Again unsurprisingly, mature cloud organisations were considerably further ahead on environmental initiatives than their immature brethren.

Despite the report's figures, again ranked out of 10, showing Sweden and Finland comfortably ahead of Norway, Tieto's head of cloud migration and automation, Timo Ahomaki, argues it is Finland that should be celebrating. Data sovereignty, Ahomaki says, is an area which is quite polarised in Sweden, while Finland's more advanced cloud security puts it at the forefront of the Nordic public sector.

Regular readers of this publication will be aware of the various initiatives which have taken place around the emerging data centre industry in the Nordics. As far back as 2015, CloudTech reported on a Swedish government study, later put into legislation, to give tax breaks to data centre providers. Last year, DigiPlex announced a project whereby waste heat from its data centres would be used to warm residential homes in Oslo.

You can read the full report here (email required).



As Per New Report on Hybrid Cloud Computing Market Will Touch a New Level in Upcoming Years | Microsoft Corporation, Cisco Systems, and Amazon Web…

The hybrid cloud computing market is expected to grow to more than USD 94.6 billion, at a CAGR of 18.0%, during the forecast period.

To provide a global outlook on hybrid cloud computing, The Research Corporation has added a new statistical study to its massive database. The analysis considers existing industry players as well as upcoming startups, helping businesses make informed decisions. The report's researchers apply Porter's Five Forces and SWOT analyses, and the report comprises market trends and holistic business information, pinpointing revenue, growth, and profit over the forecast period. It also provides a complete analysis of drivers, restraints, and market opportunities.

Ask for sample copy of this report@ https://www.theresearchcorporation.com/request-sample.php?id=51134

Profiling Key players: Equinix Inc., Hewlett-Packard, VMware Inc., Oracle Corporation, Citrix Systems Inc., Rackspace Inc., IBM Corporation, Microsoft Corporation, Cisco Systems, and Amazon Web Services Inc and many more.

Hybrid Cloud Computing by Key Component

Hybrid Cloud Computing by Service Type:

The report covers both new hybrid cloud computing competitors and recognized players. It presents the state of the hybrid cloud computing market and forecasts for the world and key regions, introducing vendors, regions, product types and end-use industries, and assessing product types and end-use industries globally and in major regions.

The report gives analysts, managers and hybrid cloud computing industry experts access to research graphs, tables and the data they need to conduct their own analysis. Integrating this information with its research findings, the report predicts a strong rise in this market across product segments and in all regions.

Ask for a discount on this report@ https://www.theresearchcorporation.com/ask-for-discount.php?id=51134

This report provides a comprehensive analysis of:

Highlights of the Report:

Lastly, this report provides market intelligence in the most comprehensive way. The report structure has been kept such that it offers maximum business value. It provides critical insights into the market dynamics and will enable strategic decision making for the existing market players as well as those willing to enter the market.

Table of Contents:

For more information ask our experts @ https://www.theresearchcorporation.com/enquiry-before-buying.php?id=51134

If you have any special requirements, please let us know and we will offer you the report as per your requirements.

About Us:

The Research Corporation symbolizes current market trends in the global industry. Our mastery in the field of market insights and analysis makes our company an ideal platform for clients seeking pioneering research in the lucrative global market fields. At the Research Corporation, we work diligently on delivering prudent market insights with sound market intelligence; with that, we take pride in delivering comprehensive industry insights based on the market, market competitors, products and global customers. Through our erudite market approach, The Research Corporation has become synonymous with delivering the best product service.

Contact:

The Research Corporation

William K (Sales Manager)

1632 1st Avenue, New York, NY 10028, USA

+1 929 299 7373

sales@theresearchcorporation.com

Website: http://www.theresearchcorporation.com


Infarm plants its blend of vertical farming and cloud computing in QFC grocery stores – Yahoo Tech

KIRKLAND, Wash. The Seattle area offers a rich smorgasbord of geeky tech-as-a-service offerings ranging from software as a service, to gaming as a service, to pizza as a service.

Now you can add farming as a service to the list.

That's what Infarm is going for, with hydroponic plant-growth cabinets that shrink the acreage needed to grow fresh greens down to a grocery-store aisle. The startup, based in Berlin, Germany, has just opened its first North American farms inside a pair of QFC supermarkets east of Seattle, at Bellevue Village and here at Kirkland's Urban Plaza.

"It's a merger of agriculture and technology," Emmanuel Evita, Infarm's global communications director, told me during today's first harvest in Kirkland.

The process began a few weeks earlier at a distribution center in Seattle's Georgetown neighborhood, where the seedlings for herbs, kale and lettuce got their start. After about a week, Infarm's urban farmers transferred the plants to on-site grocery cases, where they've been spending the past few weeks reaching maturity in a closed, temperature-controlled environment.

LED lights shine with a magenta glow that's optimized for growth, while a hydroponic system feeds and waters the greenery. The plants sit in circular trays that are designed to give the roots more room as they grow. Infarm says its process uses 95% less water and 75% less fertilizer than soil-based agriculture.

Infarm's cloud-connected sensors keep track of more than 200,000 plants that cycle through the system every month.

"The idea is to have a global network that's also hyperlocal," Evita said. "It's grown in stores, hyperlocal. But it's global: Each of these farms is connected to the cloud, if you will, where we collect data from all of our farms around the world. At any farm, at any time, we know the pH, the temperature, the nutrient levels. We know everything about what's happening to the plants, and we can adjust accordingly."


Infarm was founded in Germany back in 2013 by a trio of entrepreneurs, and has since spread out to supermarkets across Europe. Among its partners are retail heavyweights including Amazon Fresh in Germany, Marks & Spencer in Britain, and Metro in France. In June, Infarm announced a $100 million funding round led by London VC Atomico.

The company is one of several ventures aiming to capitalize on vertical farming, with varying degrees of success. One such startup, Plenty, has attracted funding from Amazon CEO Jeff Bezos and other high-profile techies but announced earlier this month that it was putting its plans to build a 100,000-square-foot indoor farm in the Seattle area on hold.

About a year ago, QFC executive Suzy Monford came across Infarm during a visit to Europe. She sold her colleagues at Kroger, the grocery chain's parent company, on the idea of forming the first U.S. partnership. Infarm's cabinets are due to start popping up at about a dozen more QFC stores after the holidays and could put down roots with Kroger's other divisions as well.

The value proposition focuses on getting fresh produce, including salad staples such as lettuce and kale as well as basil, cilantro, dill, mint and parsley, to customers who rarely get out of the city or down to the farmers market.

"We are hoping to shorten the distance between where food is produced and where it's consumed," Evita said. "In 2050, there'll be 7 billion people living in cities, and we want to do what we can to help feed these people in a sustainable way. What you see here is the beginning."

He offered me a snippet of the just-harvested cilantro with gloved hands. After rolling the leaf to release its aroma, I chewed the sample and was reminded of the fresh leaves of clover and lamb's quarters that I munched on when I was a farmboy in Iowa. It was enough to make me believe Infarm's claim that conventionally grown greens lose 45% of their nutrients by the time they arrive at the supermarket.

But don't just take my word for it: Christiane Leibrandt, a German-born transplant to Kirkland, was the first shopper to sample the basil and give a review. "Really delicious," she said. "I love basil and use it a lot. The flavor is really rich."

Freshness comes at a premium: Infarm's lettuce sells for $2.99, compared with the $1.99 price for a head of organic leaf lettuce one aisle over. But that $2.99 compares more favorably with the price charged for other brands of living, hydroponically grown butter lettuce.

The acid test came when I brought a head of Infarm's caravel lettuce home and pulled off a leaf for our canaries. The birds love the lettuce we grow in the back garden, but they're not so crazy about store-bought stuff.

For what it's worth, the leaf was pecked down to the stem in half an hour.


‘Guess What, There’s A Cost For That’: Getting Cloud & AI Right – Breaking Defense

DETROIT: "I get so frustrated," fighter pilot turned Microsoft exec Mark Valentine said, "when senior program officials and my former colleagues from the Air Force [ask me], 'Dude, I need to get me some of that AI.' And I shake my head."

"To do what?" Col. Valentine asked the AUSA conference on autonomy and AI here. "What are you trying to accomplish?"

Then there are the customers who know just enough to be dangerous, added Leonel Garciga, an expert in military intelligence IT on the Army headquarters staff (G-2). He's gotten requests like: "Hey, Leo, I want to move my instance of SharePoint" (a popular collaboration tool) "to a Google cloud platform" (a popular cloud provider). "And I said, really? That sounds like not a great idea," Garciga recalled.

Army soldiers access the DCSG-A intelligence system during an exercise.

Why? SharePoint is a Microsoft product that comes standard on Microsoft's cloud service, Azure, Garciga tried to explain. Why would you pay Google to do that integration work all over again?

And then, Garciga said, "I get a blank stare."

Here's an analogy: Imagine you want a car with a sunroof. Do you buy a car that has a sunroof, or do you buy a car that doesn't have one and pay a mechanic to take a blowtorch to it?

You really, really, really don't want to reinvent any wheels by integrating new software you don't have to, agreed Matt Carroll, a former Army officer who's now CEO of startup Immuta. Of course, he noted, the companies selling you that software won't necessarily warn you. All too often, they believe they can just hand their product over, it goes into cloud infrastructure, "and it's magic and it works," he said. In reality, he went on, in one case, "it took us a year-and-a-half just to install our product. A year and a half!"

Installation and integration aren't the only hidden costs, Carroll continued. The Defense Department wants cloud computing so it can pool data previously scattered across the bureaucracy, making it easier to analyze; machine learning in particular requires a massive central collection of data to train the algorithms. But running sophisticated analytics, let alone AI, requires much more computing power than just dumping 1s and 0s in a database, and it's correspondingly costly.

Modern machine learning AI relies on the fact that, in any large set of data, there will emerge clusters of data points that correspond to things in the real world.

"It is very different than a database, very different than cold storage," Carroll said. Analytics run in that compute infrastructure, "and guess what? It's wicked expensive. It's like 10 times the cost."

People freak out: "Holy crap, how am I going to pay this bill?" he said. Well, the whole point of moving your data to the cloud was to make it easier for people to use it, and now they're using it more. "That's what you want to do!" he said. "Guess what, there's a cost for that."
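A toy monthly bill shows the effect Carroll describes. The rates below are placeholder assumptions, not any provider's actual pricing, but the shape of the result, with analytics compute costing close to an order of magnitude more than storage, is the point.

```python
# Cold storage: simply parking the data.
storage_tb = 50
storage_usd_per_tb_month = 23                 # assumed object-storage rate
storage_bill = storage_tb * storage_usd_per_tb_month

# Analytics: compute running against that data around the clock.
analytics_nodes = 4
node_usd_per_hour = 3.00                      # assumed accelerated-instance rate
hours_per_month = 24 * 30
analytics_bill = analytics_nodes * node_usd_per_hour * hours_per_month

print(f"storage:   ${storage_bill:,.0f}/month")
print(f"analytics: ${analytics_bill:,.0f}/month ({analytics_bill / storage_bill:.1f}x storage)")
```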

Such unpleasant surprises in cloud billing were all too common across the defense industry as well, Garciga said, but industry has learned how to manage them, while the Pentagon has not.

"All major defense contractors that are here, I talk to the folks on the IT side, and I get the same exact story: 'Leo, we went to cloud and we had like 3,000 PCs out there, and people were just leaving stuff running, and we had a bazillion dollar bill,'" Garciga said.

The companies responded by instituting new rules for cloud use, much of it the 21st-century equivalent of reminding employees to turn the lights off before leaving the building, "but that lesson has not been learned by DoD yet," he said. "That lesson has definitely not been learned by the Army."
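Those "turn the lights off" rules reduce to simple policy checks like the sketch below: flag anything idle past a cutoff unless it is tagged always-on. This is pure logic over invented example data; a real implementation would pull instance metadata from a cloud provider's API instead.

```python
from dataclasses import dataclass

@dataclass
class Instance:
    name: str
    owner: str
    idle_hours: float
    always_on: bool = False   # e.g. production services exempt from the policy

IDLE_CUTOFF_HOURS = 8.0

fleet = [
    Instance("dev-sandbox-01", "alice", idle_hours=36.0),
    Instance("ml-train-07", "bob", idle_hours=1.5),
    Instance("prod-api", "ops", idle_hours=400.0, always_on=True),
]

for inst in fleet:
    if not inst.always_on and inst.idle_hours > IDLE_CUTOFF_HOURS:
        print(f"flag for shutdown: {inst.name} (idle {inst.idle_hours:.0f}h, owner: {inst.owner})")
```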

Then-director of the Defense Digital Service, Chris Lynch (at center, in grey hoodie) surrounded by his casually-clad tech geniuses.

Bureaucracy Hackers

The Army CIO, Lt. Gen. Bruce Crawford, is now creating an enterprise cloud management office to get a handle on the devilish and expensive details of moving to the cloud. In the intelligence world, which is particularly enthusiastic about cloud computing, "we've actually stopped a lot of the technical pieces," Garciga said, while "we're hiring more business folks to manage the business of cloud vs. technical [aspects]. That's going to be a bigger challenge."

Pentagon officials often fixate on hiring people who know computers, but they really need to hire people who know their way around the government's own bureaucracy, agreed Don Bitner, chief of infrastructure strategy & development for the Joint Artificial Intelligence Center.

"We need to have policy hackers," Bitner said. "That is a skillset we really need: people that can hack the bureaucracy and get through the red tape, [and] you really only get that skillset through experience."

(It's worth noting the Defense Digital Service was created to bring hackers of both technology and bureaucracy to the Pentagon from the private sector, but their most ambitious project, the JEDI cloud computing initiative, has repeatedly run afoul of legal and procedural problems.)

The Pentagon's plan is to consolidate many, but not all, of its 500-plus cloud contracts into a single Joint Enterprise Defense Infrastructure (JEDI). Note the suggestion that the single pathfinder contract for JEDI might evolve into multiple JEDI contracts.

The lack of experience manifests in different ways at different levels of the hierarchy. Naive generals, admirals, and senior officials all too often try to order up AI, or cloud, or whatever the latest buzzword is, as if it were a side of fries, without thinking through what they really want it to do. But uneducated procurement and contracting officials often see obstacles to high-tech projects, obstacles they could actually bypass if they understood all the options already available to them in acquisition law and regulation.

Special Operations Command, for instance, has a reputation for getting its commandos high tech at high speed and manageable cost, often attributed to SOCOM's special legal authorities. But in fact, said Bitner, "I might have been doing a SOCOM project, but nine times out of 10 we'd use a standard FAR [Federal Acquisition Regulation] warrant to buy our stuff, and we bought it just faster than the services."

"It's because we knew how to work the system in our favor," Bitner said. "You need bureaucrats that are willing to hack the system, not just sit back and tell us [no]."

All too often, Garciga said, "we're delaying operationalizing stuff that I could go build right now in my house, and I could deploy from my laptop, tonight, and we're turning it into a six-month effort." And DoD is suffering this self-inflicted wound, he said, in large part not because of what the regulations actually say, but because of how acquisition officials have gotten used to coloring well within the lines rather than using all the leeway they really have.

Listening to one litany of obstacles, Garciga said, "at almost every single point, my thought was, oh, well, there's already policy that allows you to do that. It's just not being used."

Sometimes, officials just aren't aware of new reforms. About four months ago, Garciga said, the undersecretary for acquisition & sustainment issued new rules for rapid software provisioning. He asked anyone who had heard of the change to raise their hand, then scanned the cavernous conference hall.

"One person. Outstanding," he snarked. "Everybody should be raising their hand."

In many other cases, though, the flexibility has been available for years. "How many people think OTA (Other Transaction Authority, a favorite tool of reformers) is a new thing?" Garciga asked. "It's been around forever. We just weren't using it."

"The policies that are actually in place right now and the law that's in place allow us to do this," Garciga said. "Everybody has the authorities to do this. Let's get that done."
