
Bare Metal Cloud Market Future Aspect Analysis and Current Trends by 2017 to 2025 – Market Research Sheets

Global Bare Metal Cloud Market: Snapshot

As a public cloud service that offers customers the facility to rent hardware resources from a remotely situated service provider, bare metal cloud comes with the primary benefit of flexibility for businesses to meet their specific and diverse requirements. With bare metal cloud services, small and medium enterprises can also troubleshoot their applications without interfering with other nearby virtual machines (VMs). Since bare metal clouds are built on dedicated servers, complications from neighbors are avoided, and they work very well for high-traffic workloads that are intolerant of latency as well as applications pertaining to big data.

Get Free Sample of Research Report @ https://www.tmrresearch.com/sample/sample?flag=B&rep_id=2777

Some of the key factors augmenting demand in the global bare metal cloud market are: the critical need for reliable load balancing in latency-sensitive and data-intensive operations, decommissioning of workloads after termination of service level agreements (SLAs), the absence of noisy neighbors and hypervisor taxes, and the advent of fabric virtualization. On the other hand, restraints such as stringent cloud regulations, the expensive model, hindrances pertaining to restoration, and lightweight hypervisors are keeping the bare metal cloud market from attaining its full potential. That being said, growing usage of big data and DevOps applications, microservices and batch processing applications, and growing interest in the Open Compute Project (OCP) are anticipated to open new opportunities in this market in the near future. Some of the industry verticals generating demand for bare metal cloud are manufacturing, retail, healthcare, IT and telecommunications, government, and BFSI.

Some of the key audiences of this research report are providers of bare metal cloud, application developers, managed service providers, third-party system integrators, bare metal hardware vendors, regulatory agencies, and government. The report provides analytical and figurative assessment of the market's potential during the forecast period of 2017 to 2025.

Global Bare Metal Cloud Market: Overview

Bare metal cloud is an alternative to virtual cloud services and works with the help of a dedicated server. The dedicated server is needed in order to balance and scale the arrangement of this model. However, the dedicated hardware is attributed without including any additional storage. Yet, a bare metal cloud server can support huge workloads. The main goal of bare metal cloud is to minimize the overhead incurred by implementing virtualization technology. Despite eliminating virtualization technology, bare metal cloud services offer efficiency, scalability, and flexibility. Another benefit of bare metal cloud servers is that they do not require any host or recipient and can be deployed with a cloud-like service model. Bare metal cloud combines features of traditional hosting and infrastructure as a service (IaaS) in order to support high-performance workloads. For all these reasons, this market is expected to witness high growth in the years to come.

Global Bare Metal Cloud Market: Key Trends

There is high demand for bare metal cloud from the telecom and IT sectors on account of big data, which results in high demand for effective storage. The advertising sector will also make extensive use of bare metal cloud, a trend which is anticipated to continue throughout the forecast period. Today enterprises are switching from conventional hosting services to bare metal cloud on account of the escalating demand for secure storage facilities as well as advancements in the cloud industry. Bare metal cloud solutions offer numerous benefits such as data security, effective and faster service delivery, efficient data storage, improved performance, streamlined data center operations, and standardized hardware platforms.

Global Bare Metal Cloud Market: Market Potential

The global bare metal cloud market has displayed promising potential as it offers various advantages such as easy maintenance of records, enhanced security, and ability to monitor activities in residential and commercial areas. Bare metal cloud has also found its use and application in providing national security. Because it can help monitor activities, it is enabling countries to fight against terrorism as well as external threats. This is anticipated to create potential growth opportunities within the global bare metal cloud market.

Check Exclusive Discount on this report @ https://www.tmrresearch.com/sample/sample?flag=D&rep_id=2777

Global Bare Metal Cloud Market: Regional Outlook

On the basis of geography, the global bare metal cloud market is segmented into Asia Pacific, North America, Latin America, Europe, and the Middle East and Africa. Of these, North America has been leading the market on account of the increasing focus on research and development in cloud technology. The European bare metal cloud market is also estimated to expand at a fast pace, with key contributions from Germany, Spain, and the UK. However, it is Asia Pacific which is anticipated to expand at the fastest pace during the forecast period on account of the increasing number of new market players. The digicloud initiative undertaken by the government of Singapore to offer IaaS and SaaS, along with the use of bare metal servers, is also an important factor driving the growth of the Asia Pacific bare metal cloud market.

Global Bare Metal Cloud Market: Competitive Landscape

Key players in the market are concentrating on achieving organic growth and implementing various strategies in order to maintain their positions. The report profiles leading players operating in the market: Rackspace Hosting, Inc. (U.S.), CenturyLink, Inc. (U.S.), IBM Corporation (U.S.), Media Temple (U.S.), and Internap Corporation (U.S.).

About TMR Research

TMR Research is a premier provider of customized market research and consulting services to business entities keen on succeeding in today's supercharged economic climate. Armed with an experienced, dedicated, and dynamic team of analysts, we are redefining the way our clients conduct business by providing them with authoritative and trusted research studies in tune with the latest methodologies and market trends.

Contact:

TMR Research,

3739 Balboa St # 1097,

San Francisco, CA 94121

United States

Tel: +1-415-520-1050

This post was originally published on Market Research Sheets

See the article here:
Bare Metal Cloud Market Future Aspect Analysis and Current Trends by 2017 to 2025 - Market Research Sheets

Read More..

Microsoft Stock Is Floating on a Cloud – Investorplace.com

If you look at a multiple-year chart of Microsoft (NASDAQ:MSFT) stock, you'll see a very pretty geometric shape known as a parabola. More accurately, you'll only see the part of the parabola that goes vertically upwards, as the other side of the parabola (the one that goes down) is completely missing.


That's all fine and good, but does it make sense to buy Microsoft stock at this lofty price? If you want reasons, I can give them to you, but the most compelling reason might be Microsoft's progress in cloud computing; this alone could justify ignoring the sky-high MSFT stock price and taking a position anyway.

If you're looking for a company that will pose a threat in the cloud race, you won't find anything much better than Microsoft. In the first fiscal quarter of 2020, Microsoft's Intelligent Cloud revenue advanced by 27%, with the Azure division expanding by an astonishing 59%. That's quite an achievement considering the company's fourth-quarter cloud segment growth was 19% and its fiscal year 2019 cloud division growth was 21%.

You can feast your eyes on those encouraging stats, but it's more important to appreciate how pivotal cloud computing will be in the coming years. Just as Microsoft was a desktop-computing pioneer in the 1980s and 1990s, the company's reminding stakeholders that an old dog can do new tricks as it continues to rival Amazon (NASDAQ:AMZN) in the cloud-computing space.

How big is Microsoft's footprint in this market? Put it this way: the company's Azure platform is currently being used in 54 global regions, while Amazon's AWS cloud-computing platform is only being used in 25. Again, investors only need to look at the numbers and the choice is crystal clear.

The characters in the mob film The Godfather talked about how you're supposed to separate your business from your personal matters, but Amazon founder Jeff Bezos might have forgotten that lesson. At least, he seems to be taking it personally that the Pentagon awarded Microsoft a coveted $10 billion cloud computing contract.

The fact is, the Pentagon needed a provider with hybrid-cloud experience, and Microsoft trumps Amazon in that respect. Microsoft's Azure platform allows for on-premise servers as well as pure cloud computing; during the Pentagon's contract negotiations, surely the government took into consideration that Microsoft has a sizable head start in the hybrid-cloud space.

This head start has clearly paid off, as 95% of Fortune 500 companies are currently using the Azure platform. The flexibility of the hybrid approach is indubitably the differentiator here: as one Fortune 500 client observed, Microsoft "didn't ask us to bend to their vision of a cloud."

As you may have already heard, there have been suggestions that the Pentagon's choosing Microsoft was politically motivated. There's no need to explore that here, but we can say with confidence that the story's not likely over. I fully expect Jeff Bezos to strike back against the government, Microsoft, and any other perceived antagonist.

Whether Microsoft won the Pentagon contract fair and square is immaterial for the time being, though; what matters to investors is that the company's miles ahead and will likely stay that way for the foreseeable future.

Your best strategy as an investor is to avoid the flashy headlines and the accusations and just stick to the facts. In this case, the facts are evident: MSFT stock had a strong year and the cloud had something to do with that. I expect another exciting and prosperous year to await Microsoft shareholders; the share price is high, but the cloud can take it even higher.

As of this writing, David Moadel did not hold a position in any of the aforementioned securities.

See original here:
Microsoft Stock Is Floating on a Cloud - Investorplace.com

Read More..

How Important Is The Intelligent Cloud Segment To Microsoft's Stock? – Forbes

BARCELONA, CATALONIA, SPAIN - 2019/11/19: The Microsoft company logo on the first day of the Smart City Expo World Congress, which exists to empower cities and collectivise urban innovation. More than 400 international experts and 844 global companies joined the congress, held from 19 to 21 November 2019. (Photo by Paco Freire/SOPA Images/LightRocket via Getty Images)

Microsoft's (NASDAQ: MSFT) Intelligent Cloud business is expected to contribute $41.9 billion to Microsoft's 2020 revenues (ending June 2020), representing 31.2% of Microsoft's $290.4 billion in expected revenues for the year. The More Personal Computing and Productivity & Business Processes segments are expected to contribute 35.1% and 33.7% respectively to total revenue, making the relative contribution of all three reporting segments to Microsoft's top line nearly identical. Notably, Microsoft is expected to add $37.8 billion in revenue between 2017 and 2020, out of which the Intelligent Cloud segment is expected to provide $14.5 billion, or 38.3% of the total expected increase. In fact, it is the strong growth in Microsoft's Intelligent Cloud segment revenues that has been primarily responsible for the 200% jump in the tech giant's stock since 2017. While the segment has considerable growth potential, its higher profit margins have unlocked sizable value over recent years. We discuss Microsoft's valuation analysis in full, separately.

Below we discuss Microsoft's business model, followed by sections that review past performance and 2020 expectations for Microsoft's revenue drivers, along with competitive comparisons of its revenue with Apple and Oracle. You can look at our interactive dashboard analysis, Microsoft's Revenues: How Does Microsoft Make Money?, for more details.

Microsoft Business Model

What does Microsoft offer:

Has 3 major Operating Segments:

What Are The Alternatives?

Microsoft competes with:

What Is The Basis of Competition?

Details about how Microsoft revenue changes over recent years compare with competitors Apple and Oracle are available in our interactive dashboard.


Read the rest here:
How Important Is The Intelligent Cloud Segment To Microsoft's Stock? - Forbes

Read More..

Privacy scare leads Wyze to unpair all devices from Google Assistant and Alexa, you’ll need to add them back – Android Police

Smart home appliance maker Wyze has responded to what it calls an "alleged" data breach against its production databases by logging all users out of their accounts and has strengthened security for its servers. Customers endured a lengthy reauthentication process as the company responded to a series of reports claiming that the company stored sensitive information about people's security cameras, local networks, and email addresses in exposed databases.

Texas-based Twelve Security, a self-described "boutique" consulting firm, posted the claim of a breach against Wyze's two Elasticsearch databases on Medium yesterday. The unsecured data is said to have come from 2.4 million users. A plurality of them are located on the east coast of the United States, though data was sourced from across the country as well as in the United Kingdom, the United Arab Emirates, Egypt, and parts of Malaysia.

The dataset included any email addresses that have been registered to or shared access to a camera; the models, firmware versions, and assigned names of every camera in a household; times of devices' last activation; times of users' last login and logout; account login tokens for users' Android and iOS devices; camera access tokens for users' Alexa devices; Wi-Fi SSIDs; and internal subnet layouts. A particular subset of users who provided, or had the service track, their height, weight, gender, bone health, and protein intake information may have had those data exposed as well. Twelve Security also noted that there were "clear indications" that data was being trafficked through Alibaba Cloud servers in China.

Video surveillance news blog IPVM followed up with Twelve Security and was able to spot accounts and devices linked to its staff who reviewed Wyze products.

Twelve Security opted not to notify Wyze before going public with its claims, on suspicion of either gross negligence on the company's part or a concentrated espionage effort, based on the alleged Alibaba Cloud link as well as a previous security blunder in which Alexa users could view camera feeds from devices they had resold to other people (that vulnerability has since been patched).

In a bulletin on its community forums, Wyze stated that it was notified by IPVM late yesterday morning and has failed to verify a breach. It also denied any association with Alibaba Cloud.

The company said it decided, out of caution, to adjust access permissions for its databases and wipe all active login tokens; this also cleared users' Alexa, Google Assistant, and IFTTT integrations. Customers who employed two-factor authentication complained shortly after the token refresh that their login attempts were denied due to various errors. Wyze updated its bulletin late last night to report it had fixed the 2FA login process.

Seattle-based Wyze sells smart plugs, lights, security cameras, and the like at prices well below its competition. It's able to do so by turning to vendors for advanced software features (Xnor.ai recently canceled its contract with Wyze to provide its cameras with subject detection) and by vesting a number of resources, including manufacturing, in China. While we'd like to see more details come along, Twelve and IPVM's reporting to this point may cast doubt, at the very least, on how Wyze handles its resources.

Link:
Privacy scare leads Wyze to unpair all devices from Google Assistant and Alexa, you'll need to add them back - Android Police

Read More..

To protect data and code in the age of hybrid cloud, you can always turn to Intel SGX – The Register

Sponsored: Data and code are the lifeblood of digital organisations, and increasingly these are shared with others in order to achieve specific business goals. As such, data and code must be protected no matter where the workloads run, be they in on-premises data centers, remote cloud servers, or at the edge of the network.

Take medical images processed in the cloud, for example. Their processing must be protected with encryption for security and privacy. Banks need to share insights into financial data without sharing the underlying confidential data with others. Other organisations may want to process data using artificial intelligence and machine learning but keep secret the learning algorithms that turn data into useful analysis.

While encrypting data at rest or in transit is commonplace, encrypting sensitive data while it is actively in-use in memory is the latest, and possibly most challenging, step on the way to a fully encrypted data lifecycle.

One new security model that is growing increasingly popular as a way of protecting data in use is confidential computing. This model uses hardware protections to isolate sensitive data.

Confidential computing changes how code and data are processed at the hardware level and changes the structure of applications. Using the confidential computing model, encrypted data can be processed in the hardware without being exposed to the rest of the system.

A crucial part of that is Intel Software Guard Extensions (Intel SGX). Introduced for client platforms in 2015 and brought to the data center in 2017, it was developed as a means of protecting the confidentiality and integrity of code. It does this by creating encrypted enclaves that help safeguard information and code whilst in use. This year, Intel submitted the SGX software development kit (SDK) to the Linux Foundation's new Confidential Computing Consortium to help secure data in applications and the cloud.

To protect data in use, applications can employ something called Trusted Execution Environments (TEEs) running inside a processor. The fundamental principle here is hardware isolation between the TEE, where only trusted code is executed on selected data, and the host device's operating environment. Within a TEE, data is safely decrypted, processed, and re-encrypted. TEEs also provide for the secure execution of authorised software, known as trusted applications or TAs, and protect the execution of authenticated code.

To keep data safe, TEEs use a secure area of memory and the processor that is isolated from the rest of a system's software stack. Only trusted TAs are allowed to run inside this environment, a restriction that is cryptographically enforced. Applications using a TEE can be divided into a trusted part (the TA) and an untrusted part (the rest of the application, which runs as normal), allowing the developer great control over the exact portions of data needing advanced protections.
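To make that split concrete, here is a minimal Python sketch of the pattern, with the trusted boundary merely simulated in software; the real Intel SGX SDK is C/C++-based and enforces this isolation in hardware, so the function names and the toy workload below are illustrative assumptions rather than the actual API.

```python
from cryptography.fernet import Fernet

# --- "trusted part" (stand-in for the enclave / TA) -------------------------
_ENCLAVE_KEY = Fernet.generate_key()      # real SGX would seal this inside the enclave
_fernet = Fernet(_ENCLAVE_KEY)

def seal(plaintext: bytes) -> bytes:
    """Encrypt data on its way into the protected domain."""
    return _fernet.encrypt(plaintext)

def trusted_process(ciphertext: bytes) -> bytes:
    """Decrypt, process, and re-encrypt; plaintext never leaves this function."""
    plaintext = _fernet.decrypt(ciphertext)
    result = plaintext.upper()            # placeholder for the sensitive workload
    return _fernet.encrypt(result)

# --- "untrusted part" (the rest of the application) -------------------------
if __name__ == "__main__":
    blob = seal(b"patient record 42")
    protected = trusted_process(blob)
    print("untrusted code only ever sees ciphertext:", protected[:20], b"...")
```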

The goal of the Confidential Computing Consortium is to establish common, open-source standards and tools for the development of TEEs and TAs.

This is where Intel has stepped in with Intel SGX. It offers hardware-based memory encryption that isolates specific application code and data in memory. It works by allowing developers to create TEEs in hardware. This application-layer TEE can be used to help protect the confidentiality and integrity of customer data and code while it's processed in the public cloud, encrypt enterprise blockchain payloads, enable machine learning across data sources, significantly scale key management solutions, and much more.

This technology helps minimise the attack surface of applications by setting aside parts of the hardware that are private and that are reserved exclusively for the code and data. This protects against direct assaults on the executing code or the data that are stored in memory.

To achieve this, Intel SGX can put application code and data into hardened enclaves, or trusted execution modules: encrypted memory areas inside an application's address space. Code in the enclave is trusted because it cannot be altered by other apps or malware.

Intel SGX provides a group of security-related instructions built into the company's Intel Core and Xeon processors. Intel provides a software development kit as a foundation for low-level access to the feature set, with higher-level libraries that open it up to other cloud-optimized development languages.

Any number of enclaves can be created to support distributed architectures. Some or all parts of the application can be run inside an enclave.

Code and data are designed to remain encrypted even if the operating system, other applications, or the cloud stack have been compromised by hackers. This data remains safe even if an attacker has full execution control over the platform outside the enclave.

Should an enclave be somehow modified by malicious software, the CPU will detect it and won't load the application. Any attempts to access the enclave memory are denied by the processor, even those made by privileged users. This stops encrypted code and data in the enclave from being exposed.

Where might enterprise developers use Intel SGX? A few specific scenarios spring to mind. Key management is one, with enclaves used in the process of managing cryptographic keys and providing HSM-like functionality. Developers can enhance the privacy of analytics workloads, as Intel SGX lets you isolate the multi-party joint computation of sensitive data. Finally, there are digital wallets, with secure enclaves able to help protect financial payments and transactions. There are more areas, but this is just a sampler.

Intel SGX enables applications to be significantly more secure in today's world of distributed computing because it provides a higher level of isolation and attestation for program code, data, and IP. That's going to be important for a range of applications from machine learning to media streaming, and it means stronger protection for financial data, healthcare information, and user smartphone privacy, whether a workload is running on-prem, in hybrid cloud, or on the periphery of the IoT world.

Sponsored by Intel


View original post here:
To protect data and code in the age of hybrid cloud, you can always turn to Intel SGX - The Register

Read More..

The silver lining in cloud for financial institutions – ITProPortal

A recent report from analyst research firm Aite estimates that the majority of global tier-one financial institutions have less than ten per cent of their total technology stack hosted in a public cloud environment. That is a startling statistic considering that a large institution such as Bank of America, which has been running in the cloud since 2013, has saved $2.1 billion in infrastructure costs. The financial sector has been slow to move to a cloud-based environment. Understandable concerns around data security are often cited as a key reason, but those financial institutions that do take the step towards the cloud can expect to realise a number of huge business benefits.

Cloud migration offers demonstrable business benefits for banks and financial institutions. A Chief Technology Officer at a tier-one global bank explains this well: "The transparency that the cloud offers around costs - getting that understood with folks who are responsible for finance and forecasting is an important piece." The CTO added that cloud-enabled IT departments are able to deliver solutions in a faster timeframe. Furthermore, he states that the IT department can also work on DevSecOps models, giving them a lot more automation around their software development, data quality, and test quality.

The technology that allows systems to run in a cloud environment has been around for a long time. The bigger challenge for CTOs and CIOs is often getting the business approvals to be able to do what they want to do with cloud technology. Organisations should not underestimate the length of time it can take to implement a cloud infrastructure. Each system that's being moved into the cloud needs to be judged on its own merits; there might be legacy systems, or the organisation might already have the existing infrastructure it needs, in which case it makes sense to continue using it for a period of time.

Risk and security are understandably among the biggest challenges faced by banks and financial institutions when migrating towards a cloud environment. Firms need to be assured that, in the cloud, both the institution's and its customers' data are well protected. One of the main challenges financial institutions face is actually moving data to the public cloud. Security teams and regulators have strict requirements that need to be met, such as data location and encryption. These need to be understood and addressed upfront, rather than waiting for the project to be implemented.

"Ensuring that we had the right risk and security controls that are approved and agreed on from an enterprise perspective was essential," said a CTO from a tier-one bank. "With the cloud, like any environment, it's essential to manage the security and risk associated with that environment, which should be a foundational piece of your strategy."

Cloud technology offers all financial institutions and banks the opportunity to have greater control over their workloads and, even with out-of-the-box cloud solutions, they can enjoy greater insights into those workloads and budgets associated with them. In a world of shrinking budgets for both people and resources, the cloud offers a lot of additional tooling at very competitive prices, whether that comes via a third-party or as part of the native toolset.

Other benefits of the cloud include stability and scalability as well as disaster recovery. In the cloud, stability means reliability. Financial firms can rest assured that in the event of a problem in one location, they have business continuity, as other servers in the cloud will scale and act as backup, provided the correct deployment architectures are in place. CTOs and CIOs can also add new functionality without disruption. Financial institutions' reputations rely on 24/7/365 reliability. With the cloud, there is no downtime; if one site goes down, another will pick up the slack, so the business can continue to function in the case of a disaster in one location.

Adopting cloud technologies is a key strategic business decision, and firms looking to start their cloud programs can engage with experts early to help them formulate and implement strategies. This is where managed service providers can really add value as well as help save time, money and resources over the long term. Working with external managed service providers can help to accelerate migration, so that IT departments can start to realise the benefits quickly while focusing on other business-critical tasks.

So, in general, the tide is turning - senior executives are starting to champion the move to the cloud. But the reality is that it's an education exercise. There is no single cloud infrastructure model for financial institutions - every organisation is different. The key thing is that firms adopt a cloud infrastructure that works for them and gives them scope to scale as required.

Colin Sweeney, VP Client Operations, Fenergo

Read the original:
The silver lining in cloud for financial institutions - ITProPortal

Read More..

Listen to the boomers – or cloud could make you go bust – ITProPortal

Much like a millennial showing their baby boomer parent how to use social media, cloud-native companies are often perceived as having the edge over their legacy rivals due to their familiarity with the cloud. This isn't without reason, as the benefits that being born in the cloud has given these companies are clear, but their reliance on the platform could also be their undoing.

This familiarity can lead to overconfidence and, in turn, uncontrollable cloud costs. With more traditional rivals now well up to speed on how to use the cloud to their advantage, cloud-natives need to avoid saying "okay boomer" and falling back purely on the cloud.

There is no doubt that leveraging the cloud has driven advantages for digital businesses. This is borne out by the litany of digital-first firms who have disrupted traditional players in their industries. However, this cutting-edge technology doesn't always support effective cost management. These young upstarts must take decisive action to maintain their edge, or they are in danger of becoming victims of their own success.

The reasons many cloud-natives are facing challenges are multifaceted. With unswerving faith in the cloud, it's highly likely that this group will make continued investments in the technology without effective analysis. Paired with the "move fast and break things" mantra that many young companies have, this can lead to cloud usage and costs escalating out of control, wiping out the business value derived from cloud-based applications.

In some organisations, a cloud-first mentality can result in a lack of accountability and an overall reckless approach to managing the cloud: whoever requires it first gets it first, with zero concern paid to costs. This unintentional spending on cloud is called cloud sprawl.

All too often this situation is fostered because cloud-native players lack both visibility around resources and effective inter-department communication. This means it's impossible to link the business strategy to cloud usage. While digital players must capitalise on the benefits of the cloud, they must also understand its impact on budgets and business goals.

Strategic business targets and the balance sheet have suffered because the goal for many cloud-native businesses has been to scale quickly, unencumbered by the liabilities of their legacy competitors, rather than to be profitable. This isn't inherently a problem, but the rules of business dictate that at some point a firm must turn a profit, which can be challenging for cloud-natives when they have ploughed everything into growth via cloud and struggle to claw spending back.

Spiralling cloud costs are a threat. Cloud-native businesses need to consider how they remove that threat: take a step back, consider the wider business's objectives and how the power of the cloud can be bridled to meet them.

In order to tackle a hubristic approach to the cloud, CIOs and CFOs of cloud-native players need to start with a holistic snapshot of cloud spending. To achieve this, they must leverage data, drive transparent inter-department communication and continually optimise their platforms to eliminate cloud sprawl. In this way they can build an informed strategy anchored in realistic spending.

Running workloads in the cloud can be expensive if not managed properly. This makes it necessary to also understand how using more cloud will impinge on networking, storage, risk management and security expenditure. Analysing costs in this way helps businesses to decipher the total value that is being driven by cloud usage and then link it back to the strategic requirements of the business.

With cloud being carved up among business units, from marketing to IT, a cloud-native player can't develop a total picture of usage or cost. Departments leverage multiple clouds for various requirements, and, over time, this results in increased usage. This happens even when there is no actual demand, on a just-in-case basis.

The regular communication between IT and the other business units still isn't happening: 41 per cent of IT decision makers say that decisions on cloud are made in siloed departments, with either IT or a business unit deciding without communicating with the other. Each department has different needs, and it's down to IT to collaborate with them to make informed decisions.

To manage this issue, cloud-native players should adopt a Single Source of Truth (SSOT). An SSOT involves structuring information models and associated data so every data element can be edited in one place. With this centralised system of record, all cloud data and costs can be viewed transparently and communicated to any part of the company.

Without an SSOT, cloud usage can become split between business units or devolved to different applications and software, or compute power and storage. Again, this creates a situation where it's almost impossible to see what is being used, what is being paid for, and what cloud capacity is required.

One of the main triggers of cloud sprawl is the assumption that the cloud is the solution to every business requirement. Not every business unit should go all in on the cloud. In some instances, on-premise is a better option, because it enables more direct control over workloads. Nearly half (41 per cent) of IT leaders say that on-premise offers more agility in workload control than cloud.

Moreover, on-premise offers greater control over rewriting and refactoring costs. This can be crucial for guaranteeing more efficient operations, especially when benchmarked against the cost of complete cloud migration.

A hybrid approach is seen as striking the right balance by many in the industry. One survey found that a 70/30 split of cloud to on-premise was the perfect balance, enabling specific mission-critical applications to stay put while most of the compute power moves into the public cloud. However, a shift towards hybrid cloud needs to be accompanied by a culture shift throughout the business that enables communication around, and understanding of, hybrid IT.

Once these considerations have been decided on, it's important for cloud-native players to continually optimise. This can be as straightforward as analysing whether an instance would be more efficient if it were managed on a pay-as-you-use cloud model versus a reserved spend, or calculating the value that could be gained from migrating depreciated servers to the cloud, as the comparison sketched below illustrates. Optimisation helps to support improved decision making, but also keeps cloud usage and expense under control.
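As a rough illustration of that kind of analysis, this short Python sketch compares on-demand and reserved pricing for a single instance at different utilisation levels; the hourly rates and the reservation fee are hypothetical placeholders, not any provider's real prices.

```python
# Hypothetical figures: swap in your provider's actual rates before relying on this.
HOURS_PER_YEAR = 8760

on_demand_rate = 0.10       # $ per hour, pay-as-you-use
reserved_rate = 0.06        # $ per hour with a 1-year reservation
reserved_upfront = 150.00   # one-off reservation fee

def annual_cost(utilisation: float) -> tuple[float, float]:
    """Return (on_demand, reserved) yearly cost for a given utilisation (0.0-1.0)."""
    hours_used = HOURS_PER_YEAR * utilisation
    on_demand = hours_used * on_demand_rate
    # A reservation is billed for every hour whether or not the instance is busy.
    reserved = reserved_upfront + HOURS_PER_YEAR * reserved_rate
    return on_demand, reserved

for utilisation in (0.25, 0.50, 0.75, 1.00):
    od, res = annual_cost(utilisation)
    better = "reserved" if res < od else "on-demand"
    print(f"{utilisation:>4.0%} busy: on-demand ${od:8.2f} vs reserved ${res:8.2f} -> {better}")
```

With these illustrative numbers, lightly used instances favour pay-as-you-use while steadily busy ones favour a reservation, which is exactly the kind of breakeven point continual optimisation is meant to find.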

The rocketing cloud usage and costs of the "okay boomer" attitude that some cloud-natives have embodied is leading them into difficult territory, often resulting in cloud sprawl, careless investment and missed business goals. However, this situation can be remedied with a more considered approach and a realisation that the silver lining does not always lie in every cloud.

Henrik Nilsson, Vice President EMEA, Apptio

Follow this link:
Listen to the boomers - or cloud could make you go bust - ITProPortal

Read More..

Server Microprocessor Market Projected to Grow at an Impressive CAGR Of XX% Between 2017 and 2027 – Bulletin Line

According to a new market study, the Server Microprocessor Market is projected to reach a value of ~US$XX in 2019 and grow at a CAGR of ~XX% over the forecast period 2017 to 2027. The presented study ponders the micro- and macro-economic factors that are likely to influence the growth prospects of the Server Microprocessor Market over the assessment period.

The market report throws light on the current trends, market drivers, growth opportunities, and restraints that are likely to influence the dynamics of the Server Microprocessor Market on a global scale. The Five Forces and SWOT analyses included in the report provide a fair idea of how the different players in the Server Microprocessor Market are adapting to the evolving market landscape.

This press release will help you to understand the volume and growth, along with impacting trends. Click HERE to get a SAMPLE PDF (including full TOC, tables & figures) at https://www.futuremarketinsights.co/reports/sample/REP-GB-5216

Analytical insights enclosed in the report:

The report splits the Server Microprocessor Market into different market segments, including region, end use, and application.

The report provides an in-depth analysis of the current trends that are expected to impact the business strategies of key market players operating in the market. Further, the report offers valuable insights related to the promotional, marketing, pricing, and sales strategies of the established companies in the Server Microprocessor Market. The market share, growth prospects, and product portfolio of each market player are evaluated in the report along with relevant tables and figures.

The study aims to address the following questions related to the Server Microprocessor Market:

Get Access To TOC Covering 200+ Topics at https://www.futuremarketinsights.co/toc/REP-GB-5216

Development of workload-specific server microprocessor designs by key players is a growing trend in the global server microprocessor market.

Server Microprocessor Market: Market Dynamics

Expanding cloud infrastructure, coupled with increasing adoption of cloud-based solutions by organizations across various industries, is the prominent factor driving the growth of the global server microprocessor market. Increasing interest in hyper cloud solutions due to the dynamic workloads of organizations, emerging 5G networks, and expanding Internet of Things (IoT) applications accelerates the growth of the global server microprocessor market. A rising focus on exploring a wide range of chip technologies by top internet giants such as Facebook, Google, and Amazon, with the objective of enhancing their artificial intelligence capabilities, also fuels the growth of the global server microprocessor market. An increasing focus on reducing data centre volume, coupled with increasing investment in commercializing quantum computing and the complexity of upgrading server processors, are factors identified as restraints likely to deter the progression of the global server microprocessor market.

Server Microprocessor Market: Market Segmentation

The global server microprocessor market is segmented on the basis of number of cores, operating frequency, and by region.

On the basis of number of cores, the global server microprocessor market is segmented into

Six-core & less

Above six-core

On the basis of operating frequency, the global server microprocessor market is segmented into

1.5 GHz to 1.99 GHz

2.0 GHz to 2.49 GHz

2.5 GHz to 2.99 GHz

3.0 GHz and higher

Regionally, the global server microprocessor market is segmented into

In terms of revenue, the above six-core segment is expected to dominate the global server microprocessor market, due to expanding cloud infrastructure.

Server Microprocessor Market: Regional Outlook

Among all regions, the server microprocessor market in North America is expected to dominate, due to increasing enterprise cloud data volumes. In terms of revenue, Asia Pacific is identified as the fastest-growing server microprocessor market, due to the adoption of software-as-a-service (SaaS)-based business models.

Server Microprocessor Market: Competition Landscape

In July 2017, Intel Corporation, a U.S.-based multinational technology company, launched Xeon Scalable, an energy-efficient server processor, with the objective of expanding its portfolio.

In June 2017, Advanced Micro Devices, Inc., a U.S.-based multinational semiconductor company, launched the EPYC 7000 series, a high-performance processor for the datacentre, with the objective of catering to the increasing demand for lower-energy, high-efficiency server processors.

Prominent players in the global server microprocessor market include Intel Corporation, Advanced Micro Devices (AMD), Inc., Cavium, Qualcomm Technologies, Inc., Applied Micro Circuits Corporation, and Marvell.

The report covers exhaustive analysis on:

Server Microprocessor Market segments

Server Microprocessor Market dynamics

Historical Actual Market Size, 2015 to 2016

Server Microprocessor Market size & forecast 2017 to 2027

Ecosystem analysis

Server Microprocessor Market current trends/issues/challenges

Competition & Companies involved technology

Value Chain

Server Microprocessor Market drivers and restraints

Regional analysis for Server Microprocessor Market includes

The report is a compilation of first-hand information, qualitative and quantitative assessment by industry analysts, inputs from industry experts and industry participants across the value chain. The report provides in-depth analysis of parent market trends, macro-economic indicators and governing factors along with market attractiveness as per segments. The report also maps the qualitative impact of various market factors on market segments and geographies.

Report Highlights:

Detailed overview of parent market

Changing market dynamics in the industry

In-depth market segmentation

Historical, current and projected market size in terms of volume and value

Recent industry trends and developments

Competitive landscape

Strategies of key players and products offered

Potential and niche segments, geographical regions exhibiting promising growth

A neutral perspective on market performance

Must-have information for market players to sustain and enhance their market footprint.

NOTE: All statements of fact, opinion, or analysis expressed in reports are those of the respective analysts. They do not necessarily reflect the formal positions or views of Future Market Insights.

Request a Customized Report As Per Your Requirements at https://www.futuremarketinsights.co/customization-available/REP-GB-5216

Why Opt for FMI?

About Us

Future Market Insights (FMI) is a leading market intelligence and consulting firm. We deliver syndicated research reports, custom research reports and consulting services which are personalized in nature. FMI delivers a complete packaged solution, which combines current market intelligence, statistical anecdotes, technology inputs, valuable growth insights and an aerial view of the competitive framework and future market trends.

Contact Us

Future Market Insights

616 Corporate Way, Suite 2-9018,

Valley Cottage, NY 10989,

United States

T: +1-347-918-3531

F: +1-845-579-5705

T (UK): + 44 (0) 20 7692 8790

Read this article:
Server Microprocessor Market Projected to Grow at an Impressive CAGR Of XX% Between 2017 and 2027 - Bulletin Line

Read More..

Dr. Max Welling on Federated Learning and Bayesian Thinking – Synced

Introduced by Google in 2017, Federated Learning (FL) enables mobile phones to collaboratively learn a shared prediction model while keeping all the training data on the device, decoupling the ability to do machine learning from the need to store the data in the cloud. Two years have passed, and several new research papers have proposed novel systems to boost FL performance. This March, for example, a team of researchers from Google suggested a scalable production system for FL to enable increasing workload and output through the addition of resources such as compute, storage, bandwidth, etc.

Earlier this month, NeurIPS 2019 in Vancouver hosted the workshop Federated Learning for Data Privacy and Confidentiality, where academic researchers and industry practitioners discussed recent and innovative work in FL, open problems, and relevant approaches.

Professor Dr. Max Welling is the research chair in Machine Learning at the University of Amsterdam and VP Technologies at Qualcomm. Welling is known for his research in Bayesian inference, generative modeling, deep learning, variational autoencoders, and graph convolutional networks.

Below are excerpts from the workshop talk Dr. Welling gave on Ingredients for Bayesian, Privacy Preserving, Distributed Learning, where the professor shares his views on FL, the importance of distributed learning, and the Bayesian aspects of the domain.

The question can be separated into two parts. Why do we need distributed or federated inferencing? Maybe that is easier to answer. We need it because of reliability. If you're in a self-driving car, you clearly don't want to rely on a bad connection to the cloud in order to figure out whether you should brake. Latency. If you have your virtual reality glasses on and you have just a little bit of latency, you're not going to have a very good user experience. And then there's, of course, privacy: you don't want your data to get off your device. Also compute, maybe, because it's close to where you are, and personalization: you want models to be suited for you.

It took a little bit more thinking why distributed learning is so important, especially within a company: how are you going to sell something like that? Privacy is the biggest factor here; there are many companies and factories that simply don't want their data to go off site, they don't want to have it go to the cloud. And so you want to do your training in-house. But there's also bandwidth. You know, moving around data is actually very expensive and there's a lot of it. So it's much better to keep the data where it is and move the computation to the data. And also, personalization plays a role.

There are many challenges when you want to do this. The data could be extremely heterogeneous, so you could have a completely different distribution on one device than you have on another device. Also, the data sizes could be very different. One device could contain 10 times more data than another device. And the compute could be heterogeneous: you could have small devices with a little bit of compute that you can only use now and then, or can't use because the battery is down. There are also bigger servers that you want to have in your distribution of compute devices.

The bandwidth is limited, so you don't want to send huge amounts of even parameters. Let's say we don't move data, but we move parameters. Even then you don't want to move loads and loads of parameters over the channel. So you want to maybe quantize them, and this is where I believe Bayesian thinking is going to be very helpful. And again, the data needs to be private, so you wouldn't want to send parameters that contain a lot of information about the data.

So first of all, of course, we're going to move model parameters, we're not going to move data. We have data stored at places and we're going to move the algorithm to that data. So basically you get your learning update, maybe privatized, and then you move it back to your central place where you're going to update it. And of course, bandwidth is another challenge that you have to solve.

We have these heterogeneous data sources and we have a lot of variability in the speed at which we can sync these updates. Here I think the Bayesian paradigm is going to come in handy because, for instance, if you have been running an update on a very large dataset, you can shrink your posterior over the parameters to a very narrow distribution. Whereas on another device, you might have much less data, and you might have a very wide posterior distribution for those parameters. Now, how to combine that? You shouldn't average them, that's silly. You should do a proper posterior update where the one with a small, peaked posterior gets a lot more weight than the one with a very wide posterior. Uncertainty estimates are also important in that respect.
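A minimal sketch of that idea, assuming each device reports a Gaussian posterior over the same scalar parameter: combining them by precision weighting (the normalised product of Gaussians) automatically gives the narrow, data-rich posterior more influence than a simple average would.

```python
import numpy as np

def fuse_gaussian_posteriors(means, variances):
    """Combine per-device Gaussian posteriors by precision weighting.

    Each device i reports N(mean_i, var_i); the fused estimate is the
    normalised product of those Gaussians, so narrow posteriors dominate.
    """
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    fused_var = 1.0 / precisions.sum()
    fused_mean = fused_var * (precisions * means).sum()
    return fused_mean, fused_var

# Device A saw lots of data (narrow posterior), device B very little (wide posterior).
mean, var = fuse_gaussian_posteriors(means=[0.9, 0.1], variances=[0.01, 1.0])
print(f"fused mean={mean:.3f}, variance={var:.4f}")    # stays close to device A's estimate
print("naive average would have given", np.mean([0.9, 0.1]))
```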

The other thing is that with a Bayesian update, if you have a very wide posterior distribution, then you know that parameter is not going to be very important for making predictions. And so if you're going to send that parameter over a channel, you will have to quantize it, especially to save bandwidth. The ones that are very uncertain anyway you can quantize at a very coarse level, and the ones which have a very peaked posterior need to be encoded very precisely, and so you need much higher resolution for those. So there, too, the Bayesian paradigm is going to be helpful.
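One simple way to turn that intuition into code, purely as an illustrative sketch: allocate fewer quantization bits to parameters whose posterior standard deviation is large, and more bits to sharply peaked ones. The bit budget and the log-scale mapping below are arbitrary choices for the illustration, not anything prescribed in the talk.

```python
import numpy as np

def bits_for_parameters(posterior_std, min_bits=2, max_bits=8):
    """Assign a per-parameter bit width: wide posteriors get coarse quantization."""
    std = np.asarray(posterior_std, dtype=float)
    # Map log-uncertainty onto [min_bits, max_bits]: small std -> many bits.
    score = (np.log(std.max()) - np.log(std)) / (np.log(std.max()) - np.log(std.min()) + 1e-12)
    return np.round(min_bits + score * (max_bits - min_bits)).astype(int)

def quantize(values, bits):
    """Uniformly quantize each value in [-1, 1] with its own bit width."""
    levels = 2 ** bits - 1
    return np.round((np.clip(values, -1, 1) + 1) / 2 * levels) / levels * 2 - 1

stds = np.array([0.01, 0.05, 0.5, 1.0])        # per-parameter posterior std devs
weights = np.array([0.81, -0.33, 0.05, 0.02])  # posterior means to be transmitted
bits = bits_for_parameters(stds)
print("bits per parameter:", bits)             # more bits where the posterior is sharp
print("quantized update:  ", quantize(weights, bits))
```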

In terms of privacy, there is this interesting result that if you have an uncertain parameter and you draw a sample from that posterior, then that single sample is more private than providing the whole distribution. There are results that show that you can get a certain level of differential privacy by just drawing a single sample from that posterior distribution. So effectively you're adding noise to your parameter, making it more private. Again, Bayesian thinking is synergistic with this sort of Bayesian federated learning scenario.
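The sketch below shows the mechanics only: releasing one draw from a Gaussian posterior instead of its mean is equivalent to adding Gaussian noise scaled by the posterior standard deviation. The formal differential-privacy guarantee depends on conditions from the "one posterior sample" results Welling alludes to, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

posterior_mean = 0.42    # what we would normally send to the server
posterior_std = 0.15     # wider posterior -> noisier, more private release

# Release a single posterior sample instead of the exact mean.
released_value = rng.normal(posterior_mean, posterior_std)

print("exact posterior mean :", posterior_mean)
print("released (one sample):", round(released_value, 3))
# Mechanically this is mean + N(0, std^2) noise, which is why one sample
# leaks less about the local data than the full distribution would.
```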

We can do MCMC (Markov chain Monte Carlo) and variational-based distributed learning. And there are advantages to doing that, because it makes the updates more principled and you can combine things where one of them might be based on a lot more data than another.

Then we have private and Bayesian: privatizing the updates of a variational Bayesian model. Many people have worked on many other of these intersections, so we have deep learning models which have been privatized, and we have quantization, which is important if you want to send your parameters over a noisy channel. And it's nice because the more you quantize, the more private things become. You can compute the level of quantization from your Bayesian posterior, so all these things are very nicely tied together.

People have looked at the relation between quantized models and Bayesian models: how can you use Bayesian estimates to quantize better? People have looked at quantized versus deep: to make your deep neural network run faster on a mobile phone, you want to quantize it. People have looked at distributed versus deep, distributed deep learning. So many of these intersections have actually been researched, but it hasn't been put together. This is what I want to call for. We can try to put these things together, and at the core of all of this is Bayesian thinking; we can use it to execute better on this program.

Journalist: Fangyu Cai | Editor: Michael Sarazen


Read this article:
Dr. Max Welling on Federated Learning and Bayesian Thinking - Synced

Read More..

7 crackpot technologies that might transform IT – CIO East Africa

Innovation is the cornerstone of technology. In IT, if you're not experimenting with a steady stream of emerging technologies, you risk disruption. Moreover, you can find yourself challenged when it comes to luring top talent and keeping ahead of competitors.

But knowing which bets to place when it comes to adopting emerging technologies can seem impossible. After all, most fizzle out, and even those that do prove worthwhile often fall a little short of their hyped potential. Plus, most of what has recently been considered cutting edge, such as artificial intelligence and machine learning, is already finding its way into production systems. You have to look far ahead sometimes to anticipate the next wave coming. And the farther out you look, the riskier the bets become.

Still, sometimes a great leap forward is worth considering. In that light, here are seven next-horizon ideas, emerging along the fringe, that might prove to be crackpot or a savvy play for business value. It all depends on your perspective. William Gibson used to say that the future is already here, it's just not evenly distributed yet. These ideas may be too insane for your team to try, or they may be just the right thing for moving forward.

Of all the out-there technologies, nothing gets more press than quantum computers and nothing is spookier. The work is done by a mixture of physicists and computer scientists fiddling with strange devices at super-cold temperatures. If it requires liquid nitrogen and lab coats, well, it's got to be innovation.

The potential is huge, at least in theory. The machines can work through bazillions of combinations in an instant, delivering exactly the right answer to a mathematical version of Tetris. It would take millions of years of cloud computing time to find the same combination.

Cynics, though, point out that 99 percent of the work that we need to do can be accomplished by standard databases with good indices. There are few real needs to look for strange combinations, and if there are, we can often find perfectly acceptable approximations in a reasonable amount of time.

The cynics, though, are still looking at life through old glasses. We've only tackled the problems that we can solve with old tools. If you've got something that your programmers say is impossible, perhaps trying out IBM's Q Experience quantum cloud service may be just the right move. Microsoft has also launched Azure Quantum for experimentation. AWS is following suit with Braket as well.
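To get a feel for what that experimentation looks like, here is a minimal sketch using IBM's open-source Qiskit library (the toolkit behind its Q Experience service); it builds a two-qubit Bell-state circuit and inspects the resulting probabilities locally, without touching any cloud hardware. Exact APIs vary between Qiskit releases, so treat this as indicative rather than definitive.

```python
# pip install qiskit -- a small local experiment, no quantum hardware required.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Build a Bell state: Hadamard on qubit 0, then CNOT entangling qubits 0 and 1.
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())   # expect roughly {'00': 0.5, '11': 0.5}
```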

Potential first adopters: Domains where the answer lies in the search for an exponentially growing combination of hundreds of different options.

Chance of happening in the next five years: Low. Google and IBM are warring with press releases. Your team will spend many millions just to get to the press release stage.

Many of the headlines continue to focus on the dramatic rise and fall in the value of bitcoin, but in the background developers have created dozens of different approaches to creating blockchains for immortalizing complex transactions and digital contracts. Folding this functionality into your data preservation hierarchy can bring much needed assurance and certainty into the process.

The biggest challenge may be making decisions about the various philosophical approaches. Do you want to rely on proof of work or some looser consensus that evolves from a trusted circle? Do you want to fret over elaborate Turing-complete digital contracts or just record transactions in a shared, trustworthy ledger? Sometimes a simple API that offers timely updates is enough to keep partners synchronized. A few digital signatures that guarantee database transactions may just be enough. There are many options.
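Whichever consensus philosophy you settle on, the shared-ledger core underneath is the same data structure: records chained together by hashes, so tampering with any entry breaks every block after it. A minimal Python sketch of that core (no proof of work, no signatures, purely illustrative) might look like this:

```python
import hashlib
import json
import time

def make_block(record: dict, prev_hash: str) -> dict:
    """Append-only ledger entry: each block commits to the previous block's hash."""
    block = {"timestamp": time.time(), "record": record, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify(chain: list[dict]) -> bool:
    """Recompute every hash; any edited record or broken link fails verification."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block({"shipment": "A123", "qty": 40}, prev_hash="genesis")]
chain.append(make_block({"shipment": "A124", "qty": 12}, prev_hash=chain[-1]["hash"]))

print(verify(chain))                 # True
chain[0]["record"]["qty"] = 400      # a partner quietly edits history...
print(verify(chain))                 # False -- the tampering is detectable
```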

Potential first adopters: Industries with tight, synchronized operations between businesses that don't want to trust each other but must. These frenemies can use a shared blockchain database to eliminate some of the disputes before they happen.

Potential for success in five years: High. There are dozens of active prototypes already running and early adopters can dive in.

For the past few decades, the internet has been the answer to most communications problems. Just hand the bits to the internet and they'll get there. It's a good solution that works most of the time, but sometimes it can be fragile and, when cellular networks are involved, fairly expensive.

Some hackers have been moving off the grid by creating their own ad hoc networks using the radio electronics that are already in most laptops and phones. The Bluetooth code will link up with other devices nearby and move data without asking "mother may I" of some central network.

Enthusiasts dream of creating elaborate local mesh networks built out of nodes that pass along packets of bits until they reach the right corner of the network. Ham radio hobbyists have been doing it for years.
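As a toy illustration of that store-and-forward idea (not any particular mesh protocol), the sketch below floods a message across a small simulated node graph, with each node relaying only to its immediate neighbours until the destination is reached. The node names are invented for the example.

```python
from collections import deque

# A tiny simulated mesh: each node only knows its immediate radio neighbours.
neighbours = {
    "stage":      ["food-truck", "gate"],
    "food-truck": ["stage", "campsite"],
    "gate":       ["stage", "campsite"],
    "campsite":   ["food-truck", "gate", "medic"],
    "medic":      ["campsite"],
}

def flood(source: str, dest: str) -> list[str]:
    """Relay hop by hop (breadth-first) and return one delivery path."""
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == dest:
            return path
        for nxt in neighbours[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return []  # unreachable: the mesh is partitioned

print(flood("stage", "medic"))   # e.g. ['stage', 'food-truck', 'campsite', 'medic']
```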

Potential early adopters: Highly localized applications that group people near each other. Music festivals, conferences, and sporting events are just some of the obvious choices.

Potential for success in five years: High. There are several good projects and many open source experiments already running.

If the buzzwords "green" and "artificial intelligence" are good on their own, why not join the two and double the fun? The reality is a bit simpler than doubling the hype might suggest. AI algorithms require computational power, and at some point computational power is proportional to electrical power. The ratio keeps getting better, but AIs can be expensive to run. And the electrical power produces tons of carbon dioxide.

There are two strategies for solving this. One is to buy power from renewable energy sources, a solution that works in some parts of the world with easy access to hydro-electric dams, solar farms or wind turbines.

The other approach is to just use less electricity, a strategy that can work if questions arise about the green power. (Are the windmills killing birds? Are the dams killing fish?) Instead of asking the algorithm designers to find the most awesome algorithms, just ask them to find the simplest functions that come close enough. Then ask them to optimize this approximation to put the smallest load on the most basic computers. In other words, stop dreaming of mixing together a million-layer algorithm trained on a dataset with billions of examples and start constructing solutions that use less electricity.

The real secret force behind this drive is alignment between the bean counters and the environmentalists. Simpler computations cost less money and use less electricity, which means less stress on the environment.

Potential early adopters: Casual AI applications that may not support expensive algorithms.

Potential for success in five years: High. Saving money is an easy incentive to understand.

The world has been stuck on the old QWERTY keyboards since someone designed them to keep typewriters from jamming. We don't need to worry about those issues anymore. Some people have imagined rearranging the keys and putting the common letters in the most convenient and fastest locations. The Dvorak keyboard is just one example, and it has some fans who will teach you how to use it.

A more elaborate option is to combine multiple keys to spell out entire words or common combinations. This is what court reporters use to keep accurate transcripts, and just to pass the qualifying exam, new reporters must be able to transcribe more than 200 words per minute. Good transcriptionists are said to be able to handle 300 words per minute.

One project, Plover, is building tools for converting regular computers to work like stenotypes. If it catches on, there could be an explosion in creative expression. Don't focus on the proliferation of inter-office memos and fine print.

Potential first adopters: Novelists, writers, and social media addicts.

Potential for success in five years: Medium. Two-finger typing is a challenge for many.

Wait, weren't we supposed to be rushing to move everything to the cloud? When did the pendulum change directions? When some businesses started looking at the monthly bill filled with thousands of line entries. All of those pennies per hour add up.

The cloud is an ideal option for sharing resources, especially for work that is intermittent. If your load varies dramatically, turning to the public cloud for big bursts in computation makes plenty of sense. But if your load is fairly consistent, bringing the resources back under your roof can reduce costs and remove any worries about what happens to your data when it's floating around out in the ether.

The major clouds are embracing solutions that offer hybrid options for moving data back on premises. Some desktop boxes come configured as private cloud servers ready to start up virtual machines and containers. And AWS recently announced Outposts, fully managed compute and storage racks that are built with the same hardware Amazon uses in its datacenters, run the same workloads, and are managed with the same APIs.

Potential first adopters: Shops with predictable loads and special needs for security.

Potential for success in five years: High. Some are already shifting load back on premises.

The weak spot in the world of encryption has been using the data. Keeping information locked up with a pretty secure encryption algorithm has been simple. The standard algorithms (AES, SHA, DH) have withstood sustained assault from mathematicians and hackers for some years. The trouble is that if you want to do something with the data, you need to unscramble it, and that leaves it sitting in memory where it's prey to anyone who can sneak through any garden-variety hole.

The idea with homomorphic encryption is to redesign the computational algorithms so they work with encrypted values. If the data isn't unscrambled, it can't leak out. There's plenty of active research that's produced algorithms with varying degrees of utility. Some basic algorithms can accomplish simple tasks such as looking up records in a table. More complicated general arithmetic is trickier, and the algorithms are so complex they can take years to perform simple addition and subtraction. If your computation is simple, you might find that it's safer and simpler to work with encrypted data.
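At the simpler end of that spectrum, partially homomorphic schemes are already practical. The sketch below assumes the open-source python-paillier package (installed as `phe`), which supports additive homomorphism: it adds two encrypted salaries without ever decrypting them, and only the private key holder can read the total.

```python
from phe import paillier  # python-paillier: additively homomorphic encryption

public_key, private_key = paillier.generate_paillier_keypair()

# Two parties encrypt their figures under the shared public key.
enc_a = public_key.encrypt(52000)
enc_b = public_key.encrypt(61000)

# Anyone can add the ciphertexts (and scale by plaintext constants)...
enc_total = enc_a + enc_b
enc_average = enc_total * 0.5

# ...but only the private key holder can see the results.
print(private_key.decrypt(enc_total))    # 113000
print(private_key.decrypt(enc_average))  # 56500.0
```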

Potential first adopters: Medical researchers, financial institutions, data-rich industries that must guard privacy.

Potential for success in five years: Varies. Some basic algorithms are used commonly to shield data. Elaborate computations are still too slow.

View post:
7 crackpot technologies that might transform IT - CIO East Africa

Read More..