
Windows Server and the future of file servers in the cloud computing world – TechRepublic

We still run our businesses on files. How is Microsoft upgrading Windows Server to use files in a hybrid world?

We do a lot with servers today -- much more than the age-old file and print services that once formed the backbone of business. Now servers run line-of-business applications, host virtual machines, support collaboration, provide telephony services, manage internet presence. It's a list that goes on and on -- and too often we forget that they're still managing and hosting files.

There are occasional reminders of Windows as a file server, with Microsoft finally deprecating the aging SMB 1 file protocol and turning it off in Windows 10. It was a change that forced system administrators to confront insecure connections and the applications that were still using them. There's an added problem: many legacy file servers are still running the now-unsupported Windows Server 2008 R2.

Microsoft hasn't forgotten the Windows File Server and the services that support it. There's still a lot of work going into the platform, using it as a bridge between on-premise storage and the growing importance of cloud-scale storage in platforms like Azure. New hardware is having an effect, with technologies like Optane blurring the distinction between storage and memory and providing a new fast layer of storage that outperforms flash.

As much as organizations use tools like Teams and Slack, and host documents in services like SharePoint and OneDrive, we still run our businesses on files. We might not be using a common shared drive for all those files anymore, but we're still using those files and we still need servers to help manage them. Windows Server's recent updates have added features intended to help modernize your storage systems, building on key technologies including Storage Replica and new tools to build and run scale-out file servers.

Much of Microsoft's thinking around modern file systems is focused on hybrid storage scenarios, bridging on-premise and cloud services. It's a pragmatic choice: on-premise storage can benefit from cloud lessons, while techniques developed for new storage hardware on-premise can be used in the cloud as new hardware rolls out. That leads to a simple process for modernizing file systems, giving you a set of steps to follow when updating Windows Server and rolling out new storage hardware. In a presentation at Ignite 2019, Ned Pyle, principal program manager on the Windows Server team, breaks it down into four steps: Learn/Inventory, Migrate/Deploy, Secure, and Future.

You can manage multiple server migrations (to newer hardware or VMs) from the Windows Admin Center interface.

Image: Microsoft

The latest version of SMB, SMB 3.1.1, adds new security features to reduce the risks to your files. It improves encryption and adds protection from man-in-the-middle attacks. It's a good idea to migrate much of your file system traffic over to it, removing NTLM and SMB 1 from your network.

You shouldn't forget Microsoft's alternate file system technology, ReFS. Offering files of up to 4TB, it can use its integrity stream option to validate data integrity, as well as supporting file-system-level data deduplication. You can get significant data savings with ReFS as part of Windows Server's Storage Spaces.

Microsoft now offers a Storage Migration Service to help manage server upgrades. As well as supporting migrations from on-premise to Azure, it can help bring files from older Windows Server versions to Windows Server 2019 and its newer file system tools and services. It will map storage networks, copy data, and ensure file security and validity before obfuscating the old endpoints and cutting over to the new.

Part of the future for Windows Server's file protocols is an implementation of SMB over the QUIC protocol, using UDP. It's designed to be spoofing resistant, using TLS 1.3 over port 443. Microsoft is working on adding SMB compression to file traffic, reducing payload size and offering improved performance in congested networks and over low-bandwidth connections.

One option for building a hybrid file system is using Azure Files. On-premise systems can connect over a VPN, using either NFS or SMB 3.0, to work with what looks like a familiar share, except that it's hosted on Azure. If you're not using a VPN you still have secure connectivity options, with SMB 3.0 over port 445 or the Azure File Sync REST API over SSL. All you need is the Windows network name of the share, and you use it the same way you'd use any Windows Server share locally.
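
If you are scripting against the REST path rather than mounting the share, the sketch below shows roughly what that looks like. It is a minimal example assuming the azure-storage-file-share Python SDK; the connection string, share name and file path are placeholders, not values from the article.

```python
# Minimal sketch: read a file from an Azure file share over the HTTPS/REST
# path (no VPN, no SMB mount), using the azure-storage-file-share SDK.
# Connection string, share name and file path below are placeholders.
from azure.storage.fileshare import ShareFileClient

conn_str = (
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"
    "EndpointSuffix=core.windows.net"
)

file_client = ShareFileClient.from_connection_string(
    conn_str,
    share_name="finance-share",   # hypothetical share name
    file_path="reports/q4.xlsx",  # hypothetical file path
)

with open("q4.xlsx", "wb") as local_file:
    stream = file_client.download_file()  # issues REST calls over TLS
    local_file.write(stream.readall())
```

The same share remains reachable by SMB from a domain-joined machine, which is what makes the hybrid scenario work: scripts, VMs in Azure and on-premise users all see one set of files.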

Those Azure file shares aren't only for on-premise data; they're accessible using the same protocols inside Azure. With data now a hybrid resource, you can use Azure for scalable compute and analytics, or for gathering and sharing IoT analytics with on-premise applications, or as a disaster recovery location that's accessible from anywhere in the world. There's no change to your servers, or the way you work, only to where that data is stored. With Azure storage able to take advantage of its economies of scale, you can expand those shares as needed, without having to invest in physical storage infrastructure.

SEE: Windows 10: A cheat sheet (TechRepublic)

There's certainly a lot of capacity in Azure file shares: over 100TB of storage per share, with 10,000 IOPS in standard drives (which can be 10 times faster if you pay for premium services). There's support for Azure Active Directory, so you can apply the same access control rules as in your on-premise systems. Ignite 2019 saw Microsoft add support for NFS shares, as well as increasing the maximum file size to 4TB, and adding support for Azure Backup. To simplify things further, Azure File Shares can be managed through Windows Admin Center.

Perhaps the most important recent change is the shift to workload-optimized service tiers. By picking a plan that's closest to your needs you can be sure that you're not paying for features you don't want. At one end of the scale is high I/O and throughput, with Premium storage on SSDs, while at the other archival storage on Cool disks with slow startup times keeps costs to a minimum.

Users will be able to access these Azure-hosted file shares as if they're a Windows Server file share, allowing you to begin phasing out local file servers and reduce the size of the attack surface on your local systems. Attackers will not be able to use the file system as a route into line-of-business servers, or as a vector for privilege escalation. Domain-joined Azure file shares will be accessible via SMB 3.0 over VPN connections or via ExpressRoute high-speed dedicated links to Azure.

A modern file server architecture will mix on-premise and cloud. Tiering to Azure makes sense, as it gives you business continuity as well as providing an extensible file system that no longer depends on having physical hardware in your data center. You're not constrained by space or power and can take advantage of it when it's needed.

Similarly, moving traffic to SMB 3.1.1 and using Windows Admin Center will improve performance and give you a future-proof management console that will work for both on-premise and in-cloud storage resources. Putting it all together, Microsoft is delivering a hybrid filesystem solution that you really should be investigating.


Government proposal to put police child abuse image database on the cloud raises hacking fears – Telegraph.co.uk

The Government is considering putting a police cache of tens of millions of child abuse images onto Amazon's cloud network, in a move privacy advocates warned would introduce new risks for the highly sensitive data-set.

Documents seen by The Telegraph show the Home Office has launched a study into uploading the "child abuse image database" onto the cloud. The database was set up in 2014 and is comprised of millions of images and videos seized during previous operations.

Up until now, the images have only been accessible within police premises given they have been deemed "incredibly sensitive".

However, according to study documents, there have been a "number of limitations and concerns" in having the data-set only accessible on physical sites.

The Home Office is looking into what the challenges would be in creating a copy of the images and then putting them on a cloud server, the documents said.

Such a move would likely prompt concerns over whether the database would be at higher risk of being stolen by criminals, given that previously physical access has had to be granted.

A report released last year by cyber security firm Palo Alto Networks suggested there were tens of millions of vulnerabilities across cloud server providers, with these at risk of being exploited by hackers to gain access to uploaded material, although that same report said the fault did not lie with the cloud providers themselves, but with the way their systems were used.

The Home Office appears to have held initial conversations with Amazon Web Services, the company's cloud arm, over the database.

Amazon's cloud servers already play host to some of the most sensitive police data, with the company having been chosen as a supplier for the recently established police super-database, which combined criminal conviction records with intelligence information. The company is among the biggest investors in security and compliance.

It is thought the feasibility study is being conducted as a "fact-finding" exercise, and that there are currently no plans to upload the images.

However, in the documents, it said such a move would bring more flexibility, as currently police cannot access the data remotely. The Home Office declined to comment.

Privacy advocates raised concerns over whether images of child abuse required even more protection, given they would be a "high value target".

A spokeswoman for Privacy International said the move would remove physical access controls and introduce a "different set of risks to what is a highly sensitive database".

"As the Home Office increasingly turns to cloud providers to hold sensitive data which would constitute a high value target, the public needs a great deal of reassurance."

The group urged for a consultation to be held with children's charities and those with technical expertise into whether the risks outweighed the benefits.

"Some of the justifications for such a move include a desire to facilitate remote access to the database and permit 'innovation activity'. This indicates that a broadening of access to a greater number of individuals outside the police, which is a clear cause for concern," the spokeswoman said.


Which [r]evolution to expect for cloud computing in 2020? – Data Economy

From a small revolution at first, cloud computing has in recent years evolved into a key strategic development driver not only for businesses but also for governments; in short, for society at large. Whether with the launch of new players or the emergence of the first open source cloud platform, the past decade has been marked by outstanding innovations that have forever transformed the use of IT, both by and for businesses. At the core of these innovations has been data, the new black gold without which future innovations would be compromised.

The new year 2020 heralds the start of a new era in which businesses intend to play a more instrumental role in the cloud offering at hand, including business models, in order to always make the most of their data. Even though it is difficult if not impossible to imagine what the next decade will be like, we can, however, anticipate some trends for 2020:

Hybrid cloud will continue to appeal to businesses

Business appetite for hybrid cloud grew significantly in 2019. The challenges faced by businesses in terms of new skills, new application needs, legacy IT management and so on are constantly increasing as businesses realize that cloud computing is no panacea.

What is at stake is the significant cost associated with the extensive use of public cloud services and the ever more critical need for data control and security. Against this backdrop, businesses are turning away from exclusive public cloud offerings to move part of their data back to a private cloud.

On the other hand, they are abandoning on-premises cloud computing in favor of a hosted private cloud service that combines the best of both worlds: greater cost control and a higher level of security, all with the elasticity and scalability of the cloud.

According to the Nutanix Enterprise Cloud Index, 92% of IT decision makers say this type of infrastructure best meets their needs.

and prefigures the advent of multi-cloud

Following in the footsteps of hybrid cloud but going one step further, there comes multi-cloud: a combination of cloud environments ranging from on-premises cloud to hosted private cloud to public cloud, each dedicated to different use cases.

Given that no single cloud today can competitively provide for all solutions, the most mature businesses find in multi-cloud the promise of excellence: selecting the best solutions from the entire cloud offering available to build a single application environment in which all components are interdependent.

A business can choose to host its database with one provider, turn to another provider for its compute needs, store its data in yet another location, and orchestrate everything in a multi-cloud architecture.
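
As a hedged illustration of that kind of composition, the sketch below reads an object from one provider's storage and copies it to another's from a single application. It assumes the boto3 and azure-storage-blob SDKs; the bucket, container and credential values are placeholders, not a recommendation of any particular pairing.

```python
# Minimal multi-cloud sketch: read an object from AWS S3 and copy it to
# Azure Blob Storage from one application. Bucket, container and
# credential values are placeholders.
import boto3
from azure.storage.blob import BlobClient

# Pull the source object from S3 (credentials resolved by boto3's usual chain).
s3 = boto3.client("s3")
body = s3.get_object(Bucket="analytics-input", Key="events.parquet")["Body"].read()

# Push the same bytes to a container hosted by a second provider.
blob = BlobClient.from_connection_string(
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...",
    container_name="landing",
    blob_name="events.parquet",
)
blob.upload_blob(body, overwrite=True)
```

In practice an orchestration layer, rather than hand-written copy scripts, decides where each workload and dataset lives, but the principle is the same: standard APIs make the providers interchangeable components.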

As applications become less and less monolithic and their components communicate in an increasingly standardized way, it is a safe bet that multi-cloud has a bright future ahead of it.

Increased investment will be made in orchestration and monitoring

While 2019 saw a decline in cloud budgets due to the ongoing consolidation, 2020 is expected to see investment pick up, increasing at a rate of 6.9% per year to $90.9 billion by 2023 according to IDC forecasts, in both public and private cloud. Initially seen as peripheral solutions, automation capabilities were the first to develop with the rise of containerization, providing simpler and faster portability.

Now has come the time for orchestration capabilities, which are at the heart of business concerns in order to better control traffic and align costs with actual needs and usage. First of all, Kubernetes is becoming the default orchestration technology. The aim here is to have applications that are capable of communicating natively and requesting resources in real time, deploying volumes and orchestrating everything based on application needs, the current contract and/or the parameters provided by the operator.
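
To make the "requesting resources in real time, deploying volumes" point concrete, here is a minimal sketch using the official kubernetes Python client to ask the orchestrator for storage on demand. The claim name, namespace and size are placeholders.

```python
# Minimal sketch: an application (or operator) requesting storage on demand
# through the Kubernetes API, using the official `kubernetes` Python client.
# Claim name, namespace and size are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod
core = client.CoreV1Api()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="app-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        resources=client.V1ResourceRequirements(requests={"storage": "20Gi"}),
    ),
)

# The orchestrator provisions a volume matching the request and binds it.
core.create_namespaced_persistent_volume_claim(namespace="default", body=pvc)
```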

On the private cloud side, investment associated with data monitoring and observability capabilities will increase, allowing a more detailed understanding of infrastructure activity, in particular thanks to Machine Learning and AI applications.

Security will remain a strategic issue

Previously, data security solutions focused on storage or networking capabilities. For example, if you wanted to store encryption keys securely, you had to rely on an HSM (Hardware Security Module), a monolithic solution that was poorly aligned with the cloud concept.

The ability to secure data in use, called Confidential Computing, is a big leap forward. More processors will embed this capability, which will therefore be increasingly available in infrastructures.

OVHcloud was one of the first to offer bare-metal servers with this Confidential Computing capability built in, integrating it and providing APIs to lease such servers. For instance, we already have partners who use these servers to offer key management features.

These servers, therefore, now make it possible to store and run all or part of software programs that require end-to-end security, thus greatly improving the security of data encryption and, in turn, of entire systems. Data encryption will be more readily available, whether for data in transit or at rest, to enhance data security.
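
As a simple, hedged illustration of data-at-rest encryption (not of Confidential Computing itself, which protects data while in use at the processor level), the sketch below uses the Python cryptography package. In the setup described above, the key would come from a hardware-backed key-management service rather than being generated in application code.

```python
# Minimal sketch of data-at-rest encryption with a symmetric key, using the
# `cryptography` package. Key handling is illustrative only: in the scenario
# described above, the key would live in a KMS/HSM, not in application code.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice: fetched from a KMS/HSM
cipher = Fernet(key)

plaintext = b"customer record 42"
token = cipher.encrypt(plaintext)  # ciphertext stored at rest
assert cipher.decrypt(token) == plaintext
```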

Locally-based cloud services will be increasingly in demand for legal reasons

With the introduction of data protection regulations and increased public awareness of this issue, businesses have realized the strategic nature of data sovereignty for themselves.

The issue of the legal framework for data goes beyond the scope of cloud providers alone and also affects businesses that use cloud solutions. Local initiatives are multiplying to set the rules for a trusted cloud which meets everyone's expectations in terms of data sovereignty.

Taking the recent French-German Gaia-X project as an example, it would not be surprising if, in 2020, private as well as public organizations were to favor their regional ecosystem in the face of the American-Chinese duopoly. We should see the development of new collaborative projects allowing the implementation of more local alternatives, made possible by a collective awareness among European vendors of their ability to provide a relevant cloud offering.

Many other topics could have been addressed here, such as open source, blockchain, AI and machine learning, but also applications related to smart cities, autonomous cars and connected health.

These technologies and fields of application involve the storage, exchange and processing of a large (sometimes very large) amount of data, and are still in their infancy. In any case, one thing is for sure: society is evolving, and cloud computing will continue to evolve as well, in order to better support it.


Top 10 Cloud Computing Groups on LinkedIn in 2020 – Analytics Insight

The professional networking site LinkedIn is a premier place for technology companies to gather information, connect with industry contemporaries, share ideas and develop a network. Moreover, even a cloud professional, be it a cloud engineer or an IT professional specializing in operating cloud solutions, can gain additional insights into what the smartest in the industry are talking about. For them, the various cloud computing LinkedIn professional groups are a good place to begin. Through them, cloud professionals will be able to stay aware and ahead in such a wide market space. Therefore, we have compiled a list of the top 10 cloud computing groups on LinkedIn in 2020, where professionals with similar expertise come together to discover new insights and best practices, and learn about the software and tools their contemporaries are using in their daily operations.

About: A group for Cloud Computing, Cyber Security & Virtualization professionals to expand their network of contacts, share ideas and discuss industry-related topics.

The group covers VMware, SaaS, PaaS, Cloud Security, Cloud Computing & Server Virtualization technologies, Enterprise 2.0 Applications, technologies and architectures, CRM, cloud services, data center, Software as a Service, and on-demand applications.

Members Count: 494,628 professionals

About: This is a group for people involved in Current Big Thing Cloud Computing.

Members Count: 436,745 professionals

About: An exclusive group for Virtualization & Cloud Computing professionals to network and discuss industry-related topics such as Virtualization, VMware, Microsoft Hyper-V, Citrix Xen, Security, Enterprise, Mobile, Storage, VCP, VCDX, Cloud Power, SaaS, PaaS, Data Storage, Security. Technical Q&A and news are all supported and encouraged.

Members Count: 97,520 professionals

About: Amazon Web Services is a global cloud solution provider, so far the main vendor to offer real tools for business to either move their applications to the cloud or build new solutions. Here you are welcome to link up with other users of AWS, or you may want to learn from others and follow news of developers putting up new solutions into the AWS Cloud.

Members Count: 40,910 professionals

About: The IBM Cloud LinkedIn Group is for Cloud experts, customers, Business Partners, analysts, and other stakeholders to discuss, share, and collaborate on Cloud Computing. This network will help better understand the potential of Cloud Computing, the attributes of Cloud Computing, and what it means as an evolving model for IT infrastructures building on concepts such as virtualization, utility and grid computing, and SaaS and driving expectations for access, consistency, and quality in the user computing experience.

Members Count: 28,149 professionals

About: The Telecom IT Updates group facilitates relationships, education, and new opportunities for professionals in the IT and Telecom industries. This group is sponsored by Telarus, Inc.

Members Count: 70,627 professionals

About: The Cloud Storage group was formed in order to provide a common ground for the introduction and advancement of cloud storage and computing technology

Members Count: 45,664 professionals

About: The Cloud Networking group was formed in order to provide a common ground for the introduction and advancement of cloud networking and distributed network computing technology

Members Count: 12,711 professionals

About: The Cloud Computing and SaaS Best Practices Group is an open forum for Directors, VPs, and C-level executives working in Product Management, Operations and Information Technology. Join in to interact with peers and discuss best practices in Cloud Computing, cloud hosting, and SaaS.

Members Count: 2,829 professionals

About: Two of the most vibrant information technology trends are open source software and cloud computing. In combination, they are multiplying value at a near Moore's Law rate. This group is for discussion and exchange of information about open source cloud computing (OSCC).

Members Count: 2,325 professionals


Southeast Asia Cloud Computing Market size to USD 40.32 billion by 2025 according to a new research report – WhaTech Technology and Markets News

IaaS held an approximate 55% share of the cloud computing market as of 2017, followed by Platform as a Service (PaaS). IaaS is expected to witness a fair amount of growth in the coming years owing to the increasing volume of business and critical financial data and other significant information among businesses in multiple sectors.

The report on the Southeast Asia cloud computing market documents a comprehensive study of different aspects of the cloud computing market. It focuses on the market's steady growth in spite of changing market movements.

Every market intelligence report covers certain important parameters that can help analysts define the market situation. It includes a thorough analysis of market trends, market shares and revenue growth patterns and the volume and value of the market.

It also covers methodical research.

Report: http://www.adroitmarketresearch.com/contactsample/383

The Southeast Asia cloud computing market revenue is estimated to reach USD 40.32 billion by 2025, driven by the increasing demand for cloud computing among emerging small and medium-sized business organizations in this region. Cloud computing has proven to be the ultimate leapfrog technology, allowing companies in smaller countries such as Indonesia, Thailand or Myanmar to connect to the rest of the world and compete with it.

Cloud computing uses a network of remote servers on the internet to manage, store and process data instead of using a local server. This technology has gained popularity among smaller and growing businesses due to its cost effectiveness.

The Southeast Asia cloud computing market share growth parallels the growing demand for data. Data access has been one of the key cloud computing market trends, driving and assisting the growth of small businesses, improvement in the e-commerce industry and the development of new technologies such as artificial intelligence (AI).

Read Complete Details at: www.adroitmarketresearch.com/industring-market

Singapore cloud computing market revenue share, by organization size, 2017 (%)

On the basis of organization size, the Southeast Asia cloud computing market was dominated by small businesses, primarily due to the presence of a large number of small businesses across the region. Small businesses have financial constraints and have to operate on really tight budgets.

Therefore, the installation and management of physical servers proves to be a costly affair for these businesses. Cloud computing platforms can cut costs as well as render these small companies more competitive in regional as well as global markets.

These platforms establish a robust IT foundation for companies to incorporate the latest wave of technological developments into their operations. This is one of the major cloud computing market trends expected to propel this segment during the forecast period.

The Southeast Asia cloud computing market is anticipated to be driven by Singapore during the forecast period and is expected to grow at a CAGR of more than 13%. A recent study by the Asia Cloud Computing Association (ACCA) ranked Singapore as the most cloud-ready country out of 14 Asia Pacific countries.

This is mainly due to the high quality of broadband services, enhanced cybersecurity and levels of business sophistication.

The Southeast Asia cloud computing market's major participants include Amazon, Akamai Technologies, CA Technologies, Alibaba, Cisco Systems and Google Inc., among others. The cloud computing market leaders are looking to this region to expand further.

For instance, in August 2018, Google announced the building of its new data center in Singapore, and Alibaba Cloud announced its second infrastructure zone in Malaysia. The expansion of Google's data centers in Singapore takes the company's total investment to USD 850 million.

The launch of Alibaba's new infrastructure in Malaysia will be certified for SAP hosting and bring new products such as elastic computing, database, networking and monitoring services to the market. These new developments from the cloud computing market leaders are expected to sustain the growth of the market during the forecast period.

Key segments of the Southeast Asia cloud computing market

Deployment Overview, 2015-2025 (USD million)

Product Overview, 2015-2025 (USD million)

Organization Size Overview, 2015-2025 (USD million)

Country Overview, 2015-2025 (USD million)

Enquire more details of the report at: www.adroitmarketresearch.com/researchreport/383


Current research: Cloud Hosting Service Market status and prospect to 2026 – WhaTech Technology and Markets News

Key Players: A2 Hosting, SiteGround, InMotion, HostGator, DreamHost, 1&1 IONOS, Cloudways, Bytemark Cloud, Hostwinds, Liquid Web Hosting, AccuWeb, FatCow, BlueHost, Vultr.

2020 Report on Global Cloud Hosting Service Market is a professional and comprehensive report on the Cloud Hosting Service industry.

Download a free PDF sample brochure of the report "Global Cloud Hosting Service Market 2020", with 134 pages and an in-depth TOC analysis, @ http://www.reportsnreports.com/contactme=2891895

The report focuses on the leading market competitors, explaining Cloud Hosting Service company profiles based on SWOT analysis to illustrate the competitive nature of the Cloud Hosting Service market globally. Furthermore, the report covers each company's recent market evolution, market share, associations and level of investment with other leading Cloud Hosting Service companies, and the monetary settlements impacting the Cloud Hosting Service market in recent years.

Development policies and plans are discussed, as well as manufacturing processes and cost structures. This report also states import/export consumption, supply and demand figures, cost, price, revenue and gross margins.

The report focuses on global major leading Cloud Hosting Service Industry players providing information such as company profiles, product picture and specification, capacity, production, price, cost, revenue and contact information. Upstream raw materials and equipment and downstream demand analysis is also carried out.

The Cloud Hosting Service industry's development trends and marketing channels are analyzed. Finally, the feasibility of new investment projects is assessed and overall research conclusions are offered.

Access this Latest Research Report @ http://www.reportsnreports.com/contactme=2891895

Geographically, this report is categorized into several main regions, covering the sales, proceeds, market share and expansion rate (percent) of Cloud Hosting Service in the following areas: North America, Asia-Pacific, South America, Europe, the Middle East and Africa.

Market segment by Type, the product can be split into:
- Linux Servers - Cloud
- Windows Servers - Cloud

Market segment by Application, split into:
- Commercial Operation
- Government Department
- Others

List of Tables

Table 1. Cloud Hosting Service Key Market Segments
Table 2. Key Players Covered: Ranking by Cloud Hosting Service Revenue
Table 3. Ranking of Global Top Cloud Hosting Service Manufacturers by Revenue (US$ Million) in 2019
Table 4. Global Cloud Hosting Service Market Size Growth Rate by Type (US$ Million): 2020 VS 2026
Table 5. Key Players of Linux Servers - Cloud
Table 6. Key Players of Windows Servers - Cloud
Table 7. Global Cloud Hosting Service Market Size Growth by Application (US$ Million): 2020 VS 2026
Table 8. Global Cloud Hosting Service Market Size by Regions (US$ Million): 2020 VS 2026
Table 9. Global Cloud Hosting Service Market Size by Regions (2015-2020) (US$ Million)
Table 10. Global Cloud Hosting Service Market Share by Regions (2015-2020)
Table 11. Global Cloud Hosting Service Forecasted Market Size by Regions (2021-2026) (US$ Million)
Table 12. Global Cloud Hosting Service Market Share by Regions (2021-2026)
Table 13. Market Top Trends
Table 14. Key Drivers: Impact Analysis
Table 15. Key Challenges
Table 16. Cloud Hosting Service Market Growth Strategy
Table 17. Main Points Interviewed from Key Cloud Hosting Service Players
Table 18. Global Cloud Hosting Service Revenue by Players (2015-2020) (Million US$)

Download Free Sample Report @ http://www.reportsnreports.com/contactme=2891895

In the end, the Global Cloud Hosting Service Market report's conclusion notes the estimations of industry veterans.


Tachyum’s Reference Design Will Be Used In a 2021 AI/HPC Supercomputer – Business Wire

SANTA CLARA, Calif.--(BUSINESS WIRE)--Semiconductor company Tachyum Inc. announced today that its Prodigy Processor AI/HPC Reference Design will be used in a supercomputer which will be deployed in 2021. This reference design will provide customers, partners, OEMs and Original Design Manufacturers (ODMs) a proven blueprint for building and deploying ultra-high performance Exascale AI/HPC supercomputers and datacenters in 2021.

Tachyum's Prodigy Universal Processor, slated for commercial availability in 2021, is a 64-core processor with a clock speed in excess of 4GHz that brings to market new Universal Computing capabilities. In normal datacenter workloads, Prodigy handily outperforms the fastest processors while consuming one-tenth the electrical power, and it is one-third the cost. In AI applications, Prodigy outperforms GPUs and TPUs on neural net training and inference workloads, and is orders of magnitude easier to program.

The reference design platform is architected to deliver the power-performance benefits of Prodigy to wide ranging applications in verticals, including AI/HPC supercomputing, Edge Computing, Datacenter operations (public and private cloud), Telecommunications, and Automotive. Today's datacenters are under relentless pressure to support both regular and AI powered workloads. To address this, a heterogeneous infrastructure is being built, with x86 servers for regular workloads and GPU/TPU accelerators for AI workloads. Additional hardware, plus time and energy are needed to move the enormous amount of data between the two computing silos.

Datacenters are provisioned with enough servers to accommodate peak customer demand; during off-peak hours, more than 50% of most cloud datacenter servers are powered down to save electricity costs. Servers equipped with Prodigy offer the highest performance at the lowest cost to power conventional applications and also provide low-cost AI on demand. Idle Prodigy servers can be seamlessly and dynamically powered up and used for AI training or inference workloads. With Prodigy, provisioning an AI environment becomes CAPEX-free, since the idle servers powered up to handle AI workloads are already on the books as capital equipment.

"We are excited that customers are interested in our technology. There is never a more exciting time for a startup than THE FIRST CUSTOMER, especially one who wants to deploy an AI/HPC supercomputer in 2021. The preparations at the supercomputer site are expected to start later this year, so that infrastructure is ready when Tachyum's first hardware arrives. Tachyum has responded to its customers' and partners' need for an easy blueprint to enable hyperscalers and qualified ODMs and OEMs to begin changing the competitive landscape, while accelerating the democratization of AI/HPC in the process," said Dr. Radoslav "Rado" Danilak, Tachyum founder and CEO. "Every Prodigy-equipped datacenter is also a low-cost, scalable AI/HPC datacenter."

Tachyum's Prodigy Universal Processor is the smallest and fastest general-purpose, 64-core processor developed to date, requiring 10x less processor power and reducing processor cost by 3x. Prodigy will directly enable a 32-Tensor Exaflop supercomputer and allow the building of machines more powerful than the human brain in 2021, years ahead of industry expectations. Prodigy reduces datacenter annual TCO (total cost of ownership) by 4x through its disruptive processor architecture and a smart compiler that has made many parts of the hardware found in typical processors redundant. Fewer transistors and fewer, shorter wires, due to a smaller, simpler core, translate into much greater speed and power efficiency for the Prodigy processor.

Interested organizations can obtain the Prodigy server reference design, or consult with Tachyum at https://www.tachyum.com/contact.shtml.

Follow Tachyum: https://twitter.com/tachyum https://www.linkedin.com/company/tachyum https://www.facebook.com/Tachyum/

About Tachyum

Named for the Greek prefix "tachy," meaning speed, combined with the suffix "-um," indicating an element (e.g. lithium), Tachyum is meant to evoke the notion of an element of speed. Tachyum emerged from stealth mode in 2017 to engineer disruptive intelligent information processing products. Tachyum's founders have a track record of solving problems caused by device physics in semiconductors to deliver transformational products to global markets, and are backed by IPM Growth, the Central & Eastern European venture capital platform, as Tachyum's lead investor. For more information visit: http://tachyum.com.


Cohesity loses cohesion: Rapidly diversifying firm has an identity problem – Blocks and Files

You thought you understood Cohesity well enough. It supplies hyperconverged secondary storage. It is basically doing a Nutanix on the secondary storage market and converging the file use cases for test and dev, compliance and other copy data users. The San Jose firm makes a golden master copy of a file and farms out virtual copies of it for temporary use, saving storage space.

Only it doesn't just do this. It provides a backup appliance. It tiers to the cloud. It provides file storage. It archives data. It can do disaster recovery. It can migrate data as well. So what is Cohesity in product positioning terms?

That's a tough question to answer, in that it doesn't fit in the standard product boxes. There are three main boxes to consider here: file storage, backup, and data management. We can easily populate these boxes with suppliers because that's mostly how they define themselves: by product fit to a market sector. A diagram shows what we mean:

Certain companies and products are known for file storage: Isilon, NetApp and Qumulo, for example.

Certain companies are known for backup, masses of them in fact, starting with Acronis and Asigra and running through the alphabet to Veeam and Veritas.

Other companies are known for copy data management, such as Actifio, Cohesity itself, and Delphix.

Some suppliers are known for file life cycle management, such as Komprise, and others for file access acceleration, such as InfiniteIO.

Where Cohesity fits, according to Michael Letschin, its director for Technology Advocacy, who briefed us, is in all three boxes. As we understand it, Cohesity's technology is based on its Span File System, a highly scalable, scale-out filesystem with some unique properties. For example, it can receive and present files using the NFS, SMB and S3 protocols at the same time.

Cohesity's software runs in scale-out clusters which are managed, in single or multiple geos, by the Helios SaaS facility.

Its generalised file data processing platform receives data from a variety of sources, does things with it, and makes it available to a variety of target use cases.

As a file store, Letschin said, it cannot do tier 0 file access work; it's not equipped for that low-latency, high-speed access activity. NetApp, Isilon and Qumulo can rest easy in that use case. But Cohesity can do the tier 1, tier 2, and tier 3 work, in what we can call the secondary file data or unstructured data world. And here, because of the breadth of its coverage, the firm could potentially reign supreme.

Backup is a way to get data onto its platform, an on-ramp, an ingest method. It built a HW/SW appliance product to do that, but is now switching to a software-only product available through subscription. This can run on-premises or in the public cloud. Cohesity can back up applications on physical servers and in virtual servers (vSphere, Hyper-V, Acropolis). It can back up relational and, via its Imanis acquisition, distributed databases. It can back up Kubernetes-orchestrated containerised systems.

The product can be a target system for backup products such as Veeam. It can write backup data out to archives in the public cloud (AWS, Azure, GCP) and also to tape via a Qstar gateway. The archive data can be written in an immutable form (write once, read many, or WORM).

It can tier file data to the cloud, leaving a reference access stub behind, and so save space on primary (tier 0-class) filers. And it can supply data to Test and Dev, with personally identifiable information detected and masked out. It can move backed-up VMs to the cloud ready to spin up if a disaster happens (CloudSpin) and even run them on the Cohesity cluster as a stop-gap.

Third parties have built applications that use Cohesity's software to do extra things, such as the ClamAV anti-virus product and the firm's own Splunk facility for small log file populations.

Customers can download these from the Cohesity Marketplace, running them on Cohesity's distributed infrastructure and using the Cohesity App SDK to access the data managed by the Cohesity DataPlatform. They have to buy the licence from the vendor or partner directly.

Almost all of its functions are policy-driven and set up through a clean UI.

It would seem that a lot of what a customer might want to do with secondary file/unstructured data can be done with Cohesity's software. (We're using "secondary data" to mean non-tier 0 data.)

This is why trying to position Cohesity in a single standard file storage activity-related box is like nailing jelly to a wall. All of which, its execs must be hoping, makes for remarkably sticky software.


Options for the Windows Server 2008 End of Life Blues – ITPro Today

Windows Server 2008's end of life has finally arrived for its last incarnation. Some IT operations will still have servers running it, though, since some industry- or workplace-specific applications don't work and play well on more recent Windows Server offerings. This means these shops are especially vulnerable to new security threats aimed at unsupported operating systems.

Microsoft isn't entirely abandoning those still dependent on Windows Server 2008. For three more years, users can continue to receive support by taking advantage of Microsoft's Extended Security Update program, which promises to supply "critical" and "important" security patches to those with active Software Assurance or subscription licenses. With a few restrictions, the program is also available to those still using SQL Server 2008, with patches limited to "critical" updates.

Related: Say Goodbye to Windows Server 2008 and Hello to Azure?

Extended Security Update is expensive, however, coming in at "75% of the full license cost annually," according to Microsoft. That would represent a broad range of pricing, since licensing costs for any Windows Server version vary widely across different editions. When Windows Server 2008 R2 hit the market, for example, a license could be as inexpensive as $469 yearly for the Web Server edition, or as expensive as $3,999 for the Enterprise edition.

For those who want to consider support options beyond the Microsoft offering and want to keep their instance on-premises instead of lifting-and-shifting to the cloud, there is only one solution that fits the bill.

0Patch (as in "zero-patch," and not to be confused with Oracle's OPatch utility) is a service of Slovenia-based ACROS Security that typically supplies security fixes to companies running currently supported versions of Windows. The fixes either address critical zero-day exploits that haven't yet been addressed by the vendor, or serve as a stopgap measure while vendor-supplied patches are being tested.

0Patch will keep some no longer supported software, including Windows Server 2008 as well as Windows 7, patched against security issues at a cost of a little over $25 annually per machine, with volume discounts starting at 20 computers.

ACROS CEO Mitja Kolsek told ITPro Today that while some of the patches might be based on vendor supplied patches, "We create a lot of patches ourselves."

"While having access to a vendor's patch is helpful in determining what the original developers thought was the best way of fixing the vulnerability, we often fix in a different way to minimize the code we change," he said. "Sometimes our fix is also better that the vendor's."

In addition, he said, the company has fixes for some security issues that have yet to be patched by Microsoft.

The company's reason for needing to "minimize" the changed code might be something that potential users might want to consider before signing up for the service. Any fix that 0Patch supplies is not in the form of a traditional patch, which replaces an entire changed file or application on the hard drive, but is a memory resident "micropatch" and is applied on the fly.

"0patch Agent is designed to inject a dynamic load library (DLL) into each running process so that it can then apply and un-apply micropatches in that process," 0Patch explains on its website. "While there are some processes that don't let themselves get injected this way, most processes will spend an additional 600-700 KB of memory each for hosting that DLL. On a typical Windows 10 system with [about] 100 running processes this means a memory consumption of 60-70 MB."

When asked if 0Patch's system presents a new security worry for users, Kolsek replied: "While we're trying hard to avoid that and utilize 20-plus years of experience in finding vulnerabilities, it's almost sure that there are vulnerabilities in our product, as are there in any other software product. We can also micropatch our own product, so fixing can be fast and deployment of the fix instant and unobtrusive for the user."

For those unwilling to pay for Microsoft support or to rely on a third party's unique solution for continued security updates following Windows Server 2008 end of life, the only solutions involve moving to the cloud.

The easiest solution here is probably Microsoft's, which will supply free security updates for three years to organizations that move their Windows Server 2008 workloads to its Azure cloud to run as a VM or managed instance.

A little more complex, but perhaps a more complete, long-term solution is being offered by Amazon Web Services with its End-of-Support Migration Program for Windows Server. With this program, users upload their unsupported workloads to the cloud and upgrade to a supported version of Windows Server in the process, using a compatibility layer to do things like redirecting APIs that have changed.

AWS says that the EMP technology is offered without cost, although users will have to pay a fee to have applications assessed and repackaged.

Other than these solutions, IT shops can face the final Windows Server 2008 end of life by upgrading to a newer version on their own, or they can continue to ride bareback and hope any security holes that surface don't lead to an attack by the black hats.

That last option is not recommended. As Rocky used to say to Bullwinkle, "That trick never works."


Maintaining Uptime in the Data Center Is No Game of Checkers – Data Economy

It's only Monday, but somehow there is already enough tech news to fill up your working day.

So to help you keep on top of things, Data Economy has listed the top five things you need to know today.

A recently published report predicts that the global data centres market will grow with a CAGR of 15.1% over the forecast period from 2019-2025, according to ReportLinker.

The study on the data centres market covers the analysis of the leading geographies, such as North America, Europe, Asia-Pacific and RoW, for the period of 2017 to 2025.

This time last week, Google Drive, Docs, Sheets, and Slides all briefly went down due to an unknown issue that affected the consumer and business versions of Google's productivity apps.

The word processing app, which also hosts Google's spreadsheet service Sheets and submission host Forms among others, failed to load for users worldwide at around 6:30 p.m. GMT on Monday, with many complaining of error messages saying systems had detected unusual traffic.

"Google Docs is utilised by millions of individuals and businesses worldwide, and an outage of this scale has undoubtedly led to frustration for the students and workers whose service has been disrupted," said Tim Dunton, MD, Nimbus Hosting.

"In fact, in today's digital climate, simple, safe and stress-free websites must be considered a priority for organisations.

"Therefore, all businesses must ensure they have efficient cyber security protocols in place, including a website and internal IT infrastructure which is modern, secure and constantly kept up to date."

Amdocs has announced the availability of its cloud-native Amdocs Service & Network Automation solution to support all aspects of service design, inventory and orchestration across physical, logical and virtual elements for all lines-of-business (LOBs), including enterprise/B2B, mobile, consumer broadband, as well as NFV and 5G-based services.

Amdocs also announced that a number of its customers are accelerating their network transformation and NFV journey with the deployment of Amdocs' cloud-native operations automation suite, including three integrated services providers in EMEA, a provider of telecommunications services in APAC, and an MSO in North America.

Pulse Secure has announced that Alex Thurber is joining the executive team as Chief Revenue Officer, where he will be responsible for global sales strategy, management and team development.

"We are committed to providing the easiest, most effective solutions for enterprises to secure access to applications and data across their hybrid IT environments," said Sudhakar Ramakrishna, CEO of Pulse Secure.

"We are excited to have Alex join our team and spearhead our forward momentum in the marketplace.

"Alex has the experience to drive growth of our industry-leading Zero Trust solutions into an ever-evolving security landscape."

Edge computing firm Scale Computing reports it achieved record sales in Q4, driven by its OEM partnerships and edge-based deal activity, exiting 2019 at a growth rate of over 90% in total software revenue.

As a result, Scale Computing recorded its best year yet, extending worldwide sales as it added hundreds of new customers, including gains in distributed enterprises, strategic partnerships, channel growth, and an expansion of its HC3 Edge product portfolio.

"Across industries and segments, companies are looking to drive out complexity and maximize application uptime," said Jeff Ready, CEO and co-founder of Scale Computing.

"Nowhere is this more apparent than in edge computing, where hands-on IT personnel and on-site resources are limited.

"Whether at the edge or in the data centre, customers want to reduce the complexity and minimize the cost of their IT processes, and at Scale Computing, we are delivering competitive solutions that solve the needs of our customers around the world, as evidenced by our ratings on Gartner Peer Insights, Spiceworks, TechValidate and TrustRadius.

"In 2020, we anticipate even higher growth for Scale Computing as a leading player in the edge computing and hyperconverged space, and we look forward to the successes this year will bring."
