ISG to Conduct Study on Private and Hybrid Cloud Providers – StreetInsider.com

ISG Provider Lens report will examine the top providers of managed hosting, hyperconverged systems and other services

STAMFORD, Conn.--(BUSINESS WIRE)--Information Services Group (ISG) (Nasdaq: III), a leading global technology research and advisory firm, has launched a research study examining providers of private and hybrid cloud services targeted to enterprise clients.

The study results will be published in a comprehensive ISG Provider Lens report, called Next-Gen Private/Hybrid Cloud Data Center Solutions & Services, scheduled to be released in June. The report will cover a range of private and hybrid cloud services hosted in data centers.

Enterprise buyers will be able to use information from the report to evaluate their current vendor relationships, potential new engagements and available offerings, while ISG advisors use the information to recommend providers to the firm's buy-side clients.

"The new report will look at ways private and hybrid cloud providers are helping enterprise clients achieve their business goals," said Jan Erik Aase, partner and global leader, ISG Provider Lens Research. Private and hybrid cloud providers are an important piece of many enterprises' IT infrastructure, he said. "These providers free up enterprises' resources to focus on the core business, while offering secure and reliable cloud services."

ISG has distributed surveys to more than 400 private and hybrid cloud providers. Working in collaboration with ISG's global advisors, the research team will produce five quadrants representing the services the typical enterprise client is buying in the private and hybrid cloud space, based on ISG's experience working with its clients. The five quadrants that will be covered are:

The report will cover the hyperconverged systems and hybrid cloud management platform markets on a global basis and the managed services, managed hosting and colocation services markets in the U.S., the U.S. Public Sector, Brazil, Germany, the Nordics, Switzerland, the U.K., Australia, the Benelux countries, France, and Malaysia/Singapore. ISG analysts Shashank Rajmane, Pedro L. Bicudo Maschio, Ulrich Meister, Wolfgang Heinhaus, Ian Puddy, Rohan Thomas, Angus Macaskill, Bruce Guptill and Richard Marshall will serve as authors of the report.

An archetype report will also be published as part of this study. This report, unique to ISG, is the study of typical buyer types of private and hybrid cloud services as observed by ISG advisors.

A list of identified providers and vendors and further details on the study are available in this digital brochure. A separate brochure for the U.S. Public Sector is also available. Companies not listed as private and hybrid cloud providers can contact ISG and ask to be included in the study.

About ISG Provider Lens Research

The ISG Provider Lens Quadrant research series is the only service provider evaluation of its kind to combine empirical, data-driven research and market analysis with the real-world experience and observations of ISG's global advisory team. Enterprises will find a wealth of detailed data and market analysis to help guide their selection of appropriate sourcing partners, while ISG advisors use the reports to validate their own market knowledge and make recommendations to ISG's enterprise clients. The research currently covers providers offering their services globally, across Europe, as well as in the U.S., Canada, Brazil, the U.K., France, Benelux, Germany, Switzerland, the Nordics, Australia and Singapore/Malaysia, with additional markets to be added in the future. For more information about ISG Provider Lens research, please visit this webpage.

A companion research series, the ISG Provider Lens Archetype reports, offers a first-of-its-kind evaluation of providers from the perspective of specific buyer types.

About ISG

ISG (Information Services Group) (Nasdaq: III) is a leading global technology research and advisory firm. A trusted business partner to more than 700 clients, including more than 75 of the world's top 100 enterprises, ISG is committed to helping corporations, public sector organizations, and service and technology providers achieve operational excellence and faster growth. The firm specializes in digital transformation services, including automation, cloud and data analytics; sourcing advisory; managed governance and risk services; network carrier services; strategy and operations design; change management; market intelligence and technology research and analysis. Founded in 2006, and based in Stamford, Conn., ISG employs more than 1,300 digital-ready professionals operating in more than 20 countries, a global team known for its innovative thinking, market influence, deep industry and technology expertise, and world-class research and analytical capabilities based on the industry's most comprehensive marketplace data. For more information, visit http://www.isg-one.com.

View source version on businesswire.com: https://www.businesswire.com/news/home/20220114005084/en/

Press:

Will Thoretz, ISG
+1 203 517 3119
will.thoretz@isg-one.com

Erik Arvidson, Matter Communications for ISG
+1 617 755 2985
isg@matternow.com

Source: Information Services Group, Inc.


Strata Identity Hosts Complimentary Webinar Featuring ESG Analyst on Identity and Policy Management for Multi-Cloud in 2022 – Business Wire

BOULDER, Colo.--(BUSINESS WIRE)--Strata Identity, the Identity Orchestration for multi-cloud company, announced today it will host a webinar featuring Jack Poller, Analyst for Enterprise Strategy Group (ESG) on identity and policy management challenges in a multi-cloud world.

WHO: Analyst Jack Poller covers Identity and Data Security for ESG. Drawing on more than 25 years of industry experience, Jack's expertise spans a broad range of systems, storage, networking, and cloud-based products and markets. Prior to joining ESG, Jack held marketing positions at storage and networking startups. He is also a software and hardware engineer, and has developed multiprocessor workstations and servers, 3D graphics, storage and networking systems.

WHAT: The reality of 2022 is that organizations live in a multi-cloud world. Today's hybrid mix of on-premises, SaaS, and public cloud infrastructures creates new, unprecedented challenges for effectively managing identities and policies. In this webinar, Jack Poller will present new research findings from ESG on the leading multi-cloud identity management issues facing organizations. You will learn:

The top five unexpected ways sensitive data is exposed through identity
The hidden risk that a lack of passwordless MFA access for key services poses
How to enforce consistent security across data center and public cloud environments
Why providing secure access to distributed environments for a remote workforce is critical

WHEN: Jan. 20, 2022 at 1:30pm EST

WHERE: This conference is accessible online with confirmed registration.

HOW: To register, visit this link. To schedule a conversation with Strata Identity, contact Marc Gendron at marc@mgpr.net or +1 617.877.7480.

About Strata

Strata is pioneering the concept of Identity Orchestration for distributed, multi-cloud identity. The Maverics Identity Orchestration Platform enables enterprises to seamlessly unify on-premises and cloud-based authentication and access systems for consistent identity management in multi-cloud environments. Strata's distributed approach to identity enables organizations to break the decades-old vendor lock-in that has prevented a broader transition of enterprise workloads to the public cloud. The company's founders co-authored the SAML open standard for identity interoperability, created the first cloud identity services, delivered the first open-source identity products, and are now building the first distributed identity platform. For more information, visit us on the Web and follow us on LinkedIn and Twitter.


Strengthening the availability chain – ITProPortal

What do you think of first when thinking about ensuring the high availability (HA) of your most important applications and data? If you or your customers need to be able to access those applications 99.99 percent of the time, it's natural to think first about ensuring access to the compute and storage resources. If you're running SQL Server in the cloud, for example, you can configure a Windows Failover Cluster Instance (FCI) to respond to the failure of compute or storage resources by automatically moving the compute and storage loads to an alternate node of the failover cluster. HA problem solved!

But what if it's not the compute or storage resources that fail? There are many links in the availability chain connecting you and your customers to those compute and storage resources. You need to consider all those links to ensure the HA experience you are striving to achieve.

If you're running your critical applications in the cloud, your cloud service provider is going to ensure the availability of the intranet connecting the components of your cloud infrastructure. AWS, Azure, and Google Cloud Platform all provide high-speed, robust internal networks with multiple paths, so the core cloud networks are fully capable of supporting your 99.99 percent HA goal.

You can't control how your customers connect to your cloud-based applications, but you can control how you connect to them. You might be using a VPN Gateway or a dedicated connectivity service such as Azure ExpressRoute, AWS Direct Connect, or Google Direct Interconnect. All these options can provide you with a high-speed, low-latency connection to the cloud, but they all offer different SLAs, and several of them expose weak links in the availability chain. The basic configuration of Azure ExpressRoute offers only a 99.95 percent availability guarantee; the basic configuration of AWS Direct Connect is even lower, only 99.9 percent. If either service fails unexpectedly, access to your critical applications could be constrained for far longer than you are expecting. Indeed, the VMs configured for HA in the Azure or AWS clouds may continue to run without interruption, but that's cold comfort if you cannot access them because ExpressRoute or Direct Connect is down.

You can configure Azure ExpressRoute or AWS Direct Connect for HA; it just takes planning. You'll need to configure at least two ExpressRoute circuits and four Direct Connect circuits to gain an SLA of 99.99 percent. If you're using the analogous services on GCP, you'll want to use the Google Direct Interconnect Service for Production-Level Applications rather than the Google Direct Interconnect Service for non-critical applications to get the 99.99 percent SLA.
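As a back-of-the-envelope check on why redundant circuits close that gap, the sketch below models N independently failing circuits. This is a simplification offered only for illustration; the contractual SLA of the redundant configuration is what the provider actually commits to.

    def combined_availability(single_availability: float, circuits: int) -> float:
        """Availability of N redundant circuits, assuming independent failures."""
        unavailability = 1.0 - single_availability
        return 1.0 - unavailability ** circuits

    # One basic ExpressRoute circuit at 99.95 percent vs. two in parallel
    print(combined_availability(0.9995, 1))  # 0.9995
    print(combined_availability(0.9995, 2))  # ~0.99999975, comfortably past 99.99 percent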

Even if you strengthen the weak links in the network, though, there remain potential weak links within the cloud infrastructure itself: among load balancers, DNS servers, identity and authentication servers, web server farms, and the like. Remember the very public outage at Facebook in October of 2021? Outages affecting access to Facebook's internal DNS servers, not the production systems supporting Facebook's primary lines of business, were responsible for bringing down the entire organization for hours. You need to look at these components of your overall infrastructure as well to ensure that you're fully configured for HA.

Google's SLA for DNS server services is 100 percent, which is encouraging, but its SLA for Cloud Identity services is only 99.9 percent. Similarly, AWS's Route 53 private DNS service strives to offer a 100 percent SLA, but its Directory Services offering tops out at 99.9 percent. The Azure Active Directory Basic and Premium Services offer a 100 percent SLA, but the SLA for Azure Active Directory Domain Services tops out at 99.9 percent.
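Those individual SLAs compound, because a request that traverses several services is only as available as the product of their availabilities. A quick sketch, using the published figures above purely as illustrative inputs, shows how a single 99.9 percent dependency drags the whole chain below 99.99 percent:

    from math import prod

    # Illustrative serial chain: compute/storage, DNS, and a directory/identity service
    chain = {"compute_storage": 0.9999, "dns": 1.0, "identity": 0.999}

    end_to_end = prod(chain.values())
    print(f"end-to-end availability: {end_to_end:.4%}")                           # ~99.8900%
    print(f"implied downtime per year: {(1 - end_to_end) * 8760:.1f} hours")      # ~9.6 hours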

As with network connectivity, there are things one can do to improve the reliability of the internal infrastructure supporting your critical cloud-based applications. For example, you can configure your AWS environment with multiple domain controllers, which can boost the reliability of the AWS Directory Services offering closer to the 99.99 percent accessibility levels you seek.

There are times, though, as in the seven-hour AWS outage of December 7, 2021, where even the most prepared organizations may encounter unexpected downtime. In the case of the AWS outage, the issues stemmed not from systems that customers were using but, as AWS notes, from errors occurring on an internal network designed to host foundational services, including monitoring, internal DNS, authorization services, and parts of the EC2 control plane. Indeed, in many cases the VMs upon which customer applications were running remained operational and fully compliant with HA SLAs, yet customers could not access their applications because of issues with gateways, internal DNS services, load balancers, and other components whose ability to operate properly was compromised by the cascading effects of the errors occurring on the internal network.

How can your applications remain operational and accessible when the weak link in the availability chain turns out to be the cloud itself? Your best option here is to rely on a multi-cloud disaster recovery (DR) solution. Essentially, you would create a mirror infrastructure to support your most vital applications in an entirely separate cloud. If your critical SQL Server infrastructure runs on AWS, for example, you would create an identical instance of SQL Server on Azure or GCP, an instance you could start up manually if the AWS cloud went offline. You will want to select a DR management solution that runs in both the AWS and Azure/GCP environments and that can automatically orchestrate the replication of data from the SQL Server instance in AWS to storage attached to the infrastructure in your Azure/GCP cloud environment. If you don't deploy the same DR management solution in both environments, you may not replicate your data properly between the clouds.

You'll also want to configure a high-speed virtual private network (VPN) connection between your primary and DR infrastructures. AWS, Azure, and GCP all offer VPN services that can enable a secure cloud-to-cloud connection (and there are third-party options as well), and this becomes the conduit through which your DR management solution replicates your critical data between the cloud infrastructures. Yes, if you were using an AWS VPN solution in December, it might have gone offline during the outage, but in this case that's okay. The DR management solution running on AWS replicates all the local write operations to its storage counterpart in the DR infrastructure as quickly as the network will allow, so by the time the AWS services went offline, the DR software would have replicated all (or nearly all) of the critical AWS data to the DR infrastructure. As soon as it was apparent that the primary cloud had gone offline, you would spin up the infrastructure in the DR cloud, and it could begin providing customer access to your critical applications with minimal disruption. You may not be up and running in the sub-five-minute timeframe you expect of an HA solution, but you would be operational far faster than you would be if you'd had to wait for seven hours for AWS to get its operations back online.
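To put rough numbers on that trade-off, here is a simple sketch with hypothetical figures (not drawn from the article): the data at risk is bounded by how far replication lags, and the downtime becomes the failover time rather than the length of the outage.

    # Hypothetical figures for illustration only
    write_rate_mb_per_s = 5      # average rate of writes on the primary
    replication_lag_s = 30       # how far the DR copy trails the primary
    failover_minutes = 20        # time to spin up and cut over to the DR cloud
    outage_minutes = 7 * 60      # length of the primary-cloud outage

    data_at_risk_mb = write_rate_mb_per_s * replication_lag_s   # worst-case unreplicated writes
    print(f"data at risk: ~{data_at_risk_mb} MB")
    print(f"downtime: {failover_minutes} min with DR failover vs. {outage_minutes} min without")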

Ultimately, configuring for HA is all about configuring to ensure the high availability of your application. You can create FCIs that will ensure the HA of your VMs and storage without difficulty. All cloud service providers are accustomed to accommodating you at that level. For true end-to-end HA, though, you need to pay extra attention to all the other links in the availability chain. Some will be weaker than you realize unless you take extra steps to strengthen them.

Dave Bermingham, Senior Technical Evangelist, SIOS Technology


NordVPN launches open source VPN speed testing tool – IT PRO

Virtual private network (VPN) service provider NordVPN has developed an open source speed-test tool that lets users compare the speeds of different VPN services objectively.

The company says the tool aims to address limitations of conventional VPN testing tools, including their lack of transparency, reliability, and universal applicability.

The tool is said to provide a unified, standard speed-testing methodology to accurately measure the network throughput of a VPN source.

The publicly available solution, hosted on GitHub, offers comprehensive test reports detailing the number of tests per hour, median download speed results, the arithmetic mean, and median winner results.

NordVPN recommends running a VPN performance test for three days with at least one test per hour for a more accurate comparison.
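As a rough illustration of the kind of aggregation such a report implies (provider names and numbers below are made up, and this is not the tool's actual code), collecting hourly samples and comparing medians might look like this:

    from statistics import mean, median

    # Hypothetical hourly download-speed samples (Mbps), one per provider per hour,
    # gathered over a multi-day run as the methodology recommends
    samples = {
        "provider_a": [412, 398, 405, 431, 388, 420],
        "provider_b": [377, 365, 390, 401, 372, 369],
    }

    report = {
        name: {"median_mbps": median(speeds), "mean_mbps": round(mean(speeds), 1)}
        for name, speeds in samples.items()
    }

    # The "median winner" is simply the provider with the highest median speed
    winner = max(report, key=lambda name: report[name]["median_mbps"])
    print(report, "median winner:", winner)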

Test subjects can include NordVPN, ExpressVPN, SurfShark, PIA, and PureVPN. Adding support for additional VPN service providers involves creating a personal branch of the tool's GitHub repository and updating the code.

"Lack of a unified approach towards how performance is measured leads to the industry's current situation: different researchers via different outlets showcase different results ranked in different order," explained NordVPN. "This conflicting information is bad for VPN users and providers alike. The confusion can make it difficult for users to make informed decisions."

NordVPN's speed-test tool utilizes virtual private servers (VPSs) from two different cloud hosting providers, Vultr and Linode, and treats test results as valid only if the speeds measured through both VPS providers are comparable.
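A minimal sketch of that cross-check (the tolerance and names here are placeholders, not taken from the tool): a run counts as valid only when the medians measured through the two hosting providers agree within a chosen margin.

    from statistics import median

    def results_comparable(speeds_host_a, speeds_host_b, tolerance=0.10):
        """Treat a run as valid only if the two VPS hosts' median speeds agree within `tolerance`."""
        med_a, med_b = median(speeds_host_a), median(speeds_host_b)
        return abs(med_a - med_b) / max(med_a, med_b) <= tolerance

    # e.g. Mbps measurements taken through the two hosting providers' servers
    print(results_comparable([410, 395, 402], [398, 388, 405]))  # True: medians differ by ~1%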

"Connection speed is one of the most important factors that affect user experience when they use a VPN," said Vykintas Maknickas, a cybersecurity expert at NordVPN. "People want to secure their internet traffic without having to sacrifice speed."

"However, testing VPN speed meaningfully is a complex procedure because countless factors can impact it," added Maknickas.



Nutanix Rajiv Ramaswami On His First Year As CEO – Forbes

Rajiv Ramaswami, President and CEO, Nutanix

Nutanix was founded in 2009 and shipped its first product in 2011. Moor Insights and Strategy has been researching the company since around 2016, writing several technical analyses and articles.

The company believed that a layer of intelligent software on top of relatively inexpensive commodity servers could provide the same operational resiliency and data redundancy as more expensive high-availability servers. Initially, Nutanix saw the opportunity in traditional storage arrays, developing software to combine local storage across several servers, resulting in a solution that looked like a conventional pool of shared storage. The result was the same availability and similar performance to traditional storage arrays, but without the complexity of storage arrays or storage area networks.

Combining Nutanix AOS Distributed Storage with virtual servers established the hyper-converged infrastructure (HCI) market: a fabric of servers and storage that is integrated, easy to use, and less expensive.

It has not always been smooth sailing for Nutanix, from shipping an appliance as a complete hardware and software stack, to discontinuing appliances in favor of software sales on standard OEM configurations, and now to switching from perpetual software licensing to software subscriptions.

A year ago, Rajiv Ramaswami was appointed President and CEO of Nutanix. I had the chance to speak with Ramaswami to review his first year and hear his vision for the future of Nutanix.

The second innings for Nutanix

Rajiv characterized 2021 as "an eventful year" and the beginning of a second innings for Nutanix. For Rajiv, I'm pretty sure the analogy is cricket rather than baseball! Nutanix was a pioneer in the HCI space with a great platform, well-liked by customers in the first innings. The second innings will be a period of growth and profitability built around four key priorities to become the de facto hybrid multi-cloud company.

Priority 1: Become a software subscription company

As I mentioned, Nutanix has transitioned from selling appliances to selling software and is now well on the way to an all-subscription model. There is a growing customer desire to subscribe to more services and move away from perpetual licensing models. Adobe and Autodesk are examples of companies showing that the shift has changed the customer relationship to be less transactional, more service-oriented, and more profitable, with predictable quarterly revenues.

Nutanix, as an evolving subscription company, has yet to be fully recognized by investors as it trades at a much lower valuation than similar peers.

Priority 2: Simplify the product portfolio

As a former product guy, I can relate to why this priority was high on Rajiv's list. Rajiv noted, "We had over 20 individual products, and that's very hard for sellers to sell and customers to buy and deploy." Confusing product portfolios are nothing new. Many software companies, large and small, fall into this trap.

This past year, Rajiv has made progress in aligning individual products into solution packages. The goal is to stop selling point products. For example, Nutanix will no longer sell networking as a standalone product. Networking will be part of the cloud infrastructure stack. Similarly, when it comes to management, operations, and automation, it will be sold as a package rather than trying to sell every piece separately.

Priority 3: Expand partner relationships

There is much competition in the hybrid cloud world. You can't swing a cat without hitting somebody who's trying to provide a solution, from infrastructure companies like HPE to software giants like VMware. Partnering in this space requires a high level of finesse. Rajiv is reportedly an avid bridge player, a skill that may have come in handy this year.

This year Rajiv has expanded partner relationships with several companies, including HPE, Citrix, and Red Hat.

Priority 4: Continue top-line growth

Nutanix is growing annual contract value (ACV) billing, the metric that shows how much a customer contract is worth by averaging and normalizing its value over one year, by 33% this past quarter, the highest growth rate in over two and a half years.

Revenues for the first quarter of fiscal 2022 were $379 million, a 20% increase year over year. The goal is to achieve sustainable free cash flow by the end of calendar 2022 and to be operating income positive two quarters after that.
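For readers unfamiliar with the metric, ACV simply normalizes a contract's total value to a single year; a quick, made-up example:

    def annual_contract_value(total_contract_value: float, contract_years: float) -> float:
        """Normalize a multi-year contract to its per-year value."""
        return total_contract_value / contract_years

    # Hypothetical deal: $300,000 over three years contributes $100,000 of ACV
    print(annual_contract_value(300_000, 3))  # 100000.0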

Nutanix beat and raised estimates every quarter for the last four quarters, and clearly, that needs to continue. Rajiv also noted that an essential element of continued growth is to "continue to build our talent base, becoming more diverse across everything, across locations, across gender, and different perspectives."

To be the hybrid multi-cloud company

Nutanix wants to be a hybrid multi-cloud platform company.

The company started by making infrastructure invisible. And for the first ten years, Nutanix did just that, breaking the silos across compute, storage, and the network with HCI.

That vision has now morphed to "making clouds invisible." Rajiv noted, "We see the same opportunity in the cloud. And that's a natural extension of what we already do. We're essentially building a hybrid multi-cloud platform while providing flexibility and choice at every layer of the stack."

The focus is on maintaining simplicity and delivering data across multiple cloud providers. There is still much growth in the core HCI market, but workloads have expanded, and Nutanix can now run any virtualized workload, such as ERP, database, security, and modern cloud-native workloads. The direction now is to extend the platform into the public cloud, onto multiple public clouds as a continuum, and then very selectively focus on a set of services above the infrastructure layer.

Today, the big bet for Nutanix is around database as a service (DaaS). The idea is a service that enables customers to set up, operate and scale databases without the need for setting up physical hardware, installing software, or configuring for performance. All of the administrative tasks and maintenance are taken care of by the service provider, so that all the user or application owner needs to do is use and access the database.

Wrapping up

It is the first time I have spoken with Rajiv, which is surprising since our careers followed similar trajectories across dot-com startups, systems, and chip companies. I found him to be very pragmatic and focused on very realistic goals. After only a year under Rajiv's leadership, Nutanix has scaled its product portfolio, partnerships, and customers, and its path to profitability is on track for the second half of 2022.

Rajiv has no illusions about being "a smaller web player in the land of giants," and success can only come from "clarity on focus rather than trying to boil the ocean." A large part of "focus" is knowing what you will not do, so I posed that question to Rajiv. Rajiv responded by giving two examples. Nutanix had an ambitious program to become a full-stack provider with Kubernetes. Nutanix Karbon is an enterprise-grade Kubernetes Certified distribution that integrates seamlessly with the entire Nutanix cloud-native stack. Nutanix provides a Kubernetes distribution but now chooses to partner for all other aspects such as management, backup, and observability.

A second example is the successful Xi Leap Disaster Recovery Service. Nutanix was on a quest to build many data centers worldwide to support the service. Although customers love the service, Nutanix cannot operate data centers at scale. The decision was made to limit the number of data centers and use a public cloud.

In the final minutes of the call, I had a chance to get to know Rajiv beyond the Twitter description of "avid reader, tennis and bridge fan." I discovered a passion for education, particularly a charity he is personally involved with that has built over fifty schools in rural villages and towns in India, taking on the hard work of actually building schools and getting them into operation with support from local governments. The charity is called One School at a Time, which is all volunteer-driven and is Rajiv's passion outside of work.

I look forward to discussing Rajiv's report card in 2022.

Note: Moor Insights & Strategy writers and editors may have contributed to this article.

Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, or speaking sponsorships. The company has had or currently has paid business relationships with 88,A10 Networks,Advanced Micro Devices, Amazon,Ambient Scientific,AnutaNetworks,Applied Micro,Apstra,Arm, Aruba Networks (now HPE), AT&T, AWS, A-10 Strategies,Bitfusion, Blaize, Box, Broadcom, Calix, Cisco Systems, Clear Software, Cloudera,Clumio, Cognitive Systems, CompuCom,CyberArk,Dell, Dell EMC, Dell Technologies, Diablo Technologies,Dialogue Group,Digital Optics,DreamiumLabs, Echelon, Ericsson, Extreme Networks, Flex, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud,Graphcore,Groq,Hiregenics,HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM,IonVR,Inseego, Infosys,Infiot,Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo,Linux Foundation,Luminar,MapBox, Marvell Technology,Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco),Mesophere, Microsoft, Mojo Networks, National Instruments, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek,Novumind, NVIDIA,Nutanix,Nuvia (now Qualcomm), ON Semiconductor, ONUG, OpenStack Foundation, Oracle, Panasas,Peraso, Pexip, Pixelworks, Plume Design, Poly (formerly Plantronics),Portworx, Pure Storage, Qualcomm, Rackspace, Rambus,RayvoltE-Bikes, Red Hat,Residio, Samsung Electronics, SAP, SAS, Scale Computing, Schneider Electric, Silver Peak (now Aruba-HPE), SONY Optical Storage,Springpath(now Cisco), Spirent, Splunk, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, TE Connectivity,TensTorrent,TobiiTechnology, T-Mobile, Twitter, Unity Technologies, UiPath, Verizon Communications,Vidyo, VMware, Wave Computing,Wellsmith, Xilinx,Zayo,Zebra,Zededa, Zoho, andZscaler.Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is a personal investor in technology companiesdMYTechnology Group Inc. VI andDreamiumLabs.


Emby vs Plex: Which media server is right for you? – nation.lk – The Nation Newspaper

Cord-cutting can be a scary change. Most of us want to keep our entertainment collections and services as streamlined as possible, like with one cable subscription. Moving online doesn't have to mean a chaotic assortment of apps, though. Media servers like Emby and Plex can help you keep everything in one place. So, which should you choose between Emby vs Plex?

Below, we break down how these two services work, how they compare to each other, and which one might be best for you.

Emby and Plex do a lot, and it's frankly hard to sum them up. They're both media servers, which means they allow you to stream various types of content from one place. If you've ever gotten tired of jumping from app to app or remembering passwords for different services and cloud storage solutions, you'll immediately understand the appeal of these services.

For all their faults, cable packages are pretty convenient. You pay a fee and get access to all your channels in one place. Media servers aren't exactly as streamlined as that, but they make the transition to cord-cutting a lot easier.

Both media servers bring together a user's various home media, like video, audio, and photos, centralizing collections and online services. From there, users can view and stream their content through players on their various devices.

You can use the servers to organize your third-party media services without signing up for anything. For example, if you already subscribe to Netflix or Tidal, you can just add those services to your Plex or Emby account and use them through their respective interfaces.

You can also use them to organize your own media files, though. And the services also include access to huge libraries of on-demand films, all free and ad-supported. You can watch classic titles and new releases without a subscription.

Emby and Plex are both free, but they do have paid tiers too.

If you just want the bare-bones service, just download either one and get started free of charge.

If you want some of the bells and whistles that really make these services shine, then there are a few differences to keep in mind, mentioned in the next section. Broadly speaking, both services offer similar paid tiers, which give you access to features like cloud syncing and DVR storage.

If you want to pay for a whole year, though, Emby is $54, while Plex offers the discounted rate of $39.99.

Both also offer users the option of a lifetime subscription with a one-time payment of $119.

One of Emby's standout exclusive features is Cinema Mode, which comes with a paid subscription. That gives you trailers and custom intros before films and gives you a sense that you're at the movies in your home. It's nothing huge in terms of general functionality, but it's definitely a fun bonus for those who want it. Similarly, you get more customizability with the user interface on Emby.

Plex, on the other hand, offers more add-ons, like the popular Unsupported App Store, where you can access unofficial extra channels with even more content. It's also generally easier to use and set up than Emby. That means you have less control over functionality but a generally smoother experience overall.

Emby and Plex are both entirely legal.

How you use them, on the other hand, is up to you. If you have a massive library of movies and TV shows that you downloaded illegally using torrent sites, you can certainly use these services to organize and stream your legally dubious stuff, but neither Emby nor Plex will magically change how you acquired your media collection.

The legally preferred use for these services is as content aggregators that let you bring together media and streaming services you already have legal access to and make them easier to access.

If you are using Emby or Plex for less than legal streaming, well, the next section might be especially interesting for you.

All jokes aside, there are plenty of very valid reasons to want digital privacy that have nothing to do with breaking the law. Even if you're using Emby and Plex to do entirely legal stuff, you'll want to know if you're protected.

Emby is an open-source platform. Everything you do on Emby is stored on your own server and Emby won't track it. You don't even need to be connected to the internet while using it. (Remote streaming via Emby Connect requires internet access, as do all web-based streaming uses, of course.)

There are plenty of reasons to want privacy beyond illegal downloading.

Plex, on the other hand, does collect user information. The stated purpose of this is to improve services, but if you're at all skittish about how your personal streaming data (legal or otherwise) might be used, it's important to know that.

There's so much overlap with these two services that we have to dig into the minutiae to really pick a winner.

Both services offer parental controls, are available on every major platform, offer add-ons and VOD titles, and more. And they come at similar prices.

But where's the fun in calling it a tie? In the Emby vs Plex battle, who takes the gold?

Plex's streamlined user experience, lower yearly price point, and slightly better add-ons make it inch ahead of Emby overall. Plex takes it.

Obviously, your specific preferences and needs may differ, so if you want to customize your experience or make sure you have top-notch privacy, Emby is still a solid option and likely the way to go for you.

Both services hit all the major bases you'd want, so there's really no wrong choice at the end of the day.


ThycoticCentrify adds new security controls and automation to Secret Server – SecurityBrief Asia

ThycoticCentrify has announced new and expanded capabilities for its specialised PAM solution, Secret Server.

With the addition of new security controls, automation and design updates, Secret Server builds on its secrets management capabilities and ease of use to offer more protection and higher productivity, the company states.

According to the Verizon 2021 Data Breach Investigations Report, credentials are the primary means by which bad actors hack into an organisation, with 61% of breaches attributed to compromised credentials.

To reduce this threat, all organisations, independent of size, location or industry, need robust, easy-to-use solutions in place to protect the accounts and credentials that allow access to these privileges, ThycoticCentrify states.

The latest Secret Server release brings stronger security controls to reduce risk. It allows organisations to rotate Secret Server's master encryption key on demand.

Rotating individual secrets housed within the digital vault provides an additional layer of protection to block external actors from gaining access to it.
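The general idea behind rotating a master encryption key can be illustrated with a generic Python sketch using the cryptography library's Fernet primitives; this is not Secret Server's actual implementation, only the underlying pattern of re-encrypting stored ciphertexts under a new key while the old key can still decrypt them.

    from cryptography.fernet import Fernet, MultiFernet

    old_key, new_key = Fernet.generate_key(), Fernet.generate_key()

    # A secret encrypted under the old master key
    token = Fernet(old_key).encrypt(b"db-admin-password")

    # MultiFernet decrypts with any listed key and re-encrypts with the first (newest) one
    rotator = MultiFernet([Fernet(new_key), Fernet(old_key)])
    rotated = rotator.rotate(token)

    # After rotation, the new key alone recovers the secret
    assert Fernet(new_key).decrypt(rotated) == b"db-admin-password"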

Secret Server also streamlines the connection process for organisations that use jump boxes to protect access to critical resources.

Rather than taking time to inject unique credentials at every connection point, users can now use a single key to navigate an entire route from launch, to jump box, to destination within a single session.

Users can launch the end-to-end route via Secret Server or the interface of the Connection Manager session management tool.

"Our continued focus on decreasing the steps required to safeguard secrets reduces the workload on security administrators and the attack surface area," says Jon Kuhn, SVP of Product Management at ThycoticCentrify.

He says, "As an example, our master encryption key rotation capability is simple to implement and provides an additional layer of protection to block external actors from gaining access to all the other keys stored on the platform."

To enhance auditing and compliance, Secret Server ensures that only one privileged user at a time can use a secret. When secrets aren't checked back in to Secret Server after use, critical maintenance operations can't be performed and productivity slows.

The latest release also automatically checks in secrets for API connections after expiration. Additionally, users now have more visibility into remaining time on a secret checkout and can extend the checkout if required.
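Conceptually, exclusive checkout with expiry and extension can be modelled in a few lines; the class below is only an illustrative sketch of the behaviour described above, not Secret Server code.

    import time

    class SecretCheckout:
        """Minimal model of exclusive secret checkout with expiry and extension."""

        def __init__(self, ttl_seconds: float):
            self.ttl = ttl_seconds
            self.holder = None
            self.expires_at = 0.0

        def checkout(self, user: str) -> float:
            now = time.time()
            if self.holder is not None and now < self.expires_at:
                raise RuntimeError(f"secret already checked out by {self.holder}")
            self.holder, self.expires_at = user, now + self.ttl
            return self.expires_at            # caller can see the remaining time

        def extend(self, user: str, extra_seconds: float) -> None:
            if self.holder != user:
                raise RuntimeError("only the current holder can extend the checkout")
            self.expires_at += extra_seconds

        def check_in(self, user: str) -> None:
            if self.holder == user:           # expired checkouts simply lapse
                self.holder, self.expires_at = None, 0.0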

Finally, the latest release includes enhancements to the Secret Server interface, logging and reporting to increase usability and accessibility through improved keyboard navigation and screen reader hints.

ThycoticCentrify is a cloud identity security vendor, focused on enabling digital transformation at scale. The company's PAM solutions reduce risk, complexity and cost while securing organisations' data, devices and code across cloud, on-premises and hybrid environments.

ThycoticCentrify is used by more than 14,000 leading organisations around the globe including more than half of the Fortune 100, and customers include large financial institutions, intelligence agencies and critical infrastructure companies.


PCIe 6.0 is here with double the bandwidth at 128Gbps – comments – GSMArena.com

GSM is trying to be like notebookcheck lol

Wereweeb, 13 Jan 2022: "PCIe 5.0 and 6.0 will mostly belong in servers for at least another five years. PCIe 4.0 is mo..."
Zen 4 will support PCIe 5.0 anyway

Wereweeb, 13 Jan 2022: "Bandwidth doesn't matter if the latency is astronomically higher than RAM. Every modern C..."
Please, next time read my comment with your eyes, not with your bottom. Where have I said bandwidth? Use Google Translate if you don't have enough English to understand. Even you have no idea about bandwidth.

Trollhammeren, 13 Jan 2022: "Double the bandwith double the cost .... MB prices are astronomical as it is ... Other then..."
Increasing SSD burst-workload sequential speeds is also of no importance to 99% of consumer workloads. Random read/write speeds and peak latency are the most important for most consumers (they determine how consistently "snappy" the drive is), with sustained sequential speeds being important for those who constantly move TBs of data around, and power consumption also being important to laptop users.

sadh, 13 Jan 2022: "Wtf, i bought a newest laptop and came with best PCIe and it's a PCIe 4, i thought that w..."
PCIe 5.0 and 6.0 will mostly belong in servers for at least another five years. PCIe 4.0 is more than enough for consumer devices.

Anonymous, 13 Jan 2022: "It is nearly about RAM Bus speeds, may be soon SSD's will be used as real RAM, not virtua..."
Bandwidth doesn't matter if the latency is astronomically higher than RAM. Every modern CPU is already slowed down if the data it needs is in RAM instead of already having been pre-fetched to its internal caches, and SSDs are absurdly slow when compared to RAM (1-15 ns latency for caches, 100 ns for RAM, 5,000-150,000 ns for NVMe drives).

Nothing will replace RAM in the next two decades. At most it can be supplemented by more CPU cache.

sadh, 13 Jan 2022: "Wtf, i bought a newest laptop and came with best PCIe and it's a PCIe 4, i thought that w..."
Welcome to hardware news, where most manufacturers don't catch up with latest technology until years later.

wow I can't wait to experience pcie 6 in 2030!

Yuri84, 12 Jan 2022: "Maybe in a few years, and then some. Usb4 is still out there somewhere... finalized a while ag..."
Maybe, but this is here and now so let's have a lot of fun.

Double the bandwidth, double the cost ... MB prices are astronomical as it is ... Other than the increased SSD speed, there are 0 benefits at the moment for using PCIe 4.0 x16 ... nothing can saturate it on the consumer side.

It is nearly at RAM bus speeds; maybe soon SSDs will be used as real RAM, not virtual RAM or extended RAM but real RAM

sadh, 13 Jan 2022: "Wtf, i bought a newest laptop and came with best PCIe and it's a PCIe 4, i thought that w..."
PCIe 4.0 took them a while to make! It was released back in 2017, but new standards aren't adopted right away. PCIe 5.0 was released back in 2019, but only now do we have the first consumer hardware with PCIe 5.0. 6.0 just got released, but it will take a few years to reach the consumer market!

Everyone must work on photonics I/O, which can offer up to 10,000 GB/s, not these puny speeds

I have a laptop which has a PCIe 2.0 x4 lane. Maybe next year I would buy PCIe 3.0 or 4.0

sadh, 13 Jan 2022: "Wtf, i bought a newest laptop and came with best PCIe and it's a PCIe 4, i thought that w..."
Your laptop was outdated since you bought it.

Ohh man... I was planning to change my PC motherboard and now this?! Was really hyped to get a new motherboard with PCIe 5 on board

paco2x, 13 Jan 2022: "Looks good... until you need a good graphics cards to get the best of it :("
It's mostly for storage; GPUs don't use all that bandwidth.

sadh, 13 Jan 2022: "Wtf, i bought a newest laptop and came with best PCIe and it's a PCIe 4, i thought that w..."
That's mostly because consumer PC products cannot even saturate the PCIe 4.0 lanes. Most use of PCIe 5 and 6 at the moment is for cloud/servers, where huge amounts of data are passed very quickly.

Looks good... until you need a good graphics card to get the best of it 🙁

sadh, 13 Jan 2022: "Wtf, i bought a newest laptop and came with best PCIe and it's a PCIe 4, i thought that w..."
Your buyer's remorse is delicious to me


‘Our servers are secure’ — NIMC responds as hacker claims he gained access to NIN database – TheCable

The issue of data security has been at the forefront since the federal government introduced the national identity database.

In December 2021, Isa Pantami, minister of communications and digital economy, had announced that 71 million Nigerians had been captured on the database.

As more Nigerians registered, is the NIN database free from hackers?

On Monday, a hacker identified as Sam claimed he successfully found a bug on the server of Nigeria's National Identity Management Commission (NIMC), revealing how easy it was for him to breach the server and access the personal information of millions of people.

According to Sam, he came across these data while sourcing for something else to help him decompile some applications he was working on.

"As usual, I am hunting for something in the source code of the application. As the scope is huge, I collected all the applications and decompiled them all at once with apktool with this command: find . -iname '*.apk' -exec apktool d -o {}_out {} \;" he said.

"Now I started to look for something juicy in the decompiled files, but as there are about 50+ applications, I can't look at each of them manually, right? I just got an idea of nuclei, and boom, I knew there are templates for android applications. I just downloaded them and started nuclei on the whole directory."

"After 18-19 mins of a run, Nuclei gave an output saying 'S3 Bucket Found'. I tried to access it via AWS CLI, and it's like: 'Access denied'. No luck there."

"Then after a few mins of running, I've got one more output for an S3 bucket. I casually tried to access it without any hope, and damn! The S3 bucket is full of juice."

"And I was just like: I just simply got access to their data of internal files, users, and everything they have. I can download everything, even the whole bucket."

The hacker also posted the data he obtained in the process, a copy of the national identity slip from NIMC, but defaced it to hide vital information.

A security expert explained that Amazon secures S3 buckets by default but for a bucket to be publicly accessible to any hacker, as was the case with Sam, someone must have leaked it.
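For teams that want to verify their own exposure, a minimal boto3 sketch along these lines checks whether a bucket's public access block is in place (the bucket name is a placeholder, and this covers only one of several controls that determine whether a bucket is publicly reachable):

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")
    bucket = "example-bucket-name"  # hypothetical bucket name

    try:
        config = s3.get_public_access_block(Bucket=bucket)["PublicAccessBlockConfiguration"]
        if all(config.values()):
            print("public access fully blocked")
        else:
            print("weakened public access settings:", config)
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            print("no public access block configured; bucket policy and ACLs decide exposure")
        else:
            raise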

Hours later, the hacker recanted, saying the leaked server was not from any Nigerian portal but from Tecno Mobile.

He said he reported the case to Tecno, and the bug was fixed.

He also edited the article published on Medium and removed a copy of the national ID posted as a screenshot in the story, but failed to explain why he mentioned Nigeria's ID database in the previous version.

Speaking with TheCable on the development, Boye Adegoke, senior program manager at Paradigm Initiative, said there is the possibility of negligence on the part of NIMC.

"If the story is true, it is negligence on the part of NIMC, but what is more worrisome is the fact that after this, what happens next? Are we going to talk and act as if nothing happened? Will someone get punished?" Adegoke asked.

The data privacy activist noted that the approach and attitude of NIMC toward the management of national data is poor.

"I wouldn't really be surprised if this is true because I have always believed that the cyber security approach and our attitude show we don't understand the process and how it works," he added.

In a statement on Tuesday, NIMC said its servers are secure for identity management and optimised.

"The National Identity Management Commission (NIMC) wishes to inform the public that its servers were not breached but are fully optimised at the highest international security levels as the custodian of the most important national database for Nigeria," the statement reads.

The NIMC Director-General stated that the Commission does not use or store information on the AWS cloud platform or any public cloud, despite the usefulness of the NIMC Mobile App available to the public for accessing their NIN on the go.


How this Mumbai startup is carving a niche for itself in the crowded ecommerce delivery space – YourStory

With the third wave of the pandemic upon us, life has moved online again. While there are a number of ecommerce delivery startups trying to make life easier for customers, Mumbai-based edobo is carving a niche for itself in the space.

edobo is an ecommerce grocery and household supplies app that is working to meet the needs of customers from the comfort of their homes.

Launched in 2019 by Ravi Narayan Jadhwani, the startup offers over 3,500 products to residents in Mumbai, Navi Mumbai, and parts of Thane across 120 pin-codes through its app and website.

Ravi's family has been running a successful infrastructure business for over forty years. However, he personally felt drawn to the technology space. Labelling the power of the internet and the accessibility provided by smartphones as a boon for businesses everywhere, Ravi was keen to take advantage of the opportunities provided by modern technology.

He began his journey by setting up an outbound call center named 'Xzines Diligent', which unfortunately shut operations after the 2008 recession. He then joined his family business but kept an eye out for developments in technology.

"In the last decade, the use of smartphones has skyrocketed, and people are relying increasingly on mobile-friendly apps to meet their needs. It is the perfect medium for revolutionary service delivery. This inspired me to get into ecommerce by establishing edobo," shares the founder.

Ravi claims there was no single app that fulfilled the needs of consumers while focusing on safety and timely delivery, which made him establish edobo.

"edobo is thoughtfully designed to adapt to new challenges, be it lockdowns or remote work. Customers can now grocery shop from home as easily as they work from home. Unlike other ecommerce providers, edobo ensures that our products remain 99.9 percent bacteria-free. This extends the shelf life of the products and promises safety for our customers," Ravi says.

There are multiple delivery slots in a day and each slot lasts for three hours, allowing customers to choose the slot, which is most convenient for them. Customers can also opt for a 30 minute express doorstep delivery.

To maximise efficiency and ensure customer satisfaction, the startup claims to offer over 3,500 products priced lower than its counterparts sold in supermarkets and local grocery stores. It plans to include over 17,000 products in various categories soon.

Ravi shares that edobo is committed to empowering small and medium businesses (SMBs) in the neighbourhood.

edobo says it delivers to these MSMEs early in the morning in the exact required quantities, which also allows them to save money by preventing wastage of resources and capital.

"We also give them backend remote support by providing credit, which they can pay back by accepting QR payments directly from the customers and crediting straight to their partner wallet to re-procure daily needs," Ravi says.

Edobo provides around 14 categories of products to customers, including fresh fruits and vegetables, grocery staples and oils, organic items, eggs, meat and fish, snacks and packaged foods, beverages, desserts and ice cream, cleaning and household goods, personal care items, health and wellness products and pet food as well.

According to the founders, its delivery executives wear uniforms and are trained at regular intervals for an enhanced customer experience. An in-house customer support team handles customer communication and delivery queries.

Ravi stresses the fact that one can enjoy the lowest prices and frequent discounted offers on every product. He shares, "We provide fast and secure payment solutions through UPI, net banking, credit and debit cards, and cash on delivery. edobo's heart is in the most secure cloud servers on earth. We use AWS infrastructure for scalability."

According to a Statista report on the market size of the ecommerce industry across India, the average retail ecommerce revenue collected per user in India in 2018 was more than $50. It is estimated to cross $75 by the year 2024. Hence, going by the projection of this report and general consumer preferences, localised players like edobo are well-placed for growth.

Ravi launched the business with an initial investment of Rs 3 crore, and the startup claims to have seen significant growth since then. Currently, edobo has over 20,000 active customers.

He says, "The pandemic changed the ecommerce industry tremendously. People have completely changed their buying behaviour from experiential buying to virtual buying. The Indian ecommerce industry has witnessed a drastic shift in the number of online buyers and this is continuously increasing."

Ravi Narayan Jadhwani, Founder, edobo

Ravi shares that they rely heavily on social and digital media to reach their customers. Word of mouth from satisfied customers and society activation measures in selected locations are other ways of organic advertisement about their services.

While edobo sees no direct competitors due to its smaller scale, it says that, apart from same-day delivery (made possible by a smaller service area and hand-picked products), its USP is the UV sanitisation of products to keep them 99.9 percent bacteria-free.

"All the products undergo UV sanitisation at its warehouse to maintain the hygiene levels. It is one of the mandatory steps of the delivery process," claims the founder.

When asked about challenges, he says, "Our biggest challenge till now has been to provide quality products in categories like fresh fruits, vegetables, and other groceries. Further, it has been our constant endeavour to keep our employees and staff members hygienic and healthy in the current situation, which has also been challenging."

Looking to the future, edobo is ready with plans to expand to other cities and also for opening its own brick and mortar stores. It is also looking for investors to raise funds in the near future.
