
How cloud is turning to be an effective tool for healthcare industry during Covid-19 – Express Computer

By Khushboo Jain

Healthcare in the digital age has become a place where a tremendous amount of data is generated on a daily basis. Patients' medical and financial details, as well as any research, are just some of the data that is generated, and maintaining a quick and secure database is of utmost importance.

With the coronavirus outbreak, hospitals and clinics are being overwhelmed with patients. The amount of data that needs to be generated or shared and the speed at which it needs to occur puts a lot of pressure on healthcare professionals. Luckily for them, cloud computing could provide a quick, secure, and cost-effective solution.

Cloud computing comes with a unique set of benefits that make it a strong fit for the healthcare sector.

Management of servers
The advantage of cloud-based systems for healthcare is that managing data is not the job of the healthcare provider. With talented IT professionals keeping watch and managing the system, healthcare providers are able to focus on other important facets of healthcare.

Cost benefits
With cloud computing, it is easier to oversee the services you pay for and make cost-effective decisions. By making a custom plan to fit your needs, you can negotiate a deal that is a lot more cost-effective than setting up your own systems.

Designed to manage a tremendous amount of data
As stated earlier, healthcare and its related sectors generate a lot of data. For example, medical images like scans are extremely detailed and generate high-resolution images, utilizing a lot of data. Much of this data needs to be stored for the patient's entire lifetime, not to mention kept secure. Physical storage is inconvenient, and cloud computing provides an easier alternative.

Fast speeds
With patient numbers increasing, speed is of utmost importance. Accessibility to faster cloud servers makes it easy to upload, share, and recover data at a quick pace. It also gives us the ability to make changes faster. Exchange of data and communication between healthcare workers, hospitals, research centers, and funding services like medical crowdfunding creates a better healthcare environment. Time is of the essence in healthcare, and with cloud, we can now be a lot more time-efficient.

Security and protection
Cloud computing has come a long way when it comes to addressing security concerns. The use of private and hybrid cloud systems has ensured that the medical and financial details of a patient remain secure. For example, if a hospital has a patient that needs to raise funds using a crowdfunding platform, there can be a secure exchange of data between the platform and the hospital using cloud systems. Moreover, remote servers keep data more protected from any on-location hazard and also reduce hassles during data recovery.

The opportunities that cloud computing gives to healthcare systems:

Scalability
The needs of a healthcare service provider may change with time. Scaling cloud services according to those requirements is easy. Cloud allows you to scale up or down quickly, letting you meet your current needs, prevent unnecessary expenditure, and still allow for future growth.

Ability to update
Technology is in a constant state of change and innovation. As systems upgrade, data will need to be changed or updated. Whenever these changes do occur, updating data using cloud will be much easier and quicker. Having a cloud-based system will enable you to update your data, applications, and systems as quickly as possible.

Allowing easier collaborations
In the digital age, the sharing of resources is important to create better opportunities for patients. For example, collaborating with other healthcare providers can provide better services, while collaborating with crowdfunding and other alternative funding options enables patients to afford them. Collaborations like these create a better healthcare system for everyone.

Using cloud data in telemedical practices
During this pandemic, doctors and patients alike are at risk of contracting the virus in hospitals. During this critical time, telemedical practices can help healthcare workers continue to provide safe healthcare remotely. These modern medical systems need to transfer patient data back and forth at high speeds, something cloud handles easily while also maintaining doctor-patient privacy. By involving cloud computing in telemedical systems, we can now have a safe system, both physically and digitally.

More resources to focus on medical needs
From the advantages outlined above, we can conclude that cloud-based systems drastically reduce the resources healthcare systems need in order to manage data. They save time, money, and other important resources. The availability of these resources allows healthcare service providers to concentrate on providing better services, which should be their primary focus.

Early adopters of cloud services have been reaping the benefits for some time now. This only proves that cloud computing is not just viable but essential to healthcare, and it needs to be adopted now more than ever before.

(The author is Co-Founder and COO, ImpactGuru.com)

If you have an interesting article / experience / case study to share, please get in touch with us at [emailprotected]


The pros and cons of moving to the cloud – Accounting Today

There has been much talk in recent years about migrating your accounting firm to the cloud. With the recent outbreak of COVID-19, it has become even more appealing as the work-from-home movement is in full force.

While it is a hot buzzword, many do not understand what it means to move to the cloud. Essentially, moving to the cloud means that you do not need to be physically in an office in order to access centralized data. This can be your clients' tax returns, documents, spreadsheets, and anything else that in the past you needed to log in at your desk in the office to access.

Change is never easy, but during these unprecedented times it may be a good idea for you to migrate your firm to the cloud. Some organizations are opting to dip their toes in the water and do a hybrid solution, where the main server is cloud-based, but the other machines are still physical.

While there are a huge number of advantages to moving to the cloud, there are some cons that are associated with it as well.

To help you decide which is best for your firm, weigh the following pros and cons.


Veritas taps NetBackup as the beating heart of universal data management – Blocks and Files

Veritas is laying the foundations for a universal data management platform, with NetBackup at its centre.

The company released NetBackup 8.3 today, and don't let the point release fool you. This is a huge update, with a slew of new features.

NetBackup customers can standardise on a single platform covering hybrid and multi-cloud environments and save money, according to Veritas, which has combined the software with the Veritas Resiliency Platform (VRP) and CloudPoint into the Enterprise Data Services Platform (EDSP).

Deepak Mohan, EVP for the Veritas Products Organisation, said in a prepared quote: "We're extending enterprise-grade data protection and the most robust set of recovery options to every corner of our customers' IT environments, from on-premises physical to virtual, to cloud and even to containers."

NetBackup 8.3 includes:

There's more. NetBackup 8.3 has cloud-native data protection for AWS, Azure and GCP. There is workload and data portability in hybrid and multi-clouds, with to-the-cloud and between-cloud storage tiers. Veritas has extended cloud-to-anywhere portability, adding Azure Stack to Azure Stack and Azure region-to-region, with push-button orchestrated disaster recovery using Veritas Resiliency Platform integration.

Veritas has added storage cost optimisation with integrated management and reporting from Veritas APTARE IT analytics.

NetBackup 8.3 reduces discovery time from hours to minutes for large environments, with a 50x speed-up for VMware vCenter and vCloud. There is 25 per cent faster dynamic NAS data protection via auto-discovery of resources and load balancing, with the ability to restore data anywhere on any NetBackup target. This removes vendor lock-in, according to Veritas.

Check out the NetBackup 8.3 data sheet.

Veritas is a veteran enterprise backup supplier with more than 80,000 customers, and traditionally competes with the likes of Commvault and Dell EMC. In common with these rivals, it is fighting a catch-up war with three groups of vendors.

Veeam and Acronis are backup vendors that have ridden the server virtualization wave. Actifio, Cohesity and Rubrik are in a second group that has pioneered secondary data management functions such as copy data management. They use backup as a data generating source for these functions.

A third group consists of vendors such as Clumio and Druva that provide backup as a service and in-cloud backup. Some vendors in this category (HYCU is an example) specialise in data protection for Nutanix and Azure.

Veritas is on the money in articulating the need for universal data management across on-premises, hybrid, and multi-cloud environments. The core features should include cloud protection, workload migration and disaster recovery functionality. However, this is a big ask, and Veritas is in a race to deliver such a service and prevent customer erosion.


The Style Of Cloud Networking In The Corporate Datacenter – The Next Platform

It is easy to understand the lure of the public cloud's siren call. There's the flexibility and agility to enable immediate elastic scaling up or down as needed; the tools and services needed for running modern workloads like artificial intelligence and data analytics; the removal of headaches related to deploying and managing vast numbers of systems (enterprises in large part no longer want to run their own datacenter infrastructures); and the cost efficiencies, with no longer having to pay upfront for hardware and instead leveraging models like pay-per-use.

The benefits of the cloud have been put in even greater focus during the COVID-19 pandemic, which has forced businesses to accelerate their digital efforts to adapt to a rapidly evolving business environment where most people are working remotely. In addition, businesses are seeing their revenues and budgets shrink in the wake of the public health crisis, driving many of them to look to the cloud to run more of their workloads. Synergy Research Group noted that in the first quarter, while spending on traditional datacenter hardware and software fell 4 percent year-over-year, revenues in the public cloud datacenter infrastructure market grew 3 percent.

Despite all this, the majority of workloads (some estimates put it at 70 percent) still run in traditional datacenters. There are myriad reasons, from security and compliance concerns to the costs that come with moving some larger applications and data sets to the cloud. While a report last year by Virtustream (a cloud company owned by Dell Technologies) found that organizations were moving more mission-critical applications to the cloud, the belief among most vendors is that for the foreseeable future, most enterprises will continue to run in this hybrid mode, with some workloads remaining on premises in the datacenter and others running in the public cloud, and in most cases in multiple public clouds.

Given that, a growing trend is to make the movement of workloads and data between on-premises and public cloud environments easier and faster, and to bring cloud-like features to the datacenter. Most recently, Hewlett Packard Enterprise has been leveraging its GreenLake platform as the foundation for its efforts, and Pure Storage rolled out its Purity 6.0 for FlashArray operating system, with a range of features that are available through its Evergreen subscription model. Oracle earlier this month announced Oracle Cloud@Customer, which is a way to bring the full cloud experience into the datacenter.

VMware also has become a significant player in the hybrid cloud space over the past several years and has a goal to make an enterprise's traditional datacenter or private cloud as flexible, efficient and cost-effective as a public cloud.

"We are helping our customers make their private cloud as agile, as efficient, as flexible as the public cloud infrastructure," Tom Gillis, senior vice president and general manager of VMware's Networking and Security Business Unit, said at a recent press briefing. "This is increasingly important because our customers say that if they don't deliver this level of efficiency, more and more of their internal constituents are going to look to the public cloud. But the public cloud can't always meet the security requirements and the cost requirements or other concerns, data privacy concerns. Having the ability to make your private cloud infrastructure programmatic and efficient is really critical."

VMware began its journey beyond its server virtualization roots and into the software-defined datacenter (SDDC) space when it bought software-defined networking (SDN) startup Nicira in 2012 for $1.26 billion, bringing aboard the technology that would form the basis of its NSX networking platform. Over the next several years it would build on the technology with NSX-T, support for virtual machines (VMs), containers and bare-metal infrastructures. VMware created its vRealize management suite, bought VeloCloud for software-defined WAN and two years ago launched its Virtual Cloud Network to enable organizations to connect and secure applications and data as workloads move outside of the datacenter.

A year ago, the company bought startup Avi Networks, whose technology essentially balances workloads and application delivery both in the cloud and in the datacenter. It was the one-year anniversary of that acquisition that brought Gillis and other VMware officials to speak to the media this week about not only the adoption of VMware's NSX Advanced Load Balancer (based in large part on the technology inherited through the Avi deal) but also new features in the latest version of the product.

"Networking is obviously connectivity that you get from switching and routing, and NSX is known for that. But as we've expanded the NSX portfolio, created this family of products, we now have the NSX Service-defined Firewall and the NSX Advanced Load Balancer, which gives you all the necessary services you need to fully define a workload and deploy it with a single-click strategy," Gillis said. "That's what motivated the acquisition of Avi. They had built a very unique software architecture. There are other software load balancers on the market, but there is only one software load balancer that has a scale-out architecture, which means you can keep adding little data planes and create one giant logical load balancer. Having that as part of the NSX portfolio has allowed us to really complete this vision of a public cloud experience in your private cloud infrastructure."
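
Gillis's scale-out point is easiest to see in miniature: one logical load balancer whose capacity grows by adding small data-plane instances to a shared pool. The toy sketch below is vendor-neutral and is not VMware's or Avi's code; a least-connections policy stands in for the real distribution logic.

```python
class ScaleOutBalancer:
    """Toy model of a scale-out load balancer: one logical service
    backed by many small data-plane instances added at runtime."""

    def __init__(self, instances):
        # Track open connections per data-plane instance
        self.connections = {name: 0 for name in instances}

    def add_data_plane(self, name):
        # Scaling out: another instance joins the same logical pool
        self.connections[name] = 0

    def route(self):
        # Least-connections policy: pick the least-loaded instance
        target = min(self.connections, key=self.connections.get)
        self.connections[target] += 1
        return target

lb = ScaleOutBalancer(["dp-1", "dp-2"])
print([lb.route() for _ in range(4)])  # traffic spread across two planes
lb.add_data_plane("dp-3")              # capacity grows in place
print([lb.route() for _ in range(4)])  # the new plane absorbs load at once
```

The point of the pattern is that the logical service never changes: clients keep addressing one load balancer while capacity is added underneath it.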

VMware not only is integrating the Avi load balancing technology into its own networking portfolio but also in other products, such as its Carbon Black security offerings, automated orchestration tools and Tanzu Kubernetes platform. Since the Avi acquisition, 7,000 traditional hardware-based load balancers have been replaced with VMware software and the customer base for NSX Advanced Load Balancer has grown about 70 percent, from 6,500 companies to more than 15,000. That includes six of the top 10 financial services companies, Gillis said. In addition, VMware has been able to deliver more than a million transactions per second for a single application.

The software load balancer, which runs on standard x86 servers, can scale horizontally in seconds or minutes, rather than the weeks or months needed for hardware appliances. This becomes even more important given the changes being forced on businesses by the coronavirus outbreak.

"The first thing we needed to do was just react to a global shift that suddenly everybody's branch office was in their living room," he said. "We have a number of customers that are ramping up remote access technologies like VDI (virtual desktop infrastructure), so having a load balancing solution specifically designed to solve those use cases has been a real win for us, and we've been able to help our customers adapt and adjust. The second thing we need to do is stabilize the operation and find some efficiency here. That's the phase that many customers are in now. COVID, while it presents so many challenges, also creates opportunity. Smart companies are using this as a time to rethink how we accelerate our digital initiatives and how we can be faster for the future and not be beholden to old legacy infrastructure, and infrastructure that requires people to be onsite in buildings. All of those things are things that we're leaving behind us, and that really has created some uplift for the VMware portfolio and the NSX Advanced Load Balancer in particular."

VMware is putting new features into version 20.1 of NSX Advanced Load Balancer, a platform that includes not only load balancing but also a web application firewall (WAF), application analytics and Kubernetes ingress services in the datacenter and cloud, with the software available for both VMware and non-VMware environments. The new features include enhancements to more easily install global load balancing updates and to offer full integration with Google Cloud Platform and VMware's NSX-T. Security updates include automated Pulse cloud services and case management and WAF threat feeds, while VMware solutions are consolidated with vRealize Orchestrator and vRealize Automation. VMware's new architecture for consolidated Kubernetes Ingress Services is aimed at streamlining container deployments for multiple clusters and multiple sites.
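
For context on the Kubernetes piece: ingress services of this kind consume standard Ingress resources. Below is a minimal sketch using the official Kubernetes Python client; the host, service and namespace names are hypothetical, and whichever ingress controller is installed (VMware's or any other) is what actually programs the data path.

```python
from kubernetes import client, config

config.load_kube_config()  # assumes a kubeconfig with cluster access

# A plain networking/v1 Ingress routing app.example.com -> app-svc:80.
# The installed ingress controller watches these objects and programs
# the actual load-balancing data path.
ingress = client.V1Ingress(
    metadata=client.V1ObjectMeta(name="demo-ingress"),
    spec=client.V1IngressSpec(rules=[
        client.V1IngressRule(
            host="app.example.com",
            http=client.V1HTTPIngressRuleValue(paths=[
                client.V1HTTPIngressPath(
                    path="/",
                    path_type="Prefix",
                    backend=client.V1IngressBackend(
                        service=client.V1IngressServiceBackend(
                            name="app-svc",
                            port=client.V1ServiceBackendPort(number=80),
                        )
                    ),
                )
            ]),
        )
    ]),
)
client.NetworkingV1Api().create_namespaced_ingress("default", ingress)
```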

"The Kubernetes Ingress services, the ability to support modern applications, to have the ability to provide all of those networking services into Kubernetes applications, has been an important part of the product," said Chandra Sekar, a senior director of marketing at VMware who came to the company with Avi. "Now, the critical piece here is the ability to interact and integrate with a lot of the newer technology stack as well. We've always had integrations with vCenter. We have full access integration with NSX-T as well. When the business continuity initiatives started with enterprises that were dealing with the aftermath of COVID, we were able to provide VDI services, with load balancing with VDI services with Horizon [VMware's desktop and application virtualization product], and we also have several integrations with automation frameworks, including vRealize Orchestrator and vRealize Automation. Everything's available in one single platform that can be deployed and managed centrally across different environments."


Storage skills in the age of the cloud and convergence – ComputerWeekly.com

The volume of data that needs to be stored just keeps on increasing, which means potential employees with storage skills are always in demand.

But in-demand skillsets change over time. In storage, skills that were mainstream 10 years ago have all but disappeared and new disciplines have emerged.

So, what trends are driving demand for storage and what skills are necessary to get a job in this ever-changing sector?

Storage has changed a lot in recent years. In terms of jobs and skills, the idea of storage as a discrete discipline has been eroded by virtualisation, hyper-converged technology and the cloud.

And often, IT roles have converged. There are fewer roles for IT professionals in traditional, specialist areas such as storage. Instead, IT professionals must become generalists to successfully manage on-premise, cloud and software-as-a-service (SaaS) infrastructures.

"This change has been heightened by the pandemic, but even beforehand, the lines between IT responsibilities were becoming blurred," says Sascha Giese, head geek at SolarWinds.

IT convergence has also been exacerbated by flat to shrinking budgets, but in all, it has added complexity for IT pros, making it difficult to know what skills should be focused on.

The situation is that siloed roles in storage or compute have been broken down to make way for a new type of IT professional who can cover multiple disciplines and is central to keeping IT environments operating at full capacity.

Data has evolved in its use over the past decade. The explosion of analytics has presented an enormous challenge in how data is stored and processed.

Meanwhile, the cloud has been instrumental in moving storage away from on-premise systems, with a consequent need for fewer on-premise storage specialists.

The rise of hyper-converged infrastructure (which bundles compute, storage and hypervisor) has also had an impact on the storage skills landscape, with logical unit number (LUN) design becoming a thing of the past.

Meanwhile, provisioning storage for a new virtual machine (VM) has for some time no longer been done in the storage array console, but from the hypervisor, and that had a knock-on effect on roles.

"This greatly simplified operations and allowed infrastructure and operations specialists to diversify their responsibilities and take on additional areas," says Robert Rhame, director of market intelligence at Rubrik.

Other features of the storage landscape are also changing, with the rise of object storage, software-defined storage, and non-volatile memory express (NVMe) flash and its networked, over-fabric iterations.

According to IT JobsWatch, the top skills in storage engineer job roles in the first six months of 2020 were Windows, storage area network (SAN), VMware/VMware infrastructure, Linux, Windows Server, and infrastructure engineering.

But according to Thomas Harrer, chief technology officer (CTO) for IBM Systems hardware sales in Europe, demand in storage skills is not a matter of specificity, but extended scope.

He says IT professionals that interact with or manage storage environments need to broaden their technology skills to fully understand the impact, connectivity and optimisation potential of these environments as a whole.

So, for example, storage specialists may need familiarity or proficiency with DevOps or cloud to be more attractive to potential employers and command a bigger salary.

"They equally need cloud knowledge, public and private, and application development knowledge to achieve an ideal balance between flexibility, cost efficiency, quality, security and cyber resiliency when implementing or upgrading their storage environments," says Harrer.

His view is that understanding the application and objective, as well as supplier-agnostic options to achieve these objectives (whether storage, cloud or app development-related), is critical.

"Ideally, we are talking about skilled professionals who have developed architectural or strategic storage skills to fully understand the impact of supplier or environment components that storage might be a part of," he adds.

In a recent survey, IT trends report 2020: the universal language of IT, IT professionals were asked to cite the non-technical skills they saw as most valuable to their role, and project management (69%), interpersonal communication (57%) and people management (53%) were the top skills listed.

Core storage skills are crucial for designing, building and maintaining the infrastructure needed to store and process data, but the soft skill of explaining and demonstrating the value of storage systems to C-level decision-makers is also essential. So, building up interpersonal skills is vital.

"Sometimes interpersonal skills are relegated to the category of soft skills, which is misleading, considering their overall importance in leadership and management," says Giese at SolarWinds. "Ultimately, interpersonal skills are human skills that help teams break through jargon and better address business challenges by relating to other people and speaking in a clear way."

He adds that it is worth remembering that IT professionals don't just speak to other IT professionals. They're increasingly talking to customers, cross-functional teams, and other business stakeholders. "Good communication and personal understanding are the mastery of any of those domains," he says.

For many organisations, storage is more strategically important than ever. So, it is not just about putting together all-purpose storage systems; specific solutions are key now. This means designing systems and infrastructure that focus on privacy and security, or systems that enable data analytics via artificial intelligence.

People with storage skills can go for roles such as infrastructure automation engineer, DevOps engineer, and cloud engineer, to name a few. Rubrik's Rhame recommends job seekers search for terms such as Kubernetes, Terraform, Ansible, Chef, Puppet or other tools to see what jobs come up.

"My advice for someone seeking a job in these areas is not to get hung up on not having the necessary skills that are being requested. These are scarce commodities right now, and those who have them are not looking for jobs," he says.

He adds that people should showcase their other skills, technologies and programming capabilities that remain relevant in this changing world, and focus on demonstrated cases where they picked up a new skill. "This is emerging and has a good future indeed," he says.

For those who want to stay in the datacentre, there is a future here as well, and the role to look for will likely have "infrastructure and operations" or simply "operations" in the description.


Report: Despite Covid-19 disruption in 2020, data center capex poised to hit more than $200B over next five years – FierceTelecom

While the Covid-19 pandemic is expected to disrupt the demand for data center equipment this year, data center capex will grow at a 6% CAGR to reach just over $200 billion over the next five years.
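
As a back-of-the-envelope check on the compounding arithmetic (the report's base-year figure isn't quoted here, so the sketch below derives the implied one):

```python
# Sanity-check the Dell'Oro projection: what base-year capex grows to
# roughly $200 billion after five years at a 6% CAGR?
target = 200e9  # projected year-five spend (~$200B)
cagr = 0.06     # 6% compound annual growth rate
years = 5

base = target / (1 + cagr) ** years
print(f"implied base-year capex: ${base / 1e9:.0f}B")  # about $149B

for year in range(1, years + 1):
    spend = base * (1 + cagr) ** year
    print(f"year {year}: ${spend / 1e9:.1f}B")
```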

A report by Dell'Oro Group said growth in data center capex, which includes capex for servers and other data center infrastructure equipment, will be a mixed bag depending on the customer segment. The cloud, which accounts for more than 60% of worldwide data center capex, will continue to flourish when compared to enterprise/on-premise data center deployments.

Telco edge data centers could emerge over the long-term as telcos build their edge compute services and applications.


The coronavirus pandemic has hit several industry verticals hard over the past five months, including brick-and-mortar retail, travel, hospitality and small-to-medium-sized businesses, which has led them to rein in their IT spending this year.

On the flip side, Covid-19 has led to some organizations accelerating their digital transformations, which includes putting data, workloads and applications in the cloud. Dell'Oro's report said that as enterprises look to conserve capital spending, the public cloud, which has a flexible and consumption-based infrastructure, could help meet the growing demand for remote work and distance learning.

"The Covid-19 pandemic and the ensuing recession may have the long-lasting effect of accelerating the permanent migration of certain industries and workloads to the cloud," according to Dell'Oro.

While Microsoft reported its earnings on Wednesday, the other major cloud providers, including Amazon Web Services and Google Cloud Platform, will be conducting their earnings calls over the coming weeks. Microsoft Azure's revenue growth was 47% in the fourth quarter compared to 59% in the third quarter.

RELATED: Hyperscale data center count reaches 541 with 176 more in the works

Dell'Oro said the top-four U.S. cloud service providers (Amazon, Facebook, Google, and Microsoft) were well positioned to continue their momentum of expansion over the next five years.

"Servers will continue to be consolidated in fewer mega cloud data centers that could potentially provide greater capacity than the same number of servers spread out across thousands of enterprise data centers," according to Dell'Oro Group.

Those top four U.S. cloud service providers have been prolonging the life of their servers to lower server depreciation expenses while maintaining the reliability of their server fleets. Last year, Arista Networks saw an impact on its fourth quarter earnings due to declining switch revenue from an unnamed cloud provider.

Also on the trend front, Dell'Oro Group said the Intel server processor refresh cycles have historically influenced IT spending.

"While the major cloud service providers typically ramp server capacity outside of the processor refresh cycle, the upcoming Intel 10 nm Whitley server platform refresh due later this year could generate an uplift on server spending. Viable alternatives to Intel processors, AMD EPYC and ARM, for server and storage system applications are starting to materialize in certain markets," according to Dell'Oro Group.

Dell'Oro Group also cited open source groups coming together to share and standardize best practices in the design of sustainable data center infrastructure as a factor going forward.

"The Open Compute Project (OCP), in particular, has introduced various technological innovations in the areas of server and server connectivity, rack architecture, and networking switches, which could shape the future development of data center infrastructure," Dell'Oro Group said.

Facebook launched OCP nine years ago as an open-source hardware initiative to drive the deployment of web-scale operations and services. OCP has thousands of engineers from close to 200 member organizations working on more energy-efficient hardware equipment for the likes of hyperscale data centers and large service providers.


Ephesoft Releases New Version of Transact; Touts Time-to-Value of Cloud-based Document Processing Solution – PR Web

Ephesoft Transact with ID Extraction

IRVINE, Calif. (PRWEB) July 28, 2020

Ephesoft, Inc., a leader in intelligent data capture and enrichment solutions, today announced the release of Ephesoft Transact 2020.1.02 with enhancements to meet the growing global demand for nimble, cloud-based solutions to quickly and accurately capture and process high volumes of documents across hundreds of industries. Enhanced features include cloud hosting on Amazon Web Services (AWS) for secure deployment in as little as 24 hours; handprint extraction, checkbox and signature detection for cloud and on-premises processing; and, the addition of Ephesoft Transact QuickScreen to seamlessly read and extract data from over 1,000 different types of global IDs.

Used by hundreds of enterprises worldwide, Ephesoft Transact is a modern capture productivity platform that leverages machine learning and cloud-based web services to empower human and digital workers in a wide range of document-intensive industries. Short implementation and enhanced features enable organizations of all sizes to improve their bottom line.

"Market research indicates that the COVID-19 pandemic has led to a growing demand for cloud computing and software services. We see this with our customers, who are looking for ways to automate their enterprise, starting with their data. Banks, insurance companies, healthcare and government agencies around the globe seek modern, scalable solutions that drive productivity, starting with document classification and extraction digital transformation initiatives," said Ike Kavas, founder and CEO, Ephesoft. "Customers are looking for ways to easily and quickly unlock their data and put it to immediate use, with the ability to implement successful outcomes in as quickly as days to hours. We're at a pivotal point in time where technology can push you ahead of your competition and help organizations navigate through uncertain times."

Transact Cloud
A March 2020 COVID-19 Impact on IT Spending (1) survey by IDC finds cloud computing, workforce performance management and cloud software are the three tech investments IT decision-makers believe are most likely to benefit from increased demand. This is not surprising as organizations with an expanded remote workforce seek the flexibility and cost savings offered by cloud and hybrid solutions like Ephesoft Transact to quickly, remotely and securely leverage their data for business continuity.

Hosted in the Amazon Web Services cloud, Ephesoft Transact Cloud provides secure, scalable, intelligent content acquisition capabilities for organizations to automate their document processing without the added burden and expense of server management. Transact Cloud customers can go live in hours or days versus the months often required for on-premises deployment. Other benefits of the latest version of Transact Cloud include data import functionality from AWS S3 buckets; server performance monitoring; lower cost of ownership and capital expenditure; and accelerated feature deployment for continuous product updates.
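
Ephesoft configures the S3 import inside the product, but the bucket side of such a flow is ordinary AWS tooling. Here is a minimal boto3 sketch of pulling newly arrived documents from a bucket for downstream processing; the bucket and prefix names are hypothetical, and this is an illustration, not Ephesoft's API.

```python
import boto3

s3 = boto3.client("s3")  # uses the environment's AWS credentials

BUCKET = "incoming-documents"  # hypothetical bucket name
PREFIX = "scans/"              # hypothetical key prefix

# List objects under the prefix and download each for processing
resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
for obj in resp.get("Contents", []):
    key = obj["Key"]
    if key.endswith("/"):  # skip "folder" placeholder objects
        continue
    local_path = key.rsplit("/", 1)[-1]
    s3.download_file(BUCKET, key, local_path)
    print(f"fetched {key} -> {local_path}")
```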

Transact QuickScreen Extracts Data from More Than 1,000 Types of Global IDs
The coronavirus pandemic has spurred an uptick in mobile employee onboarding and off-boarding, contactless loan applications and processing, and expanded COVID-19 screenings. This has led to an increased need for technology to quickly and securely scan and process identification documents and forms such as government-issued IDs.

Now available in the cloud, on-premises or as a hybrid solution, Transact QuickScreen offers out-of-the-box capabilities to read more than 1,000 types of IDs, such as driver's licenses, passports, visas, healthcare cards, international documents, tax forms and patient paperwork from 195 countries. A customer can capture an ID, form or document on any device such as a mobile phone or scanner, and upload it for automatic processing. The Transact platform classifies, extracts, validates and delivers the data into the customer's line-of-business systems such as RPA, ECM, EHR, CRM and ERP.

Customers across any high-volume, document-intensive use case will benefit from using Transact QuickScreen, whether in healthcare, government, human resources, banks or finance organizations. For example, healthcare organizations can securely capture patient IDs and test kit barcodes to reduce wait times for COVID-19 tests. HR departments can reduce the time and cost of onboarding and offboarding employees by automatically processing forms, such as I-9, W-4, P60 and employee IDs. Likewise, banks and mortgage companies can efficiently qualify loans by eliminating the manual data entry of credit card statements, bank statements, IDs and paystubs.

Native ICR and OMR Extraction
Research has shown a growing demand for expanded handprint recognition and identity verification solutions across many industries, including enterprise, financial, government and healthcare. Native Intelligent Character Recognition (ICR) and Optical Mark Recognition (OMR) extraction is now available for on-premises, hybrid and cloud solutions, integrating directly into the Ephesoft Transact user interface alongside its traditional key-value extraction rules for OCR extraction. This offers a quick and easy way for users to define an index field extraction rule to extract handprint values from a document, or to detect signature or checkbox filled areas. In some use cases, it took less than half the time to configure compared to traditional methods.

In the Ephesoft Transact platform, handprint data extraction rules are easily configured in most cases, with no templates or zonal page mapping required. In several customer scenarios, the solution's embedded ICR engine reduced professional service hours by converting handwritten data to machine-readable text and outputting textual data for review and validation by a human after the labor-intensive processing is complete. Similarly, Transact Cloud easily interprets checkbox data without the need for template or fixed-form projects where the characters representing the value are typed or filled in from a digital application. And, if forms contain signature fields, the system can identify those areas and determine whether or not the document has been signed.
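
As an illustration of the underlying idea, and not Ephesoft's engine: one common way to decide whether a checkbox or signature field is filled is to measure the ink density inside the field's region. A minimal sketch with Pillow follows; the coordinates and thresholds are made up for the example.

```python
from PIL import Image

def region_is_filled(image_path, box, threshold=0.08):
    """Return True if the share of dark pixels inside `box`
    (left, top, right, bottom) exceeds `threshold`."""
    region = Image.open(image_path).convert("L").crop(box)  # grayscale crop
    pixels = list(region.getdata())
    dark = sum(1 for p in pixels if p < 128)  # count "ink" pixels
    return dark / len(pixels) > threshold

# Illustrative field coordinates on a scanned form
print("checkbox ticked:", region_is_filled("form.png", (120, 300, 150, 330)))
print("signed:", region_is_filled("form.png", (100, 700, 500, 760), 0.02))
```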

Ephesoft Transact will be competitively priced, with new pricing bundles rolling out in Q3 2020. A free 10-day trial of Ephesoft Transact Cloud is available by contacting sales at info@ephesoft.com. For more information about Ephesoft Transact, including the latest Ephesoft Transact version 2020.1.02 enhancements, visit https://ephesoft.com/products/transact/.

About Ephesoft
Ephesoft is the leader in Context Driven Productivity solutions, helping organizations maximize productivity and fuel their journey towards the autonomous enterprise through contextual content acquisition, process enrichment and amplifying the value of enterprise data. The Ephesoft Semantik Platform turns flat data into context-rich information to fuel data scientists, business users and customers with meaningful data to automate and amplify their business processes. Thousands of customers worldwide employ Ephesoft's platform to accelerate nearly any process and drive high value from their content. Ephesoft is headquartered in Irvine, Calif., with regional offices throughout the US, EMEA and Asia Pacific. To learn more, visit ephesoft.com.

(1) IDC, "Leaning on Digital Transformation Investments to Meet the Challenges of the COVID-19 Pandemic," Doc# US46201920, April 7, 2020.


Is Application Awareness on the Cusp of a Renaissance? – Redmondmag.com

Posey's Tips & Tricks

With cloud suites like Microsoft 365 scattering user data across multiple apps and services, application-aware backups and restorations are due for a surge in renewed interest.

As someone who has worked in IT since the early 1990s, I've seen a lot of changes over the years. The industry is almost unrecognizable compared to when I first started out. This is especially true for backups.

Early in my career, making a backup meant putting a tape into a drive just before you went home for the day. The backup job typically started late at night when nobody was in the office and completed just prior to everyone's arrival the next morning.

These prehistoric backups were file backups. The reason we ran them late at night was because the backup software of the time was incapable of backing up open files. Since the jobs ran late at night, nobody should have been logged in, so theoretically there weren't any open files that would disrupt the backup.

The really crazy thing to think about is that the concept of application-aware backups did not exist back then (or if it did exist, I wasn't exposed to it). Application awareness was never an issue until the organization that I worked for adopted Microsoft Exchange Server.

Unlike most of the other applications of the day, Exchange had to be backed up in a very specific way because of the way the Exchange Server databases worked. Creating a file-level backup of an Exchange Server simply wasn't an option. If a file-level backup of the Exchange Server even managed to complete, the backup would have been corrupt.

Seemingly overnight, application-aware backups became a big thing.

A big part of selecting a backup application was making sure it would be able to work with the organization's applications. Today, of course, most backup vendors design their software to work with all of the most popular business applications. Even so, I think we are probably about to see a renewed interest in application-aware backups, but for a completely different reason than before.

This time around, I think application awareness is going to be more closely associated with the data-restoration process than with the backup process. Let me explain my reasoning.

In the past, commercial applications, especially software-as-a-service (SaaS) applications, had a very cozy relationship with their data. Consider Exchange Server. All of that messaging data is stored in a series of databases residing on the Exchange mailbox servers.

Of course, there are plenty of applications that store their data in external databases, but even this data tends to be application-centric. For example, Microsoft System Center applications such as Virtual Machine Manager and Operations Manager store data in a SQL Server database. Although the SQL Server itself may store data that is unrelated to the System Center applications, the databases used by the System Center products are reserved exclusively for that product's use.

The thing I find interesting, however, is that cloud applications are increasingly scattering their data across a wide range of applications and services. Microsoft Teams, for example, is part of Microsoft 365 and leverages some of the other Microsoft 365 applications for data storage. Some of the Teams data is stored in Exchange, some is stored in SharePoint and some is stored elsewhere.

So think about the Teams architecture from a backup standpoint. Any backup application that is able to back up Microsoft 365 should theoretically be able to back up Teams -- or any of the other Microsoft 365 applications, for that matter. However, performing a restoration may prove to be a more complex operation. Unless a backup application happens to be "Teams-aware," an administrator would have to know all of the various locations in which Teams stores its data in order to do a successful restoration.
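
To make that concrete, here is a deliberately simplified sketch of what "Teams-aware" implies: the restore logic has to fan out across every service holding a slice of the application's data. The service mapping reflects how Teams stored data at the time of writing; the restore loop is a hypothetical placeholder, not any vendor's API.

```python
# Where Microsoft Teams scatters its data (simplified)
TEAMS_DATA_MAP = {
    "channel messages": "Exchange Online (group mailbox)",
    "1:1 chat messages": "Exchange Online (user mailboxes)",
    "channel files": "SharePoint Online (team site)",
    "1:1 chat files": "OneDrive for Business",
}

def restore_team(team_name):
    """Hypothetical application-aware restore: walk every service that
    holds a piece of the team's data. The print statement stands in
    for real per-service recovery calls."""
    for data_type, location in TEAMS_DATA_MAP.items():
        print(f"[{team_name}] restoring {data_type} from {location}")

restore_team("Contoso Sales")
```

An administrator restoring by hand would have to know this map and work through it service by service; an application-aware tool carries the map for you.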

This problem isn't unique to Teams. It applies to all of the Microsoft 365 applications.

This is why I think that application awareness is going to experience renewed interest. It's one thing to be able to back up Microsoft 365, but it's quite another to be able to restore data from an individual Microsoft 365 application.

So far, application-aware restorations haven't been all that big of an issue for Microsoft 365. Exchange, SharePoint and OneDrive tend to be among the most popular Office applications, and these also happen to be the applications that are commonly supported by backup products. As the other Microsoft 365 applications gain traction, however, I think there will likely be an increased demand for backup vendors to enable granular restorations for those applications.

About the Author

Brien Posey is a 16-time Microsoft MVP with decades of IT experience. As a freelance writer, Posey has written thousands of articles and contributed to several dozen books on a wide variety of IT topics. Prior to going freelance, Posey was a CIO for a national chain of hospitals and health care facilities. He has also served as a network administrator for some of the country's largest insurance companies and for the Department of Defense at Fort Knox. In addition to his continued work in IT, Posey has spent the last several years actively training as a commercial scientist-astronaut candidate in preparation to fly on a mission to study polar mesospheric clouds from space. You can follow his spaceflight training on his Web site.


NAKIVO Backup & Replication v10 adds vSphere 7 support and other new features – Continuity Central

Published: Tuesday, 28 July 2020 08:55

NAKIVO Inc., has announced the release of NAKIVO Backup & Replication v10. Key new features include vSphere 7 support and backup to Wasabi.

NAKIVO Backup & Replication provides businesses with the tools they need to protect their entire IT infrastructure from VMware, Hyper-V and Nutanix AHV VMs and Amazon EC2 instances to physical servers and workstations, Oracle databases, and Microsoft Office 365 application data. The introduction of Backup to Wasabi in v10 gives customers the power to leverage scalable cloud storage while retaining the option to store confidential data on local or off-site storage devices and tape media.

Wasabi has disrupted the cloud storage market by offering fast and reliable cloud object storage at one fifth the price of major competitors. With only a single, all-purpose storage tier and no fees for egress and API requests, Wasabi has made it easier for businesses to forecast storage costs and manage cloud-based data. And now, with the release of Backup to Wasabi, NAKIVO customers can perform a range of full and granular recoveries tailored to their specific business needs.

NAKIVO Backup & Replication allows customers to unlock the full potential of Wasabi Hot Cloud Storage:

NAKIVO Backup & Replication now supports backup, replication and recovery for VMs running on vSphere 7, the latest version of VMware's server virtualization platform. This enables users to leverage the advanced functionality of vSphere 7 while ensuring their virtual infrastructure remains protected.

NAKIVO Backup & Replication can now recover physical machine backups to VMware VMs. Full P2V Recovery creates virtual versions of physical servers and workstations that are ready for production environments, simplifying physical to virtual migrations and recovery in case of machine failure.

NAKIVO customers can now perform application-aware, incremental backups of Linux workstations running Ubuntu 18.04 Desktop and Ubuntu 20.04 Desktop. In addition to full recoveries of entire Linux workstations, NAKIVO Backup & Replication also offers granular recoveries to restore individual files and application objects from compressed and deduplicated backups.
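
NAKIVO's change tracking is proprietary, but the core idea of an incremental backup, copying only what changed since the last run, can be sketched with content hashes. The paths are illustrative, and a real product tracks changed blocks rather than whole files and preserves directory structure.

```python
import hashlib
import json
import os
import shutil

STATE_FILE = "backup_state.json"  # hashes recorded by the previous run

def file_hash(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def incremental_backup(src_dir, dst_dir):
    previous = {}
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            previous = json.load(f)
    current = {}
    os.makedirs(dst_dir, exist_ok=True)
    for root, _, files in os.walk(src_dir):
        for name in files:
            path = os.path.join(root, name)
            digest = file_hash(path)
            current[path] = digest
            if previous.get(path) != digest:  # new or changed since last run
                shutil.copy2(path, dst_dir)   # flattens paths for brevity
                print("backed up", path)
    with open(STATE_FILE, "w") as f:
        json.dump(current, f)

incremental_backup("/home/user/docs", "/mnt/backup")
```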

http://www.nakivo.com


DH2i to Address How to Mitigate Microsoft SQL Server Costs Amidst Pandemic-Ravaged Budgets – PRNewswire

FORT COLLINS, Colo., July 28, 2020 /PRNewswire/ --DH2i, the leading provider of multi-platform Software Defined Perimeter and Smart Availability software, today announced it will present a live webinar titled, "How to Mitigate Microsoft SQL Server Costs Amidst Pandemic-Ravaged Budgets."

When: Wednesday, August 5, 11:00 am to 11:30 am Pacific Time (2:00 pm to 2:30 pm Eastern Time)

Why Attend: When it comes to supporting the organic growth of an organization, the SQL Server team's role seems pretty straightforward. Just spin up a new VM, set up a new physical box, or expand the organization's cloud footprint. That's not exactly the reality for the DBAs and other team members working on the front lines, though. There is often an elephant in the room that no one wants to acknowledge because they know it won't go over well with upper management. That unfortunate reality is the unmanageable growth in SQL Server deployments and costs. This is a difficult and expensive problem to fix, and especially in the face of a pandemic, no organization has any excess budget to invest in a solution.

But what if a solution enabled consolidation AND cost savings? Not only that, but what if you could also unlock peak high availability (HA) and disaster recovery (DR) with this solution, all without Microsoft Windows Server Failover Clustering (WSFC)?

Join DH2i's Connor Cox for a webinar introducing DH2i's DxEnterprise. Through a brief presentation and live demo, you'll learn how DxEnterprise can improve your SQL Server environment by allowing you to:

Learn more and register here: https://dh2i.com/how-to-mitigate-sql-server-costs-amidst-pandemic-ravaged-budgets/

Tweet this: @DH2i to Address How to Mitigate @Microsoft #SQLServer Costs Amidst Pandemic-Ravaged Budgets https://dh2i.com/how-to-mitigate-sql-server-costs-amidst-pandemic-ravaged-budgets/ #Consolidation #HighAvailability #HA #DisasterRecovery #DR #LowerCosts

About DH2i
DH2i Company is the leading provider of multi-platform Software Defined Perimeter and Smart Availability software for Windows and Linux. DH2i software products DxOdyssey and DxEnterprise enable customers to create an entire IT infrastructure that is "always-secure and always-on." To learn more, please visit: http://www.dh2i.com, call: 800-380-5405 or +44 20 3318 9204, or email: [emailprotected].

© DH2i Company 2020. DH2i, Smart Availability, DxEnterprise, DxOdyssey, DxConsole, DxHADR, DxTransfer, DxCollect and InstanceMobility are trademarks of DH2i Company. All other brand or product names contained in this press release may be trademarks or registered trademarks of their respective holders.

PR Contact: Nicole Gorman, Corporate Communications / PR, DH2i, M: 508-397-0131, [emailprotected]

SOURCE DH2i
