FingerprintJS raises $8 million to expand its enterprise identification API – VentureBeat

Chicago-based FingerprintJS, a company focused on browser fingerprinting-as-a-service, today announced the completion of an $8 million series A funding round led by Nexus Venture Partners, bringing FingerprintJS' total raised to $12 million. The company plans to use the capital to expand its fraud prevention capabilities further into the enterprise market.

Fingerprinting technology identifies unique website visitors, including those who enter their session through incognito windows, use VPNs, or block cookies. Developers embed FingerprintJS API into their code to address issues like online fraud, spam, and account takeovers with more accurate user identification.

The company launched its enterprise-grade SaaS product, FingerprintJS Pro, in 2020. But FingerprintJS' earliest product dates back to 2012, when cofounder Valentin Vasilyev began building a version of the browser fingerprinting library as an open source project.

Vasilyev found that a fingerprinting implementation could prevent fraud more effectively than traditional cookie-based systems. While users could easily clear cookies for anonymity, browser fingerprinting scripts were stickier: a browser is identifiable by a host of values, including its user agent, language, timezone offset, screen color depth, and custom plug-ins.

Based in JavaScript, FingerprintJS compiles signals inside any browser and generates a unique identifier that can be used to detect unusual behavior patterns. This core fingerprinting technology remains open source and has garnered over 10 million downloads and 12,000 GitHub stars since its release.

FingerprintJS Pro expands on its open source predecessor with improved identification accuracy, from a reported 60% to 99.5%, and cloud hosting capabilities. The Pro version's server-side analysis API appears to be key, enabling more complex analysis than was possible with the original version's single JavaScript file. The early version queries and then hashes browser attributes, which means it will create identical fingerprints when more than one person uses the same browser, in the same version, on the same type of phone or laptop.
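The attribute-hashing approach can be sketched in a few lines of JavaScript; the signal list and the FNV-1a hash below are illustrative stand-ins, not the real FingerprintJS internals:

```javascript
// Illustrative stand-in for attribute-hash fingerprinting: collect a few
// browser signals, serialize them in a fixed order, and hash the result.
function fnv1a(str) {
  let hash = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    hash ^= str.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0; // FNV-1a mix step
  }
  return hash.toString(16);
}

function fingerprint(signals) {
  // Fixed ordering matters: the same attributes must always hash identically.
  const ordered = ['userAgent', 'language', 'timezoneOffset', 'colorDepth', 'plugins']
    .map((key) => `${key}:${signals[key]}`)
    .join('|');
  return fnv1a(ordered);
}

const config = { userAgent: 'X', language: 'en-US', timezoneOffset: -300, colorDepth: 24, plugins: 'pdf' };
const a = fingerprint(config);
const b = fingerprint({ ...config });
console.log(a === b); // identical configurations hash to the same fingerprint
```

This is exactly the collision property the article describes: two users on identical devices and browser versions produce the same identifier.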

The new server-side API can process this data on the server to analyze large masses of auxiliary data, like URL changes or IP addresses, to differentiate between users who would otherwise have the same fingerprints. The API can also process information without browser exposure to reduce the risk of external tampering.
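A naive sketch of that disambiguation step follows; the function, fields, and logic are invented for illustration, and the real Pro API is far more sophisticated:

```javascript
// Widen the raw fingerprint with auxiliary data such as the client IP so that
// identical fingerprints seen from different networks resolve to different
// visitors. knownVisitors is the server-side state.
function resolveVisitor(event, knownVisitors) {
  const match = knownVisitors.find(
    (v) => v.fingerprint === event.fingerprint && v.ip === event.ip
  );
  if (match) return match.id;
  const id = `v${knownVisitors.length + 1}`;
  knownVisitors.push({ id, fingerprint: event.fingerprint, ip: event.ip });
  return id;
}

const visitors = [];
const first = resolveVisitor({ fingerprint: 'abc', ip: '1.2.3.4' }, visitors);
const second = resolveVisitor({ fingerprint: 'abc', ip: '5.6.7.8' }, visitors);
console.log(first !== second); // same fingerprint, distinct visitors
```

Because this runs on the server, none of the auxiliary data or matching logic is exposed to the browser, which is the tampering-resistance point the article makes.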

FingerprintJS Pro's use cases include protecting login pages from unauthorized users, reducing fake account signups, reducing duplicate account creation, reducing credit card fraud, and more.

FingerprintJS claims eBay, Dell, and Coinbase among its clients. The company said it is now looking to identify and build additional fraud prevention tools.

Developer jobs: Google's Go, Redux.js, Google Cloud, and AWS skills will get you the most interviews – ZDNet

While many people will face tough prospects in 2021, software engineers remain in high demand even in areas that tech companies and employees are supposedly fleeing from, like San Francisco.

But while employees might want to leave expensive cities, employers are offering slightly more to attract talent in traditional tech hubs.

"Average salaries for top software engineering roles increased in all major tech hubs last year by 5% in the San Francisco Bay Area, 3% in New York, 7% in Toronto, and 6% in London respectively," Hired notes in a new report.

Hired notes that programmers who know Google's Go programming language, the Redux JavaScript library, Google Cloud, and AWS get more interview requests from employers.

Remote working under the pandemic, however, has had some impact on traditional tech hubs as more remote roles appear elsewhere.

For example, Denver, Colorado, accounted for 34% of remote role offers, while roles in London and Toronto accounted for 6% and 9% of remote roles, respectively.

Hired's survey, covering 10,000 participating companies and 245,000 job seekers, was conducted with hiring platform Vettery.

"Demand for software engineers and their skill set continued to grow despite the massive economic downturn amid the pandemic and one of the most difficult job markets in US history," said Josh Brenner, Vettery's chief executive.

"As many companies will pick up their hiring efforts again this year, they will have to compete even more for top engineering talent."

The companies found that 83% of software engineers were after "new challenges and continuous learning", meaning that companies will need to cater to an appetite among developers for remote work and career development opportunities.

Developers across the board are in demand. People with backend and full stack knowledge accounted for 58% and 57% of interview requests, while frontend software engineers accounted for 30% of all interview requests.

Software engineers who know about Redux.js, Google Cloud, AWS and React.js are in luck. Engineers proficient in Redux.js received almost three times more interview requests than the marketplace average, while candidates with Google Cloud, AWS and React.js skills received 2.7 times more interviews.

The companies found that developers with knowledge of Go and Scala got twice as many interview requests.

AWS is where the jobs are though. "AWS was requested 8 [times] more in job listings compared to Google Cloud Platform and Microsoft Azure skills," Hired notes.

Developers who want a job also need to know Kubernetes and Docker, the predominant container technologies.

Fusion BPO Selects NICE Workforce Management in the Cloud to Drive Efficiency Gains and Boost Customer as well as Employee Engagement – Business Wire

HOBOKEN, N.J.--(BUSINESS WIRE)--NICE (Nasdaq: NICE) today announced that its Workforce Management (WFM) solution in the cloud has been selected by Fusion BPO to improve efficiency and empower increased employee performance. Fusion BPO, a multichannel and multilingual contact center services provider, will also leverage NICE's WFM solution to generate schedules and allow employees to select those suited to their needs, enhancing employee engagement and driving higher levels of service to customers. Moving to the cloud with NICE allows Fusion to optimize uptime and monitoring, support internal innovation via the latest software versions and adapt to changes in an agile way, all while lowering total cost of ownership (TCO).

Mr. Pankaj Dhanuka, CEO of Fusion BPO Services, said, "Onboarding NICE WFM in the cloud into Fusion's ecosystem is a huge step towards our future goals. Its AI-driven capabilities will help us increase productivity and customer retention and reduce costs for our clients. In an age where digital solutions are the need of the hour, this is a step in the right direction."

NICE WFM's AI-based smart forecasting capabilities precisely predict volumes and demand based on the customer's historical data. This enables Fusion BPO to meet its customers' workforce needs in terms of quantity and skill set. Using machine learning, the NICE solution generates schedules that positively impact attrition and shrinkage while taking into account employee availability and personal preferences. The solution also allows employees to suggest scheduling preferences such as break durations and working hours. With increased forecasting accuracy and intelligent scheduling, NICE WFM improves efficiency, employee engagement, quality of service and customer loyalty while reducing overall costs for Fusion BPO and its customers.

"By choosing to move workforce management to the cloud with NICE, Fusion BPO is transforming its operations to become agile and is adapting to rapidly changing market and customer demands while also accurately meeting employee and business needs," said Darren Rushworth, President, NICE APAC. "NICE is pleased to be a part of Fusion's innovation-driven journey to help meet their short-term goals and shape long-term strategic objectives."

About NICE
NICE (Nasdaq: NICE) is the world's leading provider of both cloud and on-premises enterprise software solutions that empower organizations to make smarter decisions based on advanced analytics of structured and unstructured data. NICE helps organizations of all sizes deliver better customer service, ensure compliance, combat fraud and safeguard citizens. Over 25,000 organizations in more than 150 countries, including over 85 of the Fortune 100 companies, are using NICE solutions. http://www.nice.com.

About Fusion BPO
Fusion BPO Services is a global multilingual, multichannel call center outsourcing solutions provider with 17 centers in 9 countries. Fusion offers end-to-end contact center solutions for clients across different industry verticals, including telecommunication, healthcare, retail and e-commerce, BFSI, utility, and travel and hospitality. We are equipped with the latest digital technology and AI-based solutions to ensure better efficiency for each of our clients.

Trademark Note: NICE and the NICE logo are trademarks or registered trademarks of NICE Ltd. All other marks are trademarks of their respective owners. For a full list of NICE's marks, please see: http://www.nice.com/nice-trademarks.

Forward-Looking Statements
This press release contains forward-looking statements as that term is defined in the Private Securities Litigation Reform Act of 1995. Such forward-looking statements, including the statements by Mr. Rushworth, are based on the current beliefs, expectations and assumptions of the management of NICE Ltd. (the "Company"). In some cases, such forward-looking statements can be identified by terms such as "believe," "expect," "seek," "may," "will," "intend," "should," "project," "anticipate," "plan," "estimate," or similar words. Forward-looking statements are subject to a number of risks and uncertainties that could cause the actual results or performance of the Company to differ materially from those described herein, including but not limited to the impact of changes in economic and business conditions, including as a result of the COVID-19 pandemic; competition; successful execution of the Company's growth strategy; success and growth of the Company's cloud Software-as-a-Service business; changes in technology and market requirements; decline in demand for the Company's products; inability to timely develop and introduce new technologies, products and applications; difficulties or delays in absorbing and integrating acquired operations, products, technologies and personnel; loss of market share; an inability to maintain certain marketing and distribution arrangements; the Company's dependency on third-party cloud computing platform providers, hosting facilities and service partners; cyber security attacks or other security breaches against the Company; the effect of newly enacted or modified laws, regulation or standards on the Company and our products; and various other factors and uncertainties discussed in our filings with the U.S. Securities and Exchange Commission (the "SEC").
For a more detailed description of the risk factors and uncertainties affecting the company, refer to the Company's reports filed from time to time with the SEC, including the Company's Annual Report on Form 20-F. The forward-looking statements contained in this press release are made as of the date of this press release, and the Company undertakes no obligation to update or revise them, except as required by law.

Bed Bath & Beyond adds Oracle ERP to its multicloud mix – ZDNet

Bed Bath & Beyond's tech stack now includes two bitter rivals after the home goods retailer announced Thursday that it's signed a deal with Oracle to use its enterprise resource planning (ERP) technology. The retailer said Oracle Cloud will become its provider of financial, supply chain and merchandising software, replacing its legacy suite of technology systems and bolstering its planning capabilities.

Bed Bath & Beyond has also been a Google Cloud customer since October, having tapped Google's BigQuery service for machine learning and analytics, along with Spanner, Google Compute Engine, and Google Kubernetes Engine to create a singular view of customer data. The retailer is also using Google Cloud to optimize its fulfillment strategy.

With the Oracle deal, the retailer said the ERP deployment is the first key component in its $250 million technology investment roadmap. In regulatory filings last year, Bed Bath & Beyond said it will spend $250 million on investments in digital and strategic growth plans for fiscal 2020. Key areas of investment include search and navigation across digital channels, data integration, CRM, analytics, marketing, and e-commerce. During fiscal 2019, Bed Bath & Beyond also spent about $187 million on logistics, digital capabilities, and analytics.

"We are building authority in Home, Baby, Beauty and Wellness with a digital-first, omni-always and customer-inspired approach," said Bed Bath & Beyond's COO John Hartmann. "Oracle's proven leadership and state-of-the-art technologies will allow us to better serve customers and improve the efficiency and effectiveness of our business. Additionally, the agile partnership will enable continual innovation and improvement as our enterprise evolves."

Looking at the broader retail market, cloud players such as Google, Microsoft and Amazon have been touting the perks of their respective platforms and promising to support retailers on their digital transformation journeys. In 2019, Microsoft announced a retail-as-a-service (RaaS) partnership with supermarket chain Kroger, which is splitting its cloud buying between Azure and Google Cloud Platform. Meanwhile, Walmart is partnering with Microsoft to use its AI, Internet of Things tools and Azure. Best Buy has signed up with Google Cloud to unify its data sources across various legacy platforms, and Home Depot has tapped both Google Cloud Platform and Microsoft Azure for its multicloud strategy.

Microsoft, HPE Bringing AI, Edge, Cloud to Earth Orbit in Preparation for Mars Missions – EnterpriseAI

The International Space Station will soon get a delivery of powerful AI, edge and cloud computing tools from HPE and Microsoft Azure to expand technology experiments aimed at preparing NASA for launching future crewed exploratory missions to Mars.

The new equipment and software, including HPE's specialized, second-generation Spaceborne Computer-2 (SBC-2), will mark the first time that broad AI and edge computing capabilities are available to researchers on the space station, Tom Keane, Microsoft's vice president of Azure Global, wrote in a Feb. 11 post on the Azure blog.

The new hardware, software and services are scheduled for launch to the ISS at 12:36 p.m. on Feb. 20 aboard Northrop Grumman's 15th (NG-15) Commercial Resupply Services cargo mission. The NG-15 mission's launch from the Wallops Flight Facility at Wallops Island, Virginia, is contracted by NASA to bring needed supplies.

The new SBC-2 computer that's heading to the space station follows the original Spaceborne Computer-1, which was sent to the ISS in 2017 as part of a validation study to test it against the rigors of space aboard the orbiting laboratory. SBC-1 returned to Earth in 2019 after completing its mission. Both Spaceborne Computer-1 and Spaceborne Computer-2 are sponsored by the ISS National Lab.

Dr. Mark Fernandez of HPE

SBC-2 will bring ISS researchers a wide range of new capabilities they did not have with the original Spaceborne machine from 2017 to 2019, Dr. Mark Fernandez, solution architect for converged edge systems at HPE and principal investigator for SBC-2, told EnterpriseAI. Technological advancements in AI, cloud and more will give ISS researchers more possibilities on the new machine, he said.

"Hardware-wise, we're sending up the HPE Edgeline Converged EL4000 Edge system, which is purpose-engineered and built to operate on the edge and take advantage of AI and ML capabilities with its onboard Nvidia T4 GPUs," said Fernandez. "These are enterprise-class, commercial off-the-shelf servers that go into data centers."

Featuring CPUs and GPUs

The Edgeline EL4000 servers will use Nvidia T4 GPUs for AI and machine learning, image processing, video processing and other tasks, which the original SBC-1 handled with CPUs alone. SBC-2 will include both CPUs and GPUs to allow comparative performance experiments in space.

The 1U boxes insert into standard 19-inch data center racks on the ISS. The racks are then inserted into lockers aboard the ISS to hold them securely. Also provided is an enterprise-class compute node, HPE's ProLiant DL360, for intense compute requirements, said Fernandez.

HPE's Spaceborne Computer-2

For the second generation of the SBC, NASA asked HPE to send up twice the compute power of the original version, said Fernandez. "So, we're sending up twice the number of servers. You'll see two lockers and each contains two servers."

"One is a CPU-based Intel server for those that love Intel and traditional computing, and we'll have a GPU-based Edgeline server for those that are doing image processing, artificial intelligence, machine learning, etc.," said Fernandez.

NASA asked for double the computing power in SBC-2 so the agency can continue its work toward sending humans to Mars, he said. SBC-1 was a proof-of-concept device for 18 months; the new SBC-2 will be tested to see how it holds up over the two to three years in space needed to accommodate a mission to Mars, he added.

Azure in Space

The Azure cloud capabilities will be used with the machines to allow experiments with getting data back and forth from the ISS to Earth as quickly and efficiently as possible, said Fernandez. Such data transfers are done today using existing NASA technologies.

"The ISS is only 220 miles up in Earth orbit, but the networking is circa-1980," said Fernandez. "We have speeds of two megabits a second up and down to the ISS. I have 50 megabits a second in my home."

Increasing those speeds will be critical for Mars missions, he said.
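A quick back-of-the-envelope calculation (illustrative, not figures from the article) shows what those link speeds mean for moving experiment data:

```javascript
// How long does it take to move 1 GB of experiment data at the ISS's
// ~2 Mbit/s link versus a 50 Mbit/s home connection?
function transferHours(gigabytes, megabitsPerSecond) {
  const bits = gigabytes * 8e9; // decimal gigabytes -> bits
  return bits / (megabitsPerSecond * 1e6) / 3600;
}

console.log(transferHours(1, 2).toFixed(1));  // ~1.1 hours at ISS speeds
console.log(transferHours(1, 50).toFixed(2)); // ~0.04 hours (under 3 minutes) at home
```

The gap is why on-board processing followed by sending only small results back, the strategy Fernandez describes next, is attractive.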

"Microsoft is enabling that, and they have aspirational plans to come up with some AI and machine learning that we'll look at," said Fernandez. One idea they will look at is running data on SBC-2 and then sending small amounts of data back to Earth, and then comparing that to bursting data to Azure and seeing what works faster.

"We're sitting right on top of the same NASA network, but we're going to encode and compress messages back and forth in order to take the most advantage of that two megabits per second," he said. "I have a brilliant scientist who is going to run the same experiment on CPUs, on GPUs and in the cloud. And he will report back to the community: if you have this type of data, it's best if you process it this way, because we are given those three options."

The experiments will begin after the equipment arrives at the ISS and following installation and setup. Those tasks are expected to take some time to complete, including several days for the cargo mission to reach the space station. "We've got three pre-canned experiments for three different users that we're going to hope to fire off right away," said Fernandez.

How Azure Views Its Mission Aboard the ISS

"The crux of this work is about making the capabilities of Azure available to astronauts, space explorers and researchers to learn and advance science and the use of the cloud to support their goals," a Microsoft spokesperson told EnterpriseAI. "Through this project we will be able to continue to gain knowledge on how we can best support the science and research community, wherever they are, on and off the planet."

"With SBC-2, Microsoft's research and Azure space engineering teams are evaluating the potential of HPE's [space-based] state-of-the-art processing in conjunction with hyperscale Azure, alongside the development of advanced artificial intelligence (AI) and machine learning models to support new insights and research advancements," the spokesperson said.

That includes weather modeling of dust storms to enable future modeling for Mars missions, plant and hydroponics analysis to support food growth and life sciences in space, and medical imaging experiments using an ultrasound on the ISS to support astronaut healthcare. Also being created is a platform for the development and testing of hybrid edge-cloud environments before contributing additional experiments to the ISS.

"We are exploring the potential of empowering exciting new experiments that leverage the far-reaching potential of the cloud in conjunction with the HPE Edge capabilities," the spokesperson said. "To date, researchers have often had to limit the scope of their study to what computational resources they had available to conduct their research."

Using bursting capabilities with Azure will add to future capabilities, according to Microsoft. "Bursting down to the cloud provides access to more computation/resources than can be hosted in the ISS, while leveraging SBC-2's power and proximity at the edge," the spokesperson said. "We are excited to empower others, even in space, to be able to leverage the power of Microsoft Azure, making it possible for astronauts, space explorers, and researchers to think big as they tackle their toughest questions."

Products and Solutions to Internet Security and Privacy – TFOT – The Future of Things

Internet security and data security have become less protected due to varying data retention laws. People looking to maintain their online privacy are opting for online services to ensure their data is secured. Various services are offered, such as VPNs, antivirus, password managers, and email protection. There are several providers in the market, and it is important to get these services from trusted ones. Websites like Privacy Sharks offer reviews of the top providers for anything to do with online security.

A virtual private network (VPN) is a means of ensuring that one's mobile, computer or laptop is secured. It works by protecting the user's identity and allowing them to roam the internet freely. A VPN tricks the device into believing it is in another location and encrypts the user's internet activity, allowing access to geo-restricted content. People who use devices to work at home require protection from hackers and cyber threats. Some users use these services to keep their information away from their internet provider or the government. Most people use a VPN to ensure online anonymity and privacy; it keeps the user's location untraceable by showing a different region.

Internet browsing through unsecured networks exposes one's private information to the public. Public Wi-Fi networks make passwords and browser data vulnerable to access by unauthorized persons. A VPN keeps online activity and the IP address private by scrambling data sent over a public network. Search history from someone's online activity and web surfing can be accessed by various parties and used to locate the user; this is why people get targeted ads and pages matching their search content. Without a VPN, service providers and web browsers can access someone's information at any time. The best VPN services should provide privacy and anonymity by hiding the IP address, protect users' private information, and allow them to access and watch their favourite shows wherever they are when traveling.

Antivirus software is installed on user devices to protect them from known internet threats such as viruses and malware. The software should be frequently updated to ensure that it remains effective; an outdated program can itself put the device at risk. A broad-coverage antivirus is preferred for information security, since threats involving maliciously crafted code may go undetected by some antivirus programs. Some antivirus software can run predictive analysis and is equipped with artificial intelligence, letting it detect malicious software by what it does to the user's device. Users should be careful when selecting the type of antivirus software to install: the weaknesses and strengths of the software should be identified before settling on an antivirus company. An antivirus with frequent scanning and updates helps ensure the user's cybersecurity hygiene is taken care of and that they are free from threats.

Email security ensures that a user's email communication and accounts are safe from unauthorized access, loss, or compromise. Email threats include malware, spam, deceptive messages that trick users into exposing sensitive information, hyperlinked malware, and phishing attacks. Email security is essential for both individual and business email accounts, and enterprise and company data should be protected from cyberthreats. There are several ways to secure email and information, including strong passwords, secure logins and encryption, email encryption, data scanners, and data protection solutions for sensitive information. For companies, it is key to train employees and make them aware of the various types of threats around them. Risky behaviours such as opening unsolicited emails and clicking unverified links should also be avoided, since malware can cause massive destruction when installed on a device. Email attackers have skills and methods for luring users into believing that they are supposed to share certain information. It is important to select the best email security services to ensure the security of data and private information in personal, business, or company email.

Password managers help maintain online security and privacy by enabling users to create strong and unique passwords that are not easy to guess or recreate. It is advisable to avoid repeating and reusing passwords, and for most people it is a challenge to remember every one of them. Password managers also ensure that each generated password is stored in a safe and encrypted location, and the user needs to remember only one master password to access the others. The user has to decide where they prefer their passwords saved: on their personal computer, in the cloud, or on someone else's server.

Device-stored passwords are hard for hackers to access, since they need advanced tools and keyloggers to reach them, and the user retains control over password access and security. However, when the device is lost, all the passwords can be compromised, especially if the device is stolen by someone targeting the user. Cloud storage of passwords makes them easily accessible to the user and synchronizes them across several devices through cloud servers. Passwords saved on cloud servers can also be recovered when forgotten or in the case of a stolen device. The main risk of saving passwords in the cloud is that the servers can be breached, so they do not provide total security for user data.

Conclusion

Internet security requires the use of various technologies and products to keep data and information safe from unauthorized personnel and cyber-attacks. There are several solutions and products to choose from, and they all have different pros and cons for devices and users. It is key to select the right product and package for the purpose of keeping the device free from any type of online insecurity. Sophisticated threats surround online users, especially companies or persons who strive to protect sensitive and private information from those targeting it.

Microsoft said the number of web shells has doubled since last year – ZDNet

Microsoft says the number of malicious web shells installed on web servers has almost doubled since its last count, in August 2020.

In a blog post yesterday, the Redmond company said it detected roughly 140,000 web shells per month between August 2020 and January 2021, up from the 77,000 average it reported last year.

The number has increased as a result of a shift in how hackers view web shells. Once considered a tool for script kiddies defacing websites and the go-to tool of DDoS botnet operators, web shells are now part of the arsenal of ransomware gangs and nation-state hackers alike and are crucial tools used in complex intrusions.

Two of the reasons they have become so popular are their versatility and the access they provide to hacked servers.

Web shells, which are nothing more than simple scripts, can be written in almost any programming language that runs on a web server, such as PHP, ASP, JSP, or JS, and can be easily hidden inside a website's source code. This makes detecting them difficult, often requiring manual analysis by a human operator.

In addition, web shells provide hackers with a simple way to execute commands on a hacked server via a graphical or command-line interface, providing attackers with a simple way to escalate attacks.
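Detection is hard precisely because a shell is just ordinary-looking script code. A toy signature scanner like the sketch below (patterns invented for illustration) catches only the crudest cases, which is why manual analysis is often needed:

```javascript
// Flag source files containing patterns that commonly appear in crude
// PHP/Node web shells: code paths that execute raw request input.
const suspiciousPatterns = [
  /eval\s*\(\s*\$_(GET|POST|REQUEST)/i, // PHP: eval of raw request input
  /system\s*\(\s*\$_(GET|POST)/i,       // PHP: shelling out with user input
  /child_process.*\breq\.query/s,       // Node: exec driven by a query string
];

function looksLikeWebShell(source) {
  return suspiciousPatterns.some((re) => re.test(source));
}

console.log(looksLikeWebShell('<?php eval($_POST["cmd"]); ?>')); // true
console.log(looksLikeWebShell('<?php echo "hello"; ?>'));        // false
```

A real attacker trivially defeats signatures with obfuscation or encoding, which is the gap behavioral and manual analysis has to cover.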

As the corporate IT space has moved towards hybrid cloud environments, the number of companies running web servers has increased over the past few years, and, in many cases, public-facing servers often have direct connections to internal networks.

As Microsoft's stats have shown, attackers appear to have figured out this change in the makeup of corporate IT networks as well, and have amped up their attacks on public-facing systems.

Web shells now play a crucial role in their attacks, providing a way to control the hacked server and then orchestrate a pivot to a target's internal network.

These types of attacks are exactly what the US National Security Agency warned about in April 2020, when it published a list of 25 vulnerabilities that were often used to install web shells.

The NSA report didn't just warn about web shells used on public-facing systems but also about their use inside internal networks, where they're used as proxies to jump to non-public-facing systems.

Microsoft urges companies to re-prioritize their approach to dealing with web shells, which are slowly becoming one of today's biggest security threats, and recommends a set of basic hardening actions to keep networks secure.

Ways to keep the organization’s cloud costs under control – YourStory

Cloud costs can prove tricky if not monitored carefully, which makes it important for organisations to keep their overall cloud spend in check. The techniques and strategies below can be implemented quickly and can help users tremendously with their cloud costs.

But before finding out how you can reduce your organisation's cloud costs, let's take a brief look at cloud computing services.

Cloud services are specialised IT services offered to companies to reduce their burden and ease their workflow. Maintaining IT infrastructure within a company can be expensive due to the high costs involved in purchasing and maintaining these servers. Usually, an entire department is needed to run these big server rooms, which is why companies prefer to outsource these services.

The number of organisations using cloud services has gone up exponentially. According to studies, the average use of cloud services in an organisation has gone up from 5 percent to 30 percent in just a few years, and it's set to increase even more.

The increase in usage of cloud services can greatly burden a company's finances. But these five approaches can help an organisation reduce its cloud services costs and keep them under control.

The first and foremost step is to understand the need for cloud services in your organization. For that, IT professionals must be brought into the picture and asked how the company can benefit from using cloud services. Going back to the whiteboard and understanding your needs will allow you to assess the extent to which you require the services.

Do you need to improve your workflow or strengthen security? Or do you need to increase flexibility or facilitate work distribution among team members? Do you need private cloud services or public cloud services? Do you need SaaS (software as a service), IaaS (infrastructure as a service), or PaaS (platform as a service)?

Answers to these questions will improve your understanding of cloud services, which translates into better budgeted and managed services.

After gaining a better understanding of your organisation's cloud needs, budgeting your service expenses is the next important step. It's imperative to strike a balance between your needs and your spending capacity. Understanding which licence you are opting for is critical: will per-user or overall-usage pricing benefit your organisation more in terms of cost reduction and better workflow? Companies are continually spending more on cloud services. That is not necessarily a bad thing, but it is something organisations should watch out for.

But organisations should ensure room for flexibility while preparing this budget, because any impetuous decision can hamper usage. The bottom line: assess your needs and strike a balance.

Cloud management platforms help you create visibility into cloud service usage. An organisation that is aware of how its cloud services are used will be able to utilise them better.

Keeping a tab on departmental usage can help the organisation implement cost cutting where needed. This will not only uncover loopholes in allocation and capacity, but will also reveal any cost-saving measures already in place. Cloud management platforms are therefore an important investment: they give accurate information about your use of these services and can help you optimise it.
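As a toy illustration of the kind of departmental visibility such platforms provide, the sketch below aggregates tagged usage records by department and flags any department over its monthly budget. The record format, department names, and figures are all invented for illustration; a real platform would pull this data from a provider's billing export.

```python
from collections import defaultdict

def cost_by_department(records):
    """Sum monthly cloud spend per department from tagged usage records."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["department"]] += rec["cost_usd"]
    return dict(totals)

def over_budget(totals, budgets):
    """Return the departments whose spend exceeds their allocated budget."""
    return {d: spend for d, spend in totals.items() if spend > budgets.get(d, 0.0)}

# Hypothetical billing export: each record tagged with an owning department.
records = [
    {"department": "engineering", "service": "compute", "cost_usd": 4200.0},
    {"department": "engineering", "service": "storage", "cost_usd": 800.0},
    {"department": "marketing",   "service": "saas",    "cost_usd": 950.0},
]
budgets = {"engineering": 4500.0, "marketing": 1200.0}

totals = cost_by_department(records)
print(over_budget(totals, budgets))  # engineering exceeds its 4500 budget
```

Even this simple per-department rollup makes allocation loopholes visible, which is the first step toward the targeted cost cutting the article describes.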

Managing the usage of services is often the most overlooked part, but is an essential factor in increasing or reducing costs.

Staff should understand the usage of cloud services because they are the users. Mistakes made by employees when accessing services result in increased costs, which burden the organisation's overall finances. It's simple: if you want to achieve lower costs, educate your employees about the usage of these services.

While opting for cloud services, larger organisations use a mixture of services to keep costs down. They assess their needs diligently and invest in the services that are most cost-effective and beneficial.

For some, a single cloud service may do the trick, but these are not necessarily cost-effective. The right cloud services depend on your organisation's needs and capacity. Also, it is beneficial to integrate your cloud services with your systems to facilitate ease of use and data transfer.

Companies often hire a network architect to help decide where the company should invest. It is the network architect's job to devise a cost-efficient strategy for the organisation and oversee the operation of these services.

Cloud computing is the future. Efficient usage of these services can unlock a lot of opportunities, but managing them well is essential for your growth.

(Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the views of YourStory.)

Read the rest here:
Ways to keep the organization's cloud costs under control - YourStory

Read More..

FogHorn and IBM to Collaborate on Edge-to-Hybrid Cloud Solutions – ARC Viewpoints

FogHorn announced plans to collaborate with IBM. The common goal is to provide an open and secure next-generation hybrid cloud platform with advanced, edge-powered artificial intelligence (AI) and closed-loop system control capabilities. By bringing together edge and cloud capabilities, FogHorn and IBM plan to help customers rapidly deploy, process, store, analyze and train critical data from edge to cloud and enhance business processes.

FogHorn's Lightning Edge AI offerings, which deliver low latency for onsite data processing and real-time AI, analytics and machine learning capabilities, combined with IBM Edge Application Manager, which runs on Red Hat OpenShift, will be designed to automate the deployment of edge AI applications to available enterprise edge compute. The solution is being engineered to run and manage workloads on virtually any edge endpoint, including devices, clusters and servers, gateways and machines supporting RHEL and other Linux operating systems, with Red Hat OpenShift, Podman, and other Docker runtimes. This gives organizations the choice and flexibility to extend their operations from any public or private cloud to any edge server or asset, and is planned to allow for a single system of record in the enterprise that is enriched with quality data and insights to be acted on with intelligent automation.

FogHorn's offerings can also be integrated with IBM Maximo Application Suite to optimize the performance of physical assets and accelerate the transformation of maintenance, monitoring and reliability options, powered by the Lightning Edge AI Platform.

Read more:
FogHorn and IBM to Collaborate on Edge-to-Hybrid Cloud Solutions - ARC Viewpoints

Read More..

Fifth-generation cyberattacks are here. How can the IT industry adapt? – World Economic Forum

Cyberattacks are continuing to grow in sophistication and scale.

The coronavirus pandemic has increased the attack surface for cybercriminals, leading to a possible cyber-pandemic.

Healthcare is one industry that has been particularly exposed.

If you look back at early 2020 "new year predictions", you will find no reference to an unprecedented global pandemic that would shut down, in many ways, the way we live and usher in a new normal.

But it happened. And with the new normal came "new everything".

With the rapid shift to more cloud servers, the popularity of network-connected smartphones, and the move to remote work, organizations had to quickly adapt their security measures to ensure employees are secure at all times, from any remote location they might connect from. This has become the new security perimeter.

The new landscape has generated a surge of sophisticated fifth-generation cyberattacks. As organizations adapted to remote work and all its digital implications, cyber-criminals seized on the global crisis to launch a series of large-scale cyber exploits.

Cyberattacks have reached a new level of sophistication, ranging from international espionage to massive breaches of personal information to large-scale internet disruption.

Advanced weapons-grade hacking tools have been leaked, allowing attackers to move fast and infect large numbers of businesses and entities across huge swaths of geographic regions. Large-scale, multi-vector mega-attacks have sparked a need for integrated and unified security structures.

Most businesses are still in the world of second- or third-generation security, which only protects against viruses, application attacks and payload delivery. Networks, virtualized data centres, cloud environments and mobile devices are all left exposed. To ensure a cybersecure organization, businesses must evolve to fifth-generation security: advanced threat prevention that uniformly prevents attacks on a business's entire IT infrastructure.

Just as we thought 2020 could not bring any more bad news or cybercrime advancements, along came the SolarWinds incident, which swiftly qualified for the title of the most significant attack of the year: a sophisticated, multi-vector attack with the clear characteristics of a cyber-pandemic, in which malicious activity spreads within an organization in a matter of seconds. This was a manifestation of a fifth-generation cyberattack.

The scope of the incident became clearer several days later when Microsoft, FireEye, SolarWinds, and the US government all admitted they suffered an attack made possible by a hack to SolarWinds, a common IT-management software. Further investigation revealed that the attackers added a backdoor, called Sunburst, to a component of the SolarWinds system, which was then distributed to its customers via an automatic software update. That granted remote access to multiple high-profile organizations, making it one of the most successful supply-chain attacks ever observed.

Several aspects of the SolarWinds supply-chain attack make it unprecedented in the ever-evolving cyber-landscape. Its scope was uniquely broad, with an estimated 18,000 SolarWinds customers affected, including most Fortune 500 firms.

COVID-19 forced organizations to set aside their existing business and strategic plans, and quickly pivot to delivering secure remote connectivity at massive scale for their workforces. Security teams also had to deal with escalating threats to their new cloud deployments, as hackers sought to take advantage of the pandemic's disruption: 71% of security professionals reported an increase in cyber-threats since lockdowns started.

As COVID-19 continues to dominate headlines in 2021, news of vaccine developments or new national restrictions will continue to be used in phishing campaigns, as they have been through 2020. The pharma companies that developed vaccines will also continue to be targeted by malicious attacks from criminals or nation states looking to exploit the situation.

Recent Check Point research shows that healthcare is currently the most targeted industry in the US, with a 71% increase in attacks compared to September. The chart below shows the sharp increase in healthcare-sector attacks compared to the global trend: since November, attacks in the sector have risen by over 45%, more than double the global increase over the same period (22%).

Healthcare sector cyberattacks in 2020

Image: Check Point

As the coronavirus spread worldwide, the social distancing policies enacted due to the COVID-19 pandemic shifted a substantial portion of businesses from corporate offices to employees home offices. Network admins had to rapidly adjust to the requirements of working remotely and implement remote-access platforms within their organizations. Unfortunately, these often resulted in misconfigurations and vulnerable connections, allowing attackers to leverage these flaws to access corporate information.

As a result, the first half of 2020 saw an increase in attacks against remote access technologies such as RDP (Remote Desktop Protocol, developed by Microsoft to provide an interface for remote connection) and VPN. The following chart displays the increase in attacks exploiting vulnerabilities in remote connection products.

Attacks targeting remote connection vulnerabilities

Image: Check Point

Schools and universities have pivoted to large-scale use of e-learning platforms, so perhaps it's no surprise that the sector experienced a 30% increase in weekly cyberattacks during the month of August, in the run-up to the start of new semesters. Attacks launched by these digital class clowns will continue to disrupt remote-learning activities over the coming year, for as long as the pandemic persists.

With this new world comes a new opportunity to redefine the role of cybersecurity and ensure every organization steps up to the fifth generation of security. Below are three guiding principles:

1. Real-time prevention

As we've learned, vaccination is far better than treatment. The same applies to your cybersecurity. Real-time prevention of attacks, before they infiltrate, places your organization in a better position to defend against the next cyber-pandemic.

2. Consolidation and visibility

Solutions applied to individual areas of attack will probably leave you with security gaps, fragmented visibility, complex management and limited options to scale. A consolidated security architecture will deliver the security effectiveness needed to prevent sophisticated cyberattacks, while unified management and risk visibility round out your security posture.

Next-generation technologies such as AI, ubiquitous connectivity and quantum computing have the potential to generate new risks for the world, and at this stage, their full impact is not well understood.

There is an urgent need for collective action, policy intervention and improved accountability for government and business in order to avert a potential cyber pandemic.

The Forum's Centre for Cybersecurity launched the Future Series: Cybercrime 2025 initiative to identify what approaches are required to manage cyber risks in the face of the major technology trends taking place in the near future.


3. Keep your threat intelligence up to date

To prevent zero-day attacks, organizations first need incisive, real-time threat intelligence that provides up-to-the-minute information on the newest attack vectors and hacking techniques. Threat intelligence must cover all attack surfaces, including cloud, mobile, network, endpoint and IoT.

See the rest here:
Fifth-generation cyberattacks are here. How can the IT industry adapt? - World Economic Forum

Read More..