
Citrix ShareFile vulnerability actively exploited (CVE-2023-24489) – Help Net Security

CVE-2023-24489, a critical Citrix ShareFile vulnerability that the company fixed in June 2023, is being exploited by attackers.

On Tuesday, GreyNoise flagged a sudden spike in IP addresses from which exploitation attempts are coming, and the Cybersecurity and Infrastructure Security Agency (CISA) has added the vulnerability to its Known Exploited Vulnerabilities Catalog.

Unearthed and reported by Assetnote researcher Dylan Pindur, CVE-2023-24489 affects the popular cloud-based file-sharing application Citrix ShareFile, more specifically its storage zones controller (a .NET web application running under IIS).

"You can use the ShareFile-managed cloud storage by itself or in combination with storage that you maintain, called storage zones for ShareFile Data. The storage zones that you maintain can reside in your on-premises single-tenant storage system or in supported third-party cloud storage," Citrix explains.

Storage zones controller allows users to securely access SharePoint sites and network file shares through storage zone connectors, which enable ShareFile client users to browse, upload, or download documents.

In essence, CVE-2023-24489 is a cryptographic bug that may allow unauthenticated attackers to upload files and (ultimately) execute code on and compromise a vulnerable customer-managed installation.

CVE-2023-24489 has been fixed in ShareFile storage zones controller v5.11.24 and later, and customers have been urged to upgrade ever since.

Vulnerabilities in enterprise-grade file-sharing applications are often exploited by attackers, especially the Cl0p cyber extortion gang, who previously targeted organizations using Accellion File Transfer Appliance (FTA) devices, the GoAnywhere MFT platform, and the MOVEit Transfer solution.

The existence of CVE-2023-24489 and of the fix was publicly revealed in June 2023, but it wasn't until July 4 that Assetnote published additional technical details and a proof-of-concept (PoC) exploit. Other PoCs have been released on GitHub since then, so it was just a matter of time until attackers used them to create working exploits and leverage them.

According to GreyNoise's online tracker of exploit activity related to this vulnerability, the first signs were registered on July 25.

There are still no public details about the attacks exploiting the flaw, but CISA has mandated that US Federal Civilian Executive Branch agencies apply patches for it by September 6th, 2023.

Organizations in the private sector should do the same (if they haven't already). If you're not sure which storage zones controller you're using, follow these instructions to find out.


Hardware fails, but I’ve never lost data thanks to this backup plan – ZDNet


In May, several outlets reported that Reddit users were complaining about failing SanDisk Extreme SSDs. Subsequently, replacement drives provided by Western Digital, SanDisk's parent company, were also reported to be failing.

The issue affects SanDisk Extreme Portable SSD V2, SanDisk Extreme Pro Portable SSD V2, and WD My Passport SSD products, and appears to be limited to drives manufactured after November 2022.

Western Digital has also released a firmware update to address the issue.

Also: Top network-attached storage devices: Synology, QNAP, Asustor, and more

Data loss is bad. Hardware can be replaced, but data is irreplaceable. Often, the data can still be retrieved by data recovery specialists, but that's a painfully expensive route to travel.

I handle, process, and store quite a lot of data in the form of photos and videos, and -- as a measure of caution -- I've pulled all affected SanDisk and WD drives, irrespective of manufacture date, out of use. (I only had two.)

Working as a pro-am photographer and videographer for many years, and having grown acutely aware of just how sudden and catastrophic data loss can be, I have developed a workflow that limits my exposure to this risk. This workflow relies on the fact that storage is relatively cheap.

When handling data, I work by the principle of "two is one, one is none, and three is best." What do I mean? If I have two copies of something (and they need to be on separate devices, not two copies on the same laptop) and one fails, I still have one. If I have one copy and that goes bye-bye, well, I have none. And just to be on the safe side, I prefer to have three copies of everything, spread across different storage devices.

When I'm capturing photos and video, I normally copy the data off the storage cards onto both a laptop and an external drive. (If there's not enough space on the laptop, I'll copy it onto two external drives.)
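That offload step (one source card, two or more verified destinations) can be sketched in a few lines of Python. This is an illustrative stand-in for a tool like Carbon Copy Cloner, not its actual behavior; the function names are mine, and the checksum step is what turns a plain copy into a copy you can trust.

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large video files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def offload(card: Path, destinations: list[Path]) -> None:
    """Copy every file on the card to each destination, then verify each
    copy against the source checksum before trusting it ("two is one")."""
    for src in card.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(card)
        src_hash = sha256(src)
        for dest_root in destinations:
            dest = dest_root / rel
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)
            if sha256(dest) != src_hash:
                raise RuntimeError(f"Checksum mismatch for {dest}")
```

The card would only be formatted once every destination has verified cleanly.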

Also: The top cloud storage services

I also keep the original data on the storage cards for as long as possible before formatting them and reusing them. This is why I prefer having a lot of smaller SD and microSD cards (in the 64GB to 256GB range), rather than a couple of huge 1TB cards.

To move the data off of storage cards onto my laptop and external drives I use a program called Carbon Copy Cloner by Bombich Software. I've been using this software for many years, and it's absolutely packed with features that make copying data from one place to another as fast and reliable as possible.

I'll also move a copy of the raw data onto cloud storage as soon as possible. (Again, I'm minimizing the chances of total loss.)

Also: The other shoe finally dropped on my Google Enterprise cloud storage plan

Note that's just the raw capture data.

Once I start to edit, I like to do something similar. For editing, I make another copy of the data onto a storage drive I use for editing, and I have that backed up to a separate drive using Carbon Copy Cloner on a regular basis, while again also making cloud backups.

You're probably wondering what drives I use. I have a mix of external SSDs and HDDs from a variety of manufacturers. No, I don't buy several drives of one brand from one maker, because I know from past experience that issues like the one plaguing SanDisk can happen.

Currently, I'm using drives from Samsung, Crucial, and OWC. I also back up to Synology NAS boxes.

For cloud storage, I use Backblaze, Dropbox, and also have storage with Amazon.

Also: How I recovered 'irreplaceable' photos off an SD card for free

Another thing that I do is rotate storage drives and storage media every few years. Because I don't want to be using storage media that is five years old, I generally pull them out of service after three years. (Tip: Write the date you started to use something on the device.)

I'll be the first to admit that this all adds up to a fair bit of extra workload, hassle, and expense, but following this regimen has kept my data safe. Yes, I've had drives fail, and that was very annoying. But making sure that there are always multiple copies of my data on multiple devices means I've never lost data as a result of those failures.

Two is one, one is none, and three is best!


Revolutionizing HR: Cloud Solutions for Efficiency & Growth – Spiceworks News and Insights

Discover how cloud-based HR systems revolutionize the workplace, enhancing efficiency, engagement, and strategic capabilities while preparing for AI-driven advancements.

Human resource management is experiencing a revolution with cutting-edge technologies in today's rapidly changing environment. Technology transforms important HR tasks, including recruitment, employee onboarding, and performance evaluation. With the integration of automated processes, data analytics, and AI-driven solutions, decision-making becomes more efficient, remote work becomes smoother, and employee experiences become more personalized. Check out the potential of cloud HR management solutions, how they're transforming existing practices, and the upcoming innovations primed to change the future.

Cloud systems in HR management refer to technological solutions that store and manage HR data, processes, and applications on remote servers. Cloud systems use external data centers and offer greater flexibility and efficiency than traditional on-premises solutions that require physical hardware. According to Deloitte, 84% of the firms polled are changing or preparing to improve their HR management operations utilizing cloud technologies.

Cloud HR systems typically lead to operational changes because they are more efficient and easier to use than older, on-premises systems. According to the PwC US Cloud Business Survey, more than 55% of respondents say the biggest potential impact of the cloud on businesses would be changes to processes and ways of working.

Typically, there is a flurry of activity in the human resources department. With cloud HR software, HR managers now have the capabilities to lessen and streamline administrative activities like payroll and record management. HR managers can now concentrate on employee engagement, which can make or break a company in this era of strong competition.

The transition from traditional HR software to cloud platforms marks a pivotal shift in HR management. Unlike traditional systems that incur hefty infrastructure and maintenance costs, cloud solutions enable swift implementation, regular updates, and heightened security. This shift enhances HR operations across recruitment, onboarding, performance management, and analytics, and drives operational efficiency and employee satisfaction in today's digital landscape. Moving to the cloud is a huge change that needs careful planning.

Although switching your HR system is a big decision, your company should gain a lot from moving to modern cloud HR software, or from your outdated on-premises HR software to a new SaaS HR system:

With the help of cloud HR solutions, HR tasks are becoming more secure and productive. This involves more than just the fundamental HR duties; it also entails obtaining information immediately and using data to forecast future trends. It enables more innovation and development, which is crucial for a company's success.

A survey shows 93% of IT companies plan to adopt cloud technology during the next five years. Many businesses, such as Microsoft, IBM, Hitachi, Coca-Cola, and Netflix, have already implemented cloud HR systems, and more companies are still adopting them to streamline and improve the efficiency of their HR operations. For example, Unilever announced cloud migrations to speed up product launches, improve customer service, and increase operational efficiency.

Furthermore, automation has become a key feature of most cloud systems. It allows businesses to streamline their workflow process from end to end, eliminate bottlenecks, and improve visibility into operations, which results in better decision-making capabilities.

Future cloud HR management is full of exciting opportunities, especially as we explore the integration of AI-driven predictive analytics and its revolutionary effects on hiring and training procedures. Cloud-based HR management solutions herald the dawn of a new era characterized by increased productivity, employee engagement, and strategic empowerment. By introducing cutting-edge technology, businesses are given the means to skillfully handle the dynamic transitions occurring within their workforce, ultimately giving them a competitive edge.

As the HR landscape evolves, incorporating cloud-based systems will be crucial to stay competitive and thriving in a digitally driven world. Collaboration between HR and IT departments and comprehensive security measures are important to protect sensitive employee data. Moreover, HR professionals must continuously upskill to leverage the potential of AI and immersive technologies, embrace a culture of lifelong learning and stay updated on technical breakthroughs.

Have you considered using cloud solutions to revolutionize HR operations? How has it helped to streamline your processes? Share your thoughts with us on Facebook, X, and LinkedIn. We'd love to hear from you!



Data Migration: Strategy and Best Practices – Datamation

Datamation content and product recommendations are editorially independent. We may make money when you click on links to our partners. Learn More.

Every organization at some point will encounter the need to migrate data for any number of business and operational reasons: required system upgrades, new technology adoption, or a consolidation of data sources, to name a few. While the process of moving data from one system to another may seem deceptively straightforward, the unique dependencies, requirements, and challenges of each data migration project make a well-defined strategy instrumental to ensuring a smooth data transition: one that involves minimal data loss, data corruption, and business downtime.

In this article, we'll explore the crucial strategies and best practices for carrying out a successful data migration, from planning and preparation to post-migration validation, as well as essential considerations for ensuring replicable results.

Since data can reside in many different places and forms, and transfers can occur between databases, storage systems, applications, and a variety of other formats and systems, data migration strategies will vary depending on the migration's source and destination.

Some of the more common data migration types include the following.

An application migration involves moving applications and their data from one environment to another, as well as moving datasets between different applications. These migration types often occur in parallel with cloud or data center migrations.

A cloud migration occurs when an organization moves its data assets/infrastructure (e.g., applications, databases, data services) from a legacy, on-premises environment to the cloud, or when it transfers its data assets from one cloud provider to another. Due to the complexity of cloud migrations, organizations commonly employ third-party vendors or service providers to assist with the data migration process.

A data center migration involves moving an entire on-premises data center to a new physical location or virtual/cloud environment. The sheer scale of most data center migration projects requires extensive data mapping and preparation to carry out successfully.

A database or schema migration happens when a database schema is adjusted to match a prior or new database version to make the move more seamless. Because many organizations work with legacy database and file system formats, data transformation steps are often critical to this data migration type.

A data storage migration involves moving datasets from one storage system or format to another. A typical use case for data storage migration involves moving data from tape-based media storage or hard disk drive to a higher-capacity hard disk drive or cloud storage.

Learn more: Data Migration vs. ETL: What's the Difference?

Depending on the data complexity, IT systems involved, and specific business and/or industry requirements, organizations may adopt either a Big Bang or a Trickle Data migration strategy.

A Big Bang data migration strategy involves transferring all data from the source to the target in a single large-scale operation. Typically, an organization would carry out a Big Bang data migration over an extended holiday or weekend. During this period, data-dependent systems are down and unavailable until the migration is complete. Depending on the amount of data involved, the duration of downtime could be significant.

Big Bang data migrations are typically less complex, costly, and time-consuming than Trickle Data migrations. However, they require data downtime, pose a higher risk of failure, and become less viable as an organization's data volume and complexity increase. For this reason, the approach is best suited for smaller organizations and straightforward migration projects involving limited data volumes and datasets, and should be avoided for complex migrations and mission-critical data projects.

A Trickle Data migration strategy takes an Agile approach to data migrations, adopting an iterative or phased implementation over an extended period. Like an Agile project, a Trickle Data migration project is separated into smaller sub-migration chunks, each with its own timeline, goals, scope, and quality checks. Migration teams may also use the same vernacular and tools as Agile teams, breaking the migration up into Epics, Stories, and Sprints. By taking Trickle Data's Agile approach, organizations can test and validate each phase before proceeding to the next, reducing the risk of catastrophic failures.

A key attribute of the Trickle Data migration approach is source/target system parallelism; that is, the source and target systems run in parallel as data is migrated incrementally. The legacy system continues to function normally during the migration process until the migration completes successfully and users are switched to the new target system. Once the data is fully validated in the new system, the legacy system can be safely decommissioned.

Because of its incremental approach and source/target system parallelism, Trickle Data migration allows for zero downtime and is less prone to unanticipated failures. However, keeping the source and target systems running at the same time incurs a cost, so organizations evaluating this migration strategy should expect a more expensive and time-consuming migration journey. Developers and data engineers must also keep both systems synchronized continuously until the migration completes, which again requires significant technical expertise and overhead to successfully carry out.
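In outline, the incremental loop at the heart of a Trickle Data migration looks something like the sketch below. The callables are placeholders for real source reads, target writes, and validation queries (none of this comes from a specific tool); the point is the per-batch checkpoint, which is what lets both systems stay live and lets a failure stop at a known offset.

```python
from typing import Callable, Sequence

def trickle_migrate(
    fetch_batch: Callable[[int, int], Sequence[dict]],
    write_batch: Callable[[Sequence[dict]], None],
    verify_batch: Callable[[Sequence[dict]], bool],
    total_rows: int,
    batch_size: int = 1000,
) -> int:
    """Migrate rows in small validated increments. Each batch is written
    to the target and verified before the next one starts, so a failure
    halts the migration at a known checkpoint instead of corrupting it."""
    migrated = 0
    for offset in range(0, total_rows, batch_size):
        batch = fetch_batch(offset, batch_size)
        if not batch:
            break
        write_batch(batch)
        if not verify_batch(batch):
            raise RuntimeError(f"Validation failed at offset {offset}")
        migrated += len(batch)
    return migrated
```

A real implementation would also have to replay changes made to the source while the migration runs; that ongoing synchronization is exactly the overhead the article warns about.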

Regardless of which data migration strategy is in play, a successful data migration project starts with a comprehensive initial analysis and assessment of the data's journey. This includes the following planning tasks and preparation activities:

After completing planning and assessment activities, the data migration project should commence with data migration process testing. The following activities should be carried out to ensure the accuracy and reliability of the data in the new system.

Perform a trial migration by creating a test environment that mirrors the production environment. This will allow you to identify and resolve issues without impacting live data.

To assess the accuracy of the migration and identify any potential data quality issues, test the migration process using a representative data sample.
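One lightweight way to run that sample comparison is to fingerprint each row and diff the sets. A hypothetical sketch (the fingerprint scheme and function names are illustrative, not taken from any particular migration tool):

```python
import hashlib
import json

def row_fingerprint(row: dict) -> str:
    """Stable hash of a row; keys are sorted so field order doesn't matter."""
    return hashlib.sha256(json.dumps(row, sort_keys=True).encode()).hexdigest()

def compare_samples(source_rows, target_rows) -> list[str]:
    """Return a list of discrepancies between a source sample and the
    corresponding rows pulled back from the test target environment."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(
            f"Row count mismatch: {len(source_rows)} vs {len(target_rows)}"
        )
    src = {row_fingerprint(r) for r in source_rows}
    tgt = {row_fingerprint(r) for r in target_rows}
    missing = len(src - tgt)
    if missing:
        issues.append(f"{missing} source row(s) not found in target")
    return issues
```

An empty list means the sample migrated cleanly; anything else points at a transformation or data-quality problem to fix before the full run.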

In software engineering, UAT is the crucial final phase in the software development life cycle (SDLC) before a software product is deployed to production. This phase plays a pivotal role in ensuring the successful delivery of a software application, as it verifies that the achieved success criteria match the end users' expectations. For this reason, it's also referred to as End-User Testing or Beta Testing, since the actual users or stakeholders test the software.

During this phase, real-world scenarios are simulated to ensure that the software meets the intended user/business requirements and is ready for release.

Taking cues from the software world, modern organizations often incorporate UAT into their data migration processes to validate that the results meet data end users' specific requirements and business needs. Adopting UAT in the migration process brings end users into the fold, incorporates their feedback, allows for necessary adjustments, and validates that the migrated data works as expected.

Although every data migration is unique, the following principles and best practices apply universally to every data migration project. Be sure to keep these procedures top-of-mind during the course of your data migration project.

Your data migration project may involve downtime or service disruptions, which will impact business operations. Schedule the data migration during off-peak hours or weekends to minimize its impact on regular business activities.

Incremental data migrations are usually the safest route to follow: if feasible, migrate your data incrementally and allow the system to remain operational during the migration. This may require implementing load balancing to distribute the migration workload efficiently and avoid overloading the target system.

Ongoing stakeholder communication is crucial throughout the data migration process. This should include keeping everyone informed about the migration schedule, potential disruptions, and expected outcomes, as well as providing end-user training and instructions to smooth the transition and prevent post-migration usability issues.

Once the migration is complete, perform post-migration validation to verify that all data is accurately transferred and that the new system functions as expected. Conduct regular audits to ensure data integrity and compliance with data regulations.

Ongoing monitoring of the new systems performance is vital for surfacing any post-migration data loss and/or data corruption issues. Regularly assess the target systems performance and investigate any potential data-related performance bottlenecks/issues.

Last but certainly not least, ensure that data security and compliance requirements are met during and after the migration process. This may include implementing data encryption at rest and in transit, access controls, and data protection measures to safeguard sensitive information.

Data migrations may be unavoidable, but data migration failures can certainly be avoided by following a well-defined data migration strategy, one that incorporates comprehensive planning, ongoing data quality analysis, proper testing, and continuous monitoring. By planning ahead, choosing the right approach, and following best practices, organizations can minimize the risk of data loss, ensure data integrity, and achieve a successful and seamless transition to new systems or environments.

Read next: Top 5 Data Migration Tools of 2023


Threat and Vulnerability Roundup for the week of August 13th to 19th – CybersecurityNews

Welcome to Cyber Writes' weekly Threat and Vulnerability Roundup, where we provide the most recent information on cybersecurity news. Take advantage of our extensive coverage and keep yourself updated.

All significant flaws, exploits, and modern attack techniques have been highlighted. To keep your devices secure, we also provide the most recent software updates available.

These alarming findings have pushed businesses all across the world to review their cybersecurity postures and take urgent action. To be safe, keep up with our daily updates.

Ford recently identified a buffer overflow flaw in the Wi-Fi driver used in its SYNC 3 infotainment system. After the discovery, Ford quickly disclosed the vulnerability publicly.

Car hijacking by hackers exploiting various functions of the car is known, but the real-world execution of such attacks remains challenging.

The vulnerabilities, CVE-2023-38401 and CVE-2023-38402, affect the HPE Aruba Networking Virtual Intranet Access (VIA) client for the Microsoft Windows operating system. If the exploit is successful, the attacker can overwrite arbitrary files.

HPE Aruba Networking has issued an upgrade to address these multiple high-severity vulnerabilities. There is no workaround for these vulnerabilities.

An SQL injection vulnerability was discovered in the web-based management interface of Cisco Unified Communications Manager (Unified CM) and Cisco Unified Communications Manager Session Management Edition (Unified CM SME).

Cisco Unified CM is used for handling voice and video calls, whereas Cisco Unified CM SME is used for session routing intelligence.

This SQL injection vulnerability allows an authenticated remote attacker to conduct SQL injection attacks on any affected system. However, Cisco has released software updates to fix this vulnerability.
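Cisco hasn't published the affected code, but the vulnerability class itself is easy to illustrate. A minimal sketch using Python's bundled sqlite3, purely illustrative and unrelated to Unified CM's internals: the unsafe variant concatenates user input into the query string, while the safe variant binds it as a parameter.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_unsafe(name: str):
    # VULNERABLE: user input is spliced into the SQL string, so an input
    # like "' OR '1'='1" rewrites the query's WHERE clause entirely.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = '" + name + "'"
    ).fetchall()

def find_user_safe(name: str):
    # SAFE: the driver binds the parameter as data, never as SQL syntax.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()
```

Parameterized queries (or an ORM that uses them) are the standard mitigation for this entire class of flaw.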

The Airplane mode in smartphones ensures safe device use on flights, as this feature prevents interference with critical flight systems by deactivating all the wireless functions of the smartphone.

Researchers at Jamf Threat Labs have recently developed a post-exploit persistence method for iOS 16. If it is exploited successfully, it lets attackers set up a fake Airplane Mode with all of the original Airplane Mode's user interface features to hide their malicious app. This allows the attacker to keep access to the device even when the user thinks it is offline.

As per reports, several vulnerabilities were discovered in Zoom's Zero Touch Provisioning (ZTP) that allow threat actors to gain full remote administration of devices, resulting in activities like eavesdropping, pivoting through devices, and building a botnet with compromised devices.

In addition to this, threat actors can also reconstruct the cryptographic routines with AudioCodes devices to decrypt sensitive information like passwords and configurations that are available due to improper authentication.

A command injection vulnerability was recently discovered in IBM Security Guardium that allows threat actors to execute arbitrary commands on the affected system remotely.

This vulnerability was due to improper neutralization of special elements used in an OS command (CWE-78).
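CWE-78 boils down to untrusted input reaching a shell parser. A generic Python illustration of the weakness class (not IBM's code; the filenames and function names are made up), using `echo` so the effect is easy to observe:

```python
import subprocess

def run_unsafe(filename: str) -> str:
    # VULNERABLE (CWE-78): shell=True means metacharacters in `filename`
    # (e.g. "report.txt; cat /etc/passwd") run as extra shell commands.
    out = subprocess.run(
        "echo " + filename, shell=True, capture_output=True, text=True
    )
    return out.stdout

def run_safe(filename: str) -> str:
    # SAFE: passing an argument list means `filename` reaches the program
    # as a single argv entry; no shell ever parses it.
    out = subprocess.run(["echo", filename], capture_output=True, text=True)
    return out.stdout
```

The standard mitigations are exactly this: avoid the shell, pass arguments as a list, and validate or allow-list any input that must end up in a command.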

IBM Security Guardium is a data protection platform that can be used by security teams to automatically analyze data environments considered sensitive.

The CryptoService function in the Cisco Duo Device Health Application for Windows has a vulnerability tracked as CVE-2023-20229.

This might allow a low-privileged attacker to carry out directory traversal attacks and overwrite arbitrary files on a susceptible device.

Cisco has issued software upgrades to address this vulnerability. There are no workarounds for this issue.

Organizations use Citrix ShareFile, a cloud-based platform, to store and share large files. It also allows users to create branded, password-protected files through their services.

ShareFile Storage Zone enables administrators to choose between ShareFile-managed secure cloud storage or IT-managed storage zones (on-premises) within an organization's data center.

ShareFile Storage Zone Controller extends the ShareFile software-as-a-service cloud storage, offering private data storage with a ShareFile account.

This year's Hack-A-Sat competition challenged teams to hack into an actual satellite in orbit. The US Air Force's Moonlighter, which was launched especially for the event, was the first real satellite the hackers were permitted to target.

The Aerospace Corporation and the U.S. Air Force Research Laboratory developed the small CubeSat known as Moonlighter, launched on June 5, 2023, on a SpaceX Falcon 9 rocket alongside a cargo payload for the International Space Station.

Five teams participated in the challenge, with mHACKeroni, a team drawn from five Italian cyber research firms, taking first place this year. First place carried a $50,000 award.

Reports indicate that there seems to be an ongoing campaign that lures victims into installing a Remote Administration Tool called NetSupport Manager with fake Chrome browser updates.

Threat actors use this remote administration software as an info stealer and to take control of victims' computers. Investigations point to a suspected SocGholish campaign, previously conducted by a Russian threat actor, but remain inconclusive.

However, the SVP of the Trellix Advanced Research Center stated that Chromium, with a 63.55% market share, is now the de facto most targeted browser for NetSupport RAT attacks due to its global usage. Organizations need holistic global threat intelligence and innovative security solutions to gain the governance and tools needed to reduce cyber risk.

A malware campaign targeting the Ministries of Foreign Affairs of NATO-aligned countries was recently discovered, which used PDF files masquerading as a German Embassy email. One of the PDF files contains Duke malware, which was previously linked to the Russian state-sponsored cyber espionage group APT29.

APT29 has been attributed to Russia's Foreign Intelligence Service (SVR) and uses Zulip, an open-source chat application, for command and control, hiding its malicious network traffic behind legitimate traffic.

ScrutisWeb is a secure solution that helps global organizations monitor ATMs and improve issue response times, and it is accessible through any browser.

The solution can be used to monitor hardware, reboot a terminal, shut down a terminal, send files, receive files, modify data remotely, and monitor the bank card reader.

Cybersecurity researchers at Synack recently discovered several vulnerabilities in the ScrutisWeb ATM fleet monitoring software developed by Iagona.

The Monti ransomware, first found in June 2022, attracted the notice of cybersecurity experts and organizations due to its close resemblance to the Conti ransomware, both in name and in tactics.

Monti ransomware group has been observed to employ tactics similar to those of the Conti team, including utilizing their TTPs and leaked source code and tools.

Apart from this, Monti has also consistently targeted companies and posted details of their breaches on a leak site built by the operators of Monti.

In the current world of cybersecurity, security threats are evolving at a rapid pace, as there are always new problems to deal with.

Among these ever-evolving threats, SMS Bomber attacks stand out in the current threat landscape for the severe disruption they can cause.

In SMS Bomber attacks, the attacker hits the victim by flooding their phone number with numerous text messages. This barrage overloads the phone with unwanted vibrations, alert sounds, and notifications.

Web servers are a prime target for threat actors due to their open and volatile nature. However, these servers must remain open to provide various web services to users.

Web services provided by Web servers on Windows servers include the following elements:

The Cuba ransomware seems to be gaining more pace with each passing year; it has been operating and active since 2019.

Until now, the operators of the Cuba ransomware have executed several high-profile attacks targeting many industries and sectors, and they carried out various prominent cross-industry attacks throughout early 2023.

Cybersecurity analysts at the BlackBerry Threat Research team recently analyzed a June campaign, revealing that this ransomware group attacked critical US infrastructure and a Latin American IT integrator.

Businesses are looking to digital transformation and cloud services to support new working practices. Yet it could be extremely simple for criminals to get into essential data center power management gear, turn off electricity to numerous linked devices, and interrupt all types of services, from crucial infrastructure to commercial applications.

The Trellix Advanced Research Centre focused exclusively on the power supply and management systems used in data centers.

Researchers discovered four vulnerabilities in CyberPower's PowerPanel Enterprise Data Center Infrastructure Management (DCIM) platform and five vulnerabilities in Dataprobe's iBoot Power Distribution Unit (PDU).

Hackers are abusing legitimate Amazon Web Services (AWS) S3 buckets in phishing campaigns. Recent trends have seen cybercriminals leverage well-known platforms such as Google, QuickBooks, and PayPal to send phishing emails, making detection challenging for both security services and end users.

In this new wave of phishing attacks, hackers are turning to AWS S3 buckets to host phishing links, giving their lures a more convincing and legitimate façade.

The data of 760,000 Discord.io members has been advertised for sale on a darknet forum by a hacker using the pseudonym Akhirah.

On Monday, August 14, 2023, a data breach put the personal data of nearly 760,000 users at risk.

Discord.io lets users create unique, personalized Discord invites. The database being offered for sale includes email addresses, hashed passwords, and other user-specific information.

The CryptoRom scam uses ChatGPT to trick victims into downloading fake crypto-trading mobile applications. Android and iPhone users have reported increased instances of similar fraud utilizing apps from official app stores.

The scammers engage the target in an initial dialogue within the app where they first establish contact.

Once on a private chat platform such as WhatsApp, Telegram, or LINE, they pitch the idea of trading cryptocurrency. They promise to teach the victim how to use a (fraudulent) cryptocurrency trading program, lead them through installation and the transfer of funds, and ultimately siphon off as much of the victim's money as they can.

Threat actors are no longer focused only on Windows; they are actively targeting Mac systems to accomplish their illicit goals as well. Cybersecurity analysts at AT&T Alien Labs recently observed threat actors turning Mac systems into proxy exit nodes.

The macOS malware AdLoad emerged in 2017; its two major campaigns were highlighted in 2021 by SentinelOne and in 2022 by Microsoft.

Microsoft's report on UpdateAgent describes AdLoad as malware that spreads through drive-by compromise and hijacks users' traffic, redirecting it through the adware operators' servers to inject advertisements and promotions into webpages and search results.

Cloudflare R2 is a hosting service that provides developers with a cost-effective, large-scale data storage platform with no egress bandwidth charges.

Cloudflare R2 initially launched in beta in May 2022, and Cloudflare made the service publicly available in August 2022.

The cybersecurity analysts at Netskope Threat Labs recently noted a striking 61-fold surge in traffic to Cloudflare R2-hosted phishing pages from February to July 2023.

Threat actors use various methods to lure victims to their websites and trick them into downloading malicious payloads that give the attackers full control of the system.

However, a recent report indicated that threat actors have been using a malvertising campaign to drop info stealers and other malware, likely used for initial compromise in ransomware operations.

Structured audit logs, known as provenance graphs, record system execution history; recent studies investigate using them for automated host intrusion detection, with a particular focus on APTs.

An attacker has been discovered installing web shells on vulnerable Citrix NetScalers, exploiting the CVE-2023-3519 flaw to gain persistent access.

This critical zero-day vulnerability poses a significant risk as it can enable remote code execution (RCE) on both NetScaler ADC and NetScaler Gateway.

Exploiting this vulnerability, malicious actors have been successful in implanting web shells into the crucial infrastructure of an organization.

Link:
Threat and Vulnerability Roundup for the week of August 13th to 19th - CybersecurityNews


Application Delivery Network Market Size & Share Analysis – Growth … – GlobeNewswire

New York, Aug. 18, 2023 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Application Delivery Network Market Size & Share Analysis - Growth Trends & Forecasts (2023 - 2028)" - https://www.reportlinker.com/p06483790/?utm_source=GNW The Application Delivery Network Market size is expected to grow from USD 7.82 billion in 2023 to USD 13.13 billion by 2028, at a CAGR of 10.92% during the forecast period (2023-2028).
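As a sanity check, the quoted CAGR is consistent with the report's endpoints: USD 7.82 billion growing at 10.92% per year for the five years from 2023 to 2028 yields roughly USD 13.13 billion. A quick verification using the standard CAGR formula (figures taken from the report):

```python
# Verify the report's CAGR figure: end = start * (1 + r)^years.
start, end, years = 7.82, 13.13, 5  # USD billions, 2023 -> 2028

cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.2%}")             # ~10.92%

projected = start * (1 + 0.1092) ** years
print(f"projected 2028 size: {projected:.2f}")  # ~13.13
```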

An Application Delivery Network refers to the collection of services deployed simultaneously over a network to offer application availability, security, visibility, and acceleration from application servers to application end users. Application delivery networking comprises WAN optimization controllers (WOCs) and application delivery controllers (ADCs).

Key Highlights:

- Tasks such as load balancing, complex traffic management, SSL encryption, web application firewalls, DDoS protection, authentication, and SSL VPN are now integral elements of application delivery. The application delivery network has therefore gained vital importance across end-user industries undergoing rapid digital transformation.
- A significant market driver is the explosive growth in the number of applications hosted in the cloud, which poses deployment and management challenges for organizations with vast application portfolios in multi-cloud environments.
- Virtualization is another prominent trend augmenting the ADN market. As enterprises seek consumer- and employee-friendly interfaces while maintaining uniformity, security, and control across devices, the application delivery network plays a critical role. The growing adoption of desktop virtualization, web-based applications, and virtualized mobile applications is increasing the importance of ADN, particularly across the BFSI, IT and telecom, and government sectors, which have registered the highest adoption of digitization in their data-centric operations.
- Factors such as a lack of skilled workforce and the absence of standards and protocols limit market growth. The complexity of integrated systems and the difficulty of integrating ADN into existing systems also constrain growth.
- Due to COVID-19, digital transformation initiatives worldwide are driving rapid adoption of multi-cloud and hybrid environments to serve customers and facilitate workforce transformation, particularly with the recent surge in work-from-home (WFH) requirements.

Application Delivery Network Market Trends

Cloud-based Delivery to Witness the Highest Growth

Complex business models are being improved by cloud platforms, which also control more global integration networks. Cloud platforms are particularly adaptable to changing business needs and offer the same features as an on-premises solution.

The popularity of cloud-based application delivery networking has been fueled by ongoing trends in cloud computing, SaaS platforms, and the use of public and private clouds, even though traditional on-premises delivery network solutions still hold a sizable market share. Because cloud storage is only viable with a strong application delivery network, the migration of end users and businesses to cloud storage is anticipated to open up enormous market prospects.

Moreover, by allowing enterprise IT teams, who frequently lack the necessary security expertise, to offload security management to the cloud, cloud adoption for application delivery improves the capacity to meet bandwidth surges while saving time and resources for the organization. The market for application delivery networks is anticipated to experience significant growth throughout the forecast period due to the rapid expansion of cloud-based apps and the growing BYOD trend in many companies.

Asia-Pacific is the Fastest Growing Region

The BYOD trend and expanding cloud computing adoption are anticipated to fuel the market in this region. As public cloud computing becomes more widely used in China, many businesses are moving their business systems to cloud platforms. Data security, tenant isolation, and access control issues have steadily risen to the forefront of these businesses' concerns, driving uptake of cloud delivery network solutions.

Growth is also aided by the expansion of applications, the increased utilization of data centers by social media businesses and cloud service providers, and more companies switching to cloud services. Additionally, constantly shifting web traffic patterns in video, voice, ERP, and unstructured data present opportunities for key players to expand their application delivery networks (ADN) across the region and offer clients high-quality services.

One of the primary factors propelling the growth of the application delivery network market is the rising need for big data, cloud computing, and virtualization in China and India, which raises demand for effective and dependable web solutions. The need for cloud-based application delivery services is anticipated to rise as more financial institutions adopt the trend; similarly, government laws have catalyzed the expansion of cloud services.

Application Delivery Network Industry Overview

The application delivery market is highly competitive, with many large and small players competing against each other. The major players use technological innovation to stay ahead of the competition, and many adopt strategies such as mergers and acquisitions to retain their position in the market. Some major players are Cisco Systems, Citrix Systems, Symantec Corp., and Dell Inc., among others.

February 2023 - Cisco Systems Inc. announced innovations in cloud-managed networking, delivering on its promise to help customers simplify their IT operations. With powerful new cloud management tools for industrial IoT applications, simplified dashboards to converge IT and OT operations, and flexible network intelligence to see and secure all industrial assets, Cisco delivers a unified experience that provides true business agility.

December 2022 - A10 Networks, Inc. launched A10 Defend, a trial software-as-a-service (SaaS) offering that combines threat insights with in-depth knowledge of network data and examinations of indicators of compromise used in attacks. A10 develops distinctive and useful insights into the use-case requirements of its customers by combining internal networking expertise gained from a wide global customer base with cybersecurity research.

December 2022 - Juniper Networks Inc. announced a collaboration with Indonet to help automate, modernize, and facilitate an experience-first expansion of its network infrastructure. Indonet utilized Apstra to validate the design, deployment, and operation of the EVPN/VXLAN overlay and IP fabric underlay of its latest data center, both built on Juniper QFX Series Switches. Using validated templates and zero-touch provisioning has reduced deployment times and improved data center reliability, allowing Indonet to significantly streamline the day-to-day management of its data center networks and unify them seamlessly in a virtual environment.

Additional Benefits:

- The market estimate (ME) sheet in Excel format
- 3 months of analyst support

Read the full report: https://www.reportlinker.com/p06483790/?utm_source=GNW

About Reportlinker

ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.

__________________________

Link:
Application Delivery Network Market Size & Share Analysis - Growth ... - GlobeNewswire


The Good, the Bad and the Ugly in Cybersecurity – Week 33 – SentinelOne

The Good | DigiHeals Aims to Boost Resilience of Healthcare Sector to Fight Off Cyber Attacks

The healthcare sector has borne the brunt of attacks over the last few years as ransomware-wielding cybercriminals have sought easy pickings from often under-resourced public services. Good news this week, then, as the Biden-Harris administration's ARPA-H project has launched a digital health security initiative to help ensure patients continue to receive care in the wake of a medical facility cyberattack.

The initiative, dubbed DigiHeals, aims to encourage proposals for proven technologies developed for national security and apply them to civilian health systems, clinical care facilities, and personal health devices.

The aim is to focus on cutting-edge security protocols, vulnerability detection, and automatic patching in order to limit the ability for threat actors to attack digital health software, with the ultimate objective being to ensure continuity of care for patients in the wake of a cyberattack on a medical facility.

Aside from a lack of cybersecurity resources, healthcare services present unique problems for digital defense, as medical facility networks are typically made up of a vast patchwork of disparate devices, systems, and services. The DigiHeals project hopes to encourage submissions from researchers, both amateur and professional, from a wide range of fields and expertise. Accepted proposals related to vulnerability detection, software hardening, and system patching, as well as the expansion or development of security protocols, will receive funding and further support from the project.

Bad news for Citrix users this week as CISA warns that cyber adversaries are making widespread use of two n-day vulnerabilities, CVE-2023-24489 and CVE-2023-3519. Neither is new, but in-the-wild exploitation is on the rise, with some admins having patched their systems but failing to check whether they had already been breached.

CVE-2023-3519 is a vulnerability in Citrix's networking product NetScaler, first disclosed last month. Researchers say that almost 70% of patched NetScalers still contain a backdoor, indicating that admins applied the patch after the bug had already been exploited and did not detect the compromise.

According to the researchers, it appears an adversary exploited the bug in an automated fashion in mid-July, dropping webshells on vulnerable systems. The webshells allow for the execution of arbitrary commands, even if the NetScaler is subsequently patched or rebooted.

Equally concerning, CVE-2023-24489 is a bug with a CVSS score of 9.1 out of 10 affecting the Citrix Content Collaboration tool ShareFile. Exploitation allows an unauthenticated attacker to remotely compromise customer-managed ShareFile storage zones controllers.

CISA advised on Wednesday that the bug was being actively exploited. Researchers at GreyNoise reported a steep spike in attacker activity around CVE-2023-24489 after the advisory went public, indicating that attackers are racing against time to exploit vulnerable instances before security teams plug the gap.

Researchers believe there are anywhere between 1,000 and 6,000 vulnerable instances that are accessible from the public internet.

In both cases, admins are urged both to patch without delay and to investigate whether a compromise may have already occurred.

Cloud security is in the spotlight again this week as cloud storage service Cloudflare R2 has reportedly seen a 61-fold increase in hosted phishing pages over the last six months. R2, which offers a service similar to Azure Blob Storage and AWS S3, is being used for campaigns that primarily phish for Microsoft login credentials, although login pages for Adobe, Dropbox, and other cloud apps have also been spoofed.

The massive increase may relate to the fact that R2, a relatively new entrant in the cloud storage field, offers some free services to attract customers, services that threat actors have found useful to abuse. First, fake login pages are hosted on a free subdomain that can be reused without limit; the domains all follow a common pattern.

Second, Cloudflare offers a free CAPTCHA service called Turnstile to help legitimate websites reduce spam. The threat actors have deployed Turnstile to prevent URL scanners and internet analyzers from examining the phishing pages' content and flagging them as dangerous. The CAPTCHA has the added bonus of making the site seem more legitimate to unsuspecting users.

In addition, victims are redirected to the phishing pages from other malicious websites, and the phishing pages only serve up the fake login content if the referring site is recognized as the source. Researchers say that referring web pages include a timestamp after a hash (#) symbol in the URL; if that parameter is missing, the visitor is instead redirected to Google's home page, helping to ensure only intended victims see the phishing content.
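The gating logic researchers describe can be sketched in a few lines. This is a hypothetical illustration (the function and page names are invented, and the real campaign implements the check in the phishing pages' own scripts): a visit whose URL lacks the expected timestamp fragment is bounced to Google, so scanners hitting the bare URL never see the phishing content.

```python
# Minimal sketch of the referrer/timestamp gating described above.
# All names are hypothetical; this only models the decision logic.
from urllib.parse import urlsplit

SAFE_REDIRECT = "https://www.google.com"
PHISHING_PAGE = "fake-login.html"

def gate_visitor(url: str) -> str:
    """Return what is served for a visit to `url`."""
    fragment = urlsplit(url).fragment  # text after the '#' symbol
    # A missing or non-numeric fragment means the visitor did not arrive
    # via the expected referring page, so redirect them away.
    if not fragment.isdigit():
        return SAFE_REDIRECT
    return PHISHING_PAGE

# A scanner fetching the bare URL is redirected; a referred victim is not.
print(gate_visitor("https://bucket.example.r2.dev/index.html"))
print(gate_visitor("https://bucket.example.r2.dev/index.html#1692201600"))
```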

The news comes as the same researchers report that the number of cloud apps being abused to deliver malware has increased to 167, with Microsoft OneDrive, Squarespace, GitHub, SharePoint, and Weebly topping the list. Amazon AWS login pages were also recently targeted in a cloud phishing campaign using Google ads, underlining the efforts attackers are now making to capitalize on the rise of cloud services in the enterprise.

Visit link:
The Good, the Bad and the Ugly in Cybersecurity - Week 33 - SentinelOne


Fayetteville judge to decide whether to block online age verification … – Arkansas Online

FAYETTEVILLE -- A federal judge said Tuesday he'll likely decide whether to block a new online age verification requirement for minors using social media before the new state law is set to take effect Sept. 1.

Arkansas legislators moved to restrict social media access for minors in April when they passed Senate Bill 396, now Act 689, which will require large social media companies to contract with third-party vendors to perform age verification checks before people can create a new social media profile. Under the law, those younger than 18 will be required to seek parental permission before they can open an account.

In response, a group representing large tech firms filed a lawsuit in federal court in June challenging the law.

NetChoice, a Washington, D.C.-based nonprofit firm representing major social media companies such as Meta, the parent company of Facebook and Instagram; X, formerly known as Twitter; and TikTok sued the state arguing the new law, known as the Social Media Safety Act, violates the First Amendment rights of internet users and could put their private information at risk.

NetChoice asked for a preliminary injunction to prevent the state from enforcing the law while the lawsuit proceeds in federal court. U.S. District Judge Timothy L. Brooks said during a hearing Tuesday he wants to rule on the motion before Sept. 1.

The law is unconstitutional because it empowers the state to tell Arkansans what types of information they're allowed to access online, forces them to hand over their most sensitive documents to use the internet and seizes decision making from parents and families, according to NetChoice.

"That is an unconstitutional power grab, and we're petitioning to put a stop to it," Chris Marchese, an attorney and director of litigation at NetChoice, said in a news release when the lawsuit was filed.

Arkansas Attorney General Tim Griffin said in a brief opposing the preliminary injunction that governments have always had authority to protect minors by regulating where they can go and what they can see.

"Throughout our nation's history, governments have designated certain areas that are not appropriate for minors to occupy. From bars to casinos, state and local governments have regulated minors' access to such establishments, due to the potentially harmful nature of what lies inside," Griffin wrote. "The Constitution has always allowed such regulations. The Social Media Safety Act of 2023 follows in those footsteps to address a new frontier, protecting minors from the harmful and predatory environments of social media. It does so by requiring all potential users to verify their age and by requiring minors to have parental permission to create an online profile."

Griffin argued the law is narrowly tailored to serve the compelling government interest of protecting minors.

"Act 689, however, does not infringe on free speech because it regulates nonexpressive conduct by treating social media as places (like bars or casinos), and sets parameters for when minors can be present in those places," Griffin wrote.

The law defines social media companies as a forum allowing users to upload, create or view content from other accounts and allowing users to "interact with other account holders or users, including without limitation establishing mutual connections through request and acceptance."

The law contains exemptions, including for email providers; companies providing direct messaging services; streaming services; news, sports, and entertainment websites; online shopping; or "other content that is pre-selected by the provider and not user-generated."

The law also would exempt companies generating less than $100 million, which according to the lawsuit could spare social media platforms such as Truth Social, Twitch, Mastodon and Discord. Social media platforms whose "primary purpose is not social interaction," such as LinkedIn, or those providing cloud storage service would also be exempted.

Social media services making up less than 25% of a company's revenue would be exempted, which could mean YouTube -- owned by Google -- would be immune from requiring age verification, the lawsuit claims.

Companies allowing a user to generate short video clips of dancing, voice-overs or other acts of entertainment, such as TikTok, aren't exempted.

Social media companies violating the new age checks could be subjected to a $2,500 fine for each violation.

Critics of the act say the age verification checks, which will require new users to submit a digital version of a government-issued ID to prove their age, could expose sensitive information in an online database. Personal information wouldn't be submitted directly to social media companies but rather to a third-party firm that specializes in verifying users, according to the law.

The law would prohibit social media companies from retaining access to personal data.

The lawsuit argues the Social Media Safety Act illegally preempts federal law, which already regulates what information websites can collect on children and argues the law is unconstitutionally vague and attempts to illegally regulate out-of-state businesses.

More here:
Fayetteville judge to decide whether to block online age verification ... - Arkansas Online


IBM developing S3 interface tape library Blocks and Files – Blocks & Files

IBM is adding a server and software to its Diamondback tape library to build an on-premises S3 object storage archive.

The DiamondBack (TS6000), introduced in October last year, is a single-frame tape library with up to 14 TS1170 tape drives and 1,458 LTO-9 tape cartridges, storing 27.8PB of raw data and 69.6PB with 2.5:1 compression. Each drive transfers data at up to 400MBps; with 12 drives active, the library reaches a maximum transfer rate of 17.2TB/hour.
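The quoted aggregate rate lines up with 12 drives at 400MBps each, as a quick conversion shows (decimal units assumed; the result is about 17.28TB/hour, which the spec rounds to 17.2):

```python
# Sanity check of the aggregate transfer rate: 12 active drives at
# 400 MB/s each, converted to TB/hour using decimal (SI) units.
drives, per_drive_mbps = 12, 400                     # MB/s per drive
aggregate_tb_per_hour = drives * per_drive_mbps * 3600 / 1e6
print(aggregate_tb_per_hour)  # ~17.28, close to the quoted 17.2 TB/hour
```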

DiamondBack S3 has an added x86 server, as the image shows, which provides the S3 interface and S3 object-to-tape cartridge/track mapping. Client systems will send Get (read) and Put (write) requests to DiamondBack S3 and it will read S3 objects from, or write the objects to, a tape cartridge mounted in one of the drives.

IBM's Advanced Technology Group tape team is running an early access program for the Diamondback S3 tape library. Julien Demeulenaere, sales leader EMEA Tape & High-End Storage, says Diamondback S3 will be a low-cost repository target for a secure copy of current or archive data. It will enable any user familiar with S3 to move their data to Diamondback S3, and a storage architect can sign up for a 14-day shared trial on a Diamondback S3 managed by IBM to verify the behavior of S3 for tape.

The S3-object-on-tape idea is not new, as seen with Germany's PoINT Software and Systems and its Point Archival Gateway product. This provides unified object storage with software-defined S3 object storage for disk and tape, presenting their capacity in a single namespace. It is a combined disk-plus-tape archive product with disk random access speed and tape capacity.

Archiving systems supplier XenData has launched an appliance which makes a local tape copy of a public cloud archive to save on geo-replication and egress fees.

Quantum has added an object-storage-on-tape tier to its ActiveScale object storage system, providing an on-premises Amazon S3 Glacier-like managed service. SpectraLogic's BlackPearl system can also provide an S3 interface to a backend tape library.

DiamondBack S3 does for objects and tape what LTFS (Linear Tape File System) does for files and tape, with its file:folder interface to tape cartridges and libraries. Storing objects on tape should cost less than storing them on disk once a sufficient amount has been written to the tapes, but at the price of longer object read and write times than disk. IBM suggests it costs four times less than AWS Glacier, with, of course, no data egress fees.

Demeulenaere told us: "There is no miracle, we can't store buckets on tape natively. It's just a software abstraction layer on the server which will present the data as S3 objects to the user. So, from a user's point of view, they just see a list of buckets and can only operate on them with standard S3 commands (get/put). But it is still files that are written by the tape drive. The server will be accessed exclusively through Ethernet: dual 100Gb ports for the S3 commands, one Gb Ethernet port for admin."
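The abstraction layer Demeulenaere describes, S3-style get/put presented to the user while plain files are written underneath, can be sketched as a toy mapping layer. Everything here is hypothetical (class and method names are invented, and ordinary files stand in for data a real gateway would write via tape drives):

```python
# Toy sketch of an object-to-file gateway: S3-style get/put mapped onto
# plain files, illustrating the abstraction-layer idea only.
import os
import tempfile

class ObjectToFileGateway:
    """Presents a bucket/key interface while storing plain files."""

    def __init__(self, root: str):
        self.root = root

    def _path(self, bucket: str, key: str) -> str:
        # Each bucket is a directory; each object key becomes a file name.
        return os.path.join(self.root, bucket, key.replace("/", "_"))

    def put(self, bucket: str, key: str, body: bytes) -> None:
        path = self._path(bucket, key)
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "wb") as f:
            f.write(body)

    def get(self, bucket: str, key: str) -> bytes:
        with open(self._path(bucket, key), "rb") as f:
            return f.read()

# The user sees only buckets and get/put; files are written underneath.
gw = ObjectToFileGateway(tempfile.mkdtemp())
gw.put("archive", "reports/2023-q2.tar", b"archived payload")
print(gw.get("archive", "reports/2023-q2.tar"))  # b'archived payload'
```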

The server is exclusively for object storage; it can't be a file repository target. For that, you would need to buy the library alone (which is possible) and operate it as everyone does today (FC, backup server).

Continued here:
IBM developing S3 interface tape library Blocks and Files - Blocks & Files


6 Key Takeaways About IT Spending In 2023 So Far: IDC – CRN

Channel News Joseph F. Kovar August 18, 2023, 03:44 PM EDT

"In our downside scenario, there's no realistic likelihood of a contraction in IT spending. The last time we had a contraction in IT spending was back in the financial crisis in 2008 and 2009. And since then, there's been a radical transformation in the IT industry and in how much tech revenue is allocated to opex versus capex, based on longer-term cloud and subscription-based revenues," says Stephen Minton, IDC's vice president of data and analytics.

Where IT Spending Is And Isnt Growing

IT spending has proven resilient, except on the PC side, despite pressures from a variety of factors within and outside of the IT industry, according to IDC this week.

The market research firm updated IT industry watchers Wednesday on the state of IT spending, noting that a resilient worldwide economy, despite issues related to China and the war in Ukraine, has left the door open to IT spending growth, albeit slower than last year's.

Stephen Minton, IDC's vice president of data and analytics, said Wednesday during a webinar on the topic that expectations of a recession in the U.S. have yet to be realized, although the possibility is still there. Meanwhile, uncertainties in other key countries, including China, Germany, and the U.K., are keeping open the possibility of a slowdown in spending.

[Related: IDC, Gartner: 2022 PC Demand Dropped To Its Lowest Level In Years]

"That could lead to a longer period of slower growth extending into 2024 if the U.S. does start to lean more into a recession in the next 12 months and other regions start experiencing negative spillover effects from China and the Ukraine war during the winter months," he said during IDC's State of the Market: IT Spending Mid-Year Update by Industry. "The IT market next year could [then] be relatively sluggish again, growing by around 4 or 5 percent vs. our baseline forecast, which is that we'll get back up to growth of around 8 percent in 2024."

Post-COVID pandemic PC spending remains below its peak, but other parts of the IT business including servers, storage, software, and services are seeing growth even if that growth is not as strong as it was last year, Minton said.

And while PC spending and macroeconomic and geopolitical issues remain a drag on IT spending growth, spending on newer technologies is expected to be a bright spot going forward, helping to increase the growth of overall IT spending, Minton said.

"Businesses are continuing to invest in cloud and digital transformation, all those longer-term projects which are tied to this strong growth in software and services, which is continuing in the first half of the year," he said. "Of course, a lot of interest has grown in artificial intelligence."

There are a lot of factors that contribute to both a positive and a negative outlook on IT spending. For a look at those factors, click through our slideshow.

Joseph F. Kovar is a senior editor and reporter for the storage and the non-tech-focused channel beats for CRN. He keeps readers abreast of the latest issues related to such areas as data life-cycle, business continuity and disaster recovery, and data centers, along with related services and software, while highlighting some of the key trends that impact the IT channel overall. He can be reached at jkovar@thechannelcompany.com.

See the original post here:
6 Key Takeaways About IT Spending In 2023 So Far: IDC - CRN
